Users of JSON functionality (if any!) should be aware of a potential
problem in the way L<JSON::XS|JSON::XS> encodes numbers. The problem,
basically, is that the locale leaks into the encoded JSON, and if the
locale uses commas for decimal points the encoded JSON cannot be
decoded. As I understand the discussion on the associated Perl ticket,
the problem has always been there, but was surfaced by
subsequently-introduced changes. The nature of the JSON interface is
such that I have no control over the issue, since the workaround needs
to be applied at the point the JSON C<encode()> method is called. See
test F<t/tle_json.t> for the workaround.
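The usual workaround for this class of locale problem can be sketched
as below. This is an illustrative sketch only, not the distribution's
actual code, and the helper name C<_encode_in_c_locale> is
hypothetical:

    use JSON;
    use POSIX qw{ setlocale LC_NUMERIC };

    # Hypothetical helper: force the 'C' locale (which uses '.' as
    # the decimal point) around the encode() call, then restore the
    # caller's locale. This prevents a comma-for-decimal-point locale
    # from leaking into the encoded JSON.
    sub _encode_in_c_locale {
        my ( $json, $data ) = @_;
        my $locale = setlocale( LC_NUMERIC );
        setlocale( LC_NUMERIC, 'C' );
        my $encoded = $json->encode( $data );
        setlocale( LC_NUMERIC, $locale );
        return $encoded;
    }

The point is that the locale must be forced at the moment C<encode()>
is called; setting it anywhere else does not help.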
With the REST (or Version 2) interface to the Space Track web site, a
third representation of TLE data, as JSON, has become available.
Given a chunk of TLE data, the
L<Astro::Coord::ECI::TLE|Astro::Coord::ECI::TLE> C<parse()> method will
accept it, provided you have installed that module from its own
distribution. If
your data are in JSON format, you will need the optional L<JSON|JSON>
module installed.
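A typical use can be sketched as follows. This is an illustrative
sketch under the assumption that the data file contains TLE data in a
format C<parse()> accepts; the file name F<tle.dat> is hypothetical:

    use Astro::Coord::ECI::TLE;

    # Slurp a file of TLE data and let parse() turn it into
    # Astro::Coord::ECI::TLE objects. If the data are in JSON format,
    # the optional JSON module must also be installed.
    my $data = do {
        local $/ = undef;    # slurp mode
        open my $fh, '<', 'tle.dat'
            or die "Unable to open tle.dat: $!";
        <$fh>;
    };
    my @sats = Astro::Coord::ECI::TLE->parse( $data );
    printf "Parsed %d satellites\n", scalar @sats;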
In practice, TLE data have an effective date, and a s
date2jd
decode_space_track_json_time deg2rad distsq dynamical_delta
embodies epoch2datetime find_first_true
fold_case __format_epoch_time_usec
format_space_track_json_time gm_strftime intensity_t
+ ($sec || 0);
}
=item $time = decode_space_track_json_time( $string )

This subroutine decodes a time in the format Space Track uses in their
JSON code. This is ISO-8601-ish, but with a possible fractional part
and no zone.

=cut
sub decode_space_track_json_time {
my ( $string ) = @_;
$string =~ m{ \A \s*
( [0-9]+ ) [^0-9]+ ( [0-9]+ ) [^0-9]+ ( [0-9]+ ) [^0-9]+
( [0-9]+ ) [^0-9]+ ( [