Closed darksidelemm closed 1 year ago
@darksidelemm could we initially make our parser just handle $$CALLSIGN,sentence_id,time,latitude,longitude,altitude, and then have any additional fields yeeted into ukhas.field1, ukhas.field2, ukhas.field3? We'd naturally document in the wiki that this is the limitation of the system for the moment.
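As a rough sketch of what that limited parser could look like (the `field1`/`field2`/... key names are just the hypothetical naming from above, not anything SondeHub currently implements):

```python
def parse_ukhas(sentence: str) -> dict:
    """Minimal UKHAS parser sketch: only the six common leading fields are
    interpreted; any extra fields are kept as raw strings under numbered keys."""
    body = sentence.strip().lstrip("$")
    body, _, _checksum = body.partition("*")  # checksum verification ignored here
    fields = body.split(",")
    if len(fields) < 6:
        raise ValueError("Not enough fields for a UKHAS sentence")
    telemetry = {
        "callsign": fields[0],
        "sentence_id": int(fields[1]),
        "time": fields[2],
        "latitude": float(fields[3]),
        "longitude": float(fields[4]),
        "altitude": float(fields[5]),
    }
    # Everything after altitude becomes field1, field2, ... as raw strings.
    for i, value in enumerate(fields[6:], start=1):
        telemetry[f"field{i}"] = value
    return telemetry
```

This deliberately doesn't try to guess the types of the extra fields, which is exactly the limitation that would need documenting.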
Not sure how to handle the checksum for this either - I guess it's not handled on the client side, so we would need to do it, which means we'd need to look up a flight doc to see the checksum type. Alternatively, we could just enforce CRC16_CCITT?
Finally, do you have details as to how it's actually submitted? Is it an HTTP POST request per line kind of thing?
Given the complexity in the current Habitat payload / flight document system, it would be a bit silly to try and replicate it completely here. I think it would be best if the relevant parts were excised from Habitat and hosted somewhere appropriate. The telemetry parser from Habitat could then push data into the sondehub-amateur DB.
UKHAS strings are of the form:
$$CALLSIGN,field1,field2,...,fieldN*CHECKSUM
Refer https://ukhas.org.uk/communication:protocol for details.
These strings are used by all RTTY payloads (decoded mainly with dl-fldigi, but some other decoders exist) and by LoRa payloads, so it would be good to have some support for them.
While the vast majority of sentences have the common format starting with:
$$CALLSIGN,sentence_id,time,latitude,longitude,altitude
some use different time formats, which can add complexity. Users can also add all sorts of custom fields, with different data types. The Habitat database uses a 'payload document' which describes the fields present in the sentence and how to interpret them. Payload documents are generated here: http://habitat.habhub.org/genpayload/
You can see a live view of the output of the Habitat parsers here: http://habitat.habhub.org/logtail/
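To illustrate the role a payload document plays (this is a heavily cut-down, hypothetical stand-in, not the actual Habitat document schema): it is essentially an ordered description of the custom fields that follow the common six, telling the parser what each one is called and how to interpret it.

```python
# Hypothetical, minimal stand-in for a payload document: an ordered list of
# (field name, type converter) pairs for the fields after the common six.
PAYLOAD_DOC = [
    ("battery_voltage", float),
    ("temperature_internal", float),
    ("satellites", int),
]

def parse_custom_fields(extra_fields, payload_doc):
    """Interpret each extra field using its descriptor; any fields beyond
    the documented ones are kept as raw strings under numbered keys."""
    out = {}
    for i, value in enumerate(extra_fields):
        if i < len(payload_doc):
            name, cast = payload_doc[i]
            out[name] = cast(value)
        else:
            out[f"field{i + 1}"] = value
    return out
```

The real Habitat documents also cover things like time formats and coordinate conventions, which is where most of the complexity referenced above comes from.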
Unsure of the best way to proceed with this. It doesn't really make sense to duplicate the entirety of the Habitat payload document and parser system.