I made a mistake when I changed the input format to be JSON only.

The problem is that tools now can't load normal JSON files, because they expect each line to contain a valid JSON object. So a file that starts with [\n will trigger an error like this:
*** json.decoder.JSONDecodeError: Expecting value: line 2 column 1 (char 2)
Very bad.
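A minimal sketch of the failure mode, assuming the loader uses Python's standard json module (which matches the traceback above): a pretty-printed JSON file begins with a bare "[", which is not a complete JSON value on its own, so a line-at-a-time loader fails immediately.

```python
import json

# A normal pretty-printed JSON file: the first line is just "[",
# which is not a complete JSON value by itself.
whole_file = '[\n  {"id": 1},\n  {"id": 2}\n]\n'

# A jq-style loader parses each line as its own document,
# so it chokes on the very first line:
for line in whole_file.splitlines():
    try:
        record = json.loads(line)
    except json.JSONDecodeError as err:
        print(f"failed on {line!r}: {err}")
        break
```

The same file passed to `json.loads` whole parses fine, which is exactly the mismatch between the two formats.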
I want to support both whole files and jq-style streams, so it seems we need some kind of autodetection. This could be as simple as trying two loaders in sequence by default. We would also need a flag to skip the autodetection.
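The two-loaders-in-sequence idea could look something like this sketch; the function name and the force parameter are hypothetical stand-ins for the loader and the flag described above:

```python
import json

def load_json_auto(text, force=None):
    """Load either a whole JSON document or a jq-style stream of
    one-object-per-line records.

    force is a hypothetical flag to skip the autodetection:
    "file" means whole-document only, "lines" means stream only.
    """
    if force != "lines":
        try:
            doc = json.loads(text)
            # Normalise: a whole-file load yields a list of records.
            # (A single-line file is ambiguous; whole-file wins here.)
            return doc if isinstance(doc, list) else [doc]
        except json.JSONDecodeError:
            if force == "file":
                raise
    # Fall back to (or force) jq-style line-by-line parsing.
    return [json.loads(line) for line in text.splitlines() if line.strip()]
```

A whole-file parse of a multi-line stream fails with "Extra data" after the first value, so the fallback ordering works; the flag is there for inputs where the guess would be wrong.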
This could also provide a way to reintroduce YAML support, as the performance issues with large YAML files will go away if we have a way to tell the parser to treat it as JSON.
It could even provide a way to remove the cpe import command, but I feel like this is inviting too much complexity into the core. The core can deal with multiple serialization formats, but not with different data formats.