Closed Bascy closed 1 year ago
I've added some logging of the free heap to the whole parsing process and this is what I found:
To load the test document from the deserialize.ino example, which contains 6 properties with a total length of 91 characters, at its peak (after calling yaml_parser_load_nodes in loader.c) the parser has reduced the free heap by 7216 bytes. The resulting JSON document is 127 bytes.
To load my document (397 properties, 9277 bytes), at the peak the parser has reduced the free heap by 129420 bytes. The resulting JSON document is 9020 bytes.
I don't think this is ever going to work in large applications running on, e.g., an ESP32 board ...
Or am I doing something wrong? Is it possible to have the parser create the JSON document in parallel with reading the YAML file, so it doesn't need to create a huge node tree in memory before calling deserializeYml_JsonObject()?
I created my own Builder implementation that creates the JSON document without using so much memory, so this issue can be closed.
We are developing a very large project based on the ESP32 and are currently in the process of switching from a JSON-based config file to YAML format because of readability. We are using a YAML file of around 9000 characters.
The YAML file is read from SPIFFS and provided as a Stream. The code that parses the YAML file allocates a DynamicJsonDocument of 20000 bytes and then calls deserializeYml(jsonDocument, stream). We also have ArduinoJson in our project. Right before calling deserializeYml(), ESP.getMaxAllocHeap() returns around 77,000, and still deserializing results in a "Not enough memory" error. I've tried enabling debug or verbose level logging, but that doesn't give me any more output than the "Not enough memory" error message.
Why does it take more than 77 kB to parse a 10,000-byte stream? I would think that the purpose of a stream is to avoid having all the data in memory at once.
If I comment out the "currentcontoller" map at the bottom of the YAML file, the parsing does work ...