Hi
I am trying to process a very large (1.8 GB) XML file with an XSLT-to-JSON stylesheet.
The process works fine with up to 5K records; however, when I feed it the full file with ~180K records, the system runs out of memory (even when I specify -Xmx=4096m).
I have split the XML file into 5K-record chunks using the excellent https://github.com/remuslazar/node-xmlsplit package. Now I am able to process all 35 (or so) chunks in a single run - great!
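To make the question concrete, here is a simplified sketch of the per-chunk loop. The transform() stub stands in for the actual XSLT-to-JSON call (library elided), and the paths and names are illustrative:

    // Simplified sketch of the per-chunk loop. transform() is a stub standing
    // in for the real XSLT-to-JSON call; paths and names are illustrative.
    const fs = require('fs');
    const path = require('path');

    // Stub: the real implementation applies the XSLT stylesheet to the chunk.
    function transform(xml) {
      return '{}';
    }

    function processChunks(chunkDir) {
      const chunks = fs.readdirSync(chunkDir).filter((f) => f.endsWith('.xml'));
      for (const file of chunks) {
        const xml = fs.readFileSync(path.join(chunkDir, file), 'utf8');
        const json = transform(xml);
        fs.writeFileSync(path.join(chunkDir, file.replace(/\.xml$/, '.json')), json);
      }
    }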
However, at the end of the transformation my Node.js process is sitting at around 4 GB of memory. When I run the transformation again (it is supposed to be a daily process), it runs out of memory.
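For reference, I check the footprint with Node's built-in process.memoryUsage(); the logging here is just illustrative:

    // Log the resident set size after a run (process.memoryUsage() is a
    // built-in Node API; the formatting is illustrative).
    const rssMb = Math.round(process.memoryUsage().rss / 1024 / 1024);
    console.log(`RSS after transformation: ${rssMb} MB`); // ~4096 MB in my case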
So my question is: is there a way to release all memory resources after I am done with the XSLT processing?
Right now I am using a pretty ugly workaround: calling process.exit() and letting PM2 restart my program. However, I would like to find a more civilized approach.
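In code, the workaround amounts to this (names and paths are illustrative; autorestart is a standard PM2 option, and the daily scheduling is elided):

    // index.js -- current workaround: run the daily job, then exit so the OS
    // reclaims all memory; PM2 starts the process again.
    processChunks('/data/chunks'); // illustrative path
    process.exit(0);

    // ecosystem.config.js -- PM2 restarts the app after it exits.
    module.exports = {
      apps: [{
        name: 'xslt-daily',   // illustrative name
        script: 'index.js',
        autorestart: true,    // brings the process back after process.exit()
      }],
    };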