Closed bapti closed 7 years ago

Hi,

This is purely a question, so if you don't have time to answer, no worries. I'm building a GEDCOM parser to teach myself how to use nearley (fantastic docs, by the way!). Some GEDCOM files can be massive (>20 MB), so I was wondering whether nearley can handle files that large, or whether there are methods for chunking and streaming the data through the parser?

Thanks, Neil

nearley is streaming by design: you can feed it data incrementally by calling .feed() repeatedly with chunks of input. For large files, I recommend using a tokenizer for performance reasons.

Thanks a million @Hardmath123. If I manage to finish my parser, I'll link back to you as an example.
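For reference, the chunk-by-chunk `.feed()` approach can be sketched like this. The grammar module name (`gedcom.js`) and input file are assumptions, and a stand-in object with the same `.feed()` shape keeps the sketch self-contained; the real nearley and stream calls are shown in comments:

```javascript
// With the real library you would write (hypothetical file names):
//   const nearley = require("nearley");
//   const grammar = require("./gedcom.js"); // compiled with nearleyc
//   const parser = new nearley.Parser(nearley.Grammar.fromCompiled(grammar));
// Stand-in with the same .feed() interface, used here so the sketch runs alone:
const parser = {
  lines: [],
  buf: "",
  feed(chunk) {
    // Accumulate the chunk and consume every complete GEDCOM line in it.
    this.buf += chunk;
    let i;
    while ((i = this.buf.indexOf("\n")) !== -1) {
      this.lines.push(this.buf.slice(0, i));
      this.buf = this.buf.slice(i + 1);
    }
  },
};

// Streaming a large file would feed each chunk as it arrives, e.g.:
//   const stream = require("fs").createReadStream("family.ged", { encoding: "utf8" });
//   stream.on("data", (chunk) => parser.feed(chunk));
//   stream.on("end", () => console.log(parser.results));
// Simulated here with two chunks that split a line mid-token:
["0 HEAD\n0 @I1@ IN", "DI\n1 NAME Neil\n"].forEach((c) => parser.feed(c));
console.log(parser.lines.length); // 3 complete lines seen
```

The point of the pattern is that no step ever needs the whole file in memory: each chunk is handed to `.feed()` as soon as it arrives, and partial lines are carried over until the next chunk completes them.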