kach / nearley

📜🔜🌲 Simple, fast, powerful parser toolkit for JavaScript.
https://nearley.js.org
MIT License

Question: parsing large files #158

Closed · bapti closed 7 years ago

bapti commented 7 years ago

Hi,

This is purely a question - if you don't have time to answer then no worries. I'm building a GEDCOM parser to teach myself how to use nearley (fantastic docs, by the way!). Some GEDCOM files can be massive, >20 MB. I was wondering whether nearley will be able to handle this, or if there are methods for chunking and streaming the data through the parser?

Thanks Neil

kach commented 7 years ago

nearley is streaming by design: you can call .feed() repeatedly, passing a chunk of data each time. For large files, I recommend using a tokenizer for performance reasons.
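A minimal sketch of what this looks like (not from the thread), assuming a grammar already compiled with nearleyc to a hypothetical module ./gedcom-grammar.js and using the modern nearley.Grammar.fromCompiled constructor; the file names are placeholders:

```js
const fs = require("fs");
const nearley = require("nearley");
const grammar = require("./gedcom-grammar.js"); // hypothetical compiled grammar

const parser = new nearley.Parser(nearley.Grammar.fromCompiled(grammar));

// Read the large file as a stream and feed each chunk to the parser.
// The parser keeps its state between .feed() calls, so the file never
// has to be loaded into memory all at once.
const stream = fs.createReadStream("family-tree.ged", { encoding: "utf8" });

stream.on("data", (chunk) => {
  parser.feed(chunk);
});

stream.on("end", () => {
  // parser.results contains every complete parse of the input fed so far.
  console.log(`${parser.results.length} parse(s) found`);
});
```

For the tokenizer, the nearley documentation points to moo, which is declared in the grammar file and then drives the parser's input instead of raw characters.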

bapti commented 7 years ago

Thanks a million @Hardmath123 - if I manage to finish my parser, I'll link back to you as an example.