This was motivated by noticing that, as I added states to a reasonably sized program, parsing got slower and slower until the browser refused to allocate any more memory and the tab crashed. The machine in question had only 14 states or so and was about 70 lines long, with several comments.
Tokenizes integers, signal names, and whitespace into single chunks, resulting in much smaller parsing tables.
See: https://nearley.js.org/docs/tokenizers
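For reference, this is roughly the shape of the change, following the moo-based setup from the nearley docs linked above. The token patterns and the sample rule are illustrative guesses (the comment syntax in particular), not the exact grammar:

    @{%
    const moo = require("moo");

    const lexer = moo.compile({
      ws:      /[ \t]+/,                 // a run of spaces/tabs is one token
      comment: /#[^\n]*/,                // hypothetical comment syntax
      int:     /[0-9]+/,                 // a whole integer literal is one token
      signal:  /[A-Za-z_][A-Za-z0-9_]*/, // a whole signal name is one token
      nl:      { match: /\n/, lineBreaks: true },
    });
    %}

    @lexer lexer

    # Terminals now match whole tokens (%signal, %int) rather than
    # single characters. Example rule, not the real grammar:
    line -> %signal %ws %int {% ([name, , val]) =>
              ({ signal: name.value, value: Number(val.value) }) %}

With a lexer attached, each terminal consumes a whole token, so the Earley chart grows with the token count of the input rather than its character count.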