diku-dk / alpacc


Lexer optimization #7

Closed WilliamDue closed 8 months ago

WilliamDue commented 8 months ago

The lexer is now faster and uses less memory: with the OpenCL backend, a LISP lexer that used to take 1.4 seconds to tokenize a 1 GiB file now takes 0.3 seconds.
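For a sense of scale, the figures above imply roughly a 4.7x speedup and about 3.3 GiB/s of tokenization throughput; a quick sketch of that arithmetic (the timings are from the comment, the derived numbers are just computed from them):

```python
# Speedup and throughput implied by the reported timings (1 GiB input).
before_s = 1.4   # seconds, old lexer (reported)
after_s = 0.3    # seconds, optimized lexer (reported)

speedup = before_s / after_s      # ~4.7x faster
throughput_gib_s = 1.0 / after_s  # GiB tokenized per second, ~3.3

print(f"speedup: {speedup:.1f}x")
print(f"throughput: {throughput_gib_s:.1f} GiB/s")
```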

Optimizations: