no-context / moo

Optimised tokenizer/lexer generator! 🐄 Uses /y for performance. Moo.
BSD 3-Clause "New" or "Revised" License

Remove feed() and add remaining() #44

Closed tjvr closed 7 years ago

tjvr commented 7 years ago

Closes #43.

nathan commented 7 years ago

What is remaining() good for?

tjvr commented 7 years ago

It's supposed to help implement the streaming strategies in #43. But now you mention it, I suppose you can work around this by taking token.index + token.size, and keeping a copy of the buffer yourself.

nathan commented 7 years ago

So if you're using streams you'd now do the following?

// Requires assume a line-splitting stream such as the split package.
const fs = require('fs')
const { Transform } = require('stream')
const split = require('split')
const moo = require('moo')

const lexer = moo.compile({ … })

let state
fs.createReadStream(INPUT)
  .pipe(split(…))
  .pipe(new Transform({
    readableObjectMode: true,
    transform(chunk, _, cb) {
      lexer.reset(chunk.toString(), state)
      for (const t of lexer) this.push(t)
      state = lexer.save()
      cb()
    },
  }))
EDIT: added `state` handling

tjvr commented 7 years ago

Yes, although you need to save() state between chunks and restore it.