Of course you have access to `Token` and `Type` too, so you can do what you want with the token stream, including turning bytes into base64 strings, kind of like the Go json encoder does transparently 🤮.
```js
class Base64Tokeniser extends Tokenizer {
  next () {
    const nextToken = super.next()
    if (nextToken.type === Type.bytes) {
      return new Token(Type.string, Buffer.from(nextToken.value).toString('base64'))
    }
    return nextToken
  }
}
```
Plus you could do the reverse on encode, since the json encoder here can be overridden in various ways to perform transformations, like we do for dag-json. Whatever floats your boat. At least with access to the token stream you can halt early, before wasting time processing something big that's going to error later.
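For illustration, the reverse direction boils down to a token filter that maps base64 strings back to byte tokens. Here's a self-contained sketch; the `Token` and `Type` shapes below are simplified stand-ins for this example, not cborg's actual exports:

```javascript
// Simplified stand-ins for Token/Type, just to demonstrate the transform
const Type = { string: 'string', bytes: 'bytes' }
class Token {
  constructor (type, value) {
    this.type = type
    this.value = value
  }
}

// The inverse of the Base64Tokeniser above: on encode, turn base64
// string tokens back into byte tokens
function base64ToBytes (token) {
  if (token.type === Type.string) {
    return new Token(Type.bytes, Buffer.from(token.value, 'base64'))
  }
  return token
}

// Round-trip: bytes -> base64 string token -> bytes token
const original = Buffer.from([1, 2, 3])
const asString = new Token(Type.string, original.toString('base64'))
const back = base64ToBytes(asString)
console.log(Buffer.compare(back.value, original) === 0) // true
```

In the real thing you'd hook a transform like this into the encoder's token handling rather than calling it directly, but the mapping itself is this simple.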
Hopefully there's enough flexibility here to do creative things without making too many holes in your foot in the process.
Closes: https://github.com/rvagg/cborg/issues/110
What do you reckon @achingbrain?
Here's my full quick example work-up, part of which I included in the README: