Open mattdangerw opened 2 months ago
JetStream is throughput-optimized generation software for JAX and TPUs. We should add support for JetStream generation for all KerasHub generative models.
The most useful starting reference for us will be the MaxEngine implementation here.

Looked at this a bit. I think #1861 will be an important precursor to make implementing this reasonable. I also think we might want to consider starting on some of #1862 to add support, for example, for passing custom token position inputs, but I doubt that's a strict requirement.
It's unclear to me whether JetStream supports tokenization beyond SentencePiece and GPT-style BPE; see this for MaxText. This is something to look into.
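To make the scope concrete, here is a rough sketch of the kind of adapter surface a KerasHub model would need to expose to a JetStream-style engine: a prefill pass over the prompt that produces decode state (e.g. a KV cache), followed by repeated single-token decode steps. The class and method names (`GenerationEngine`, `prefill`, `decode_step`) are illustrative assumptions, not JetStream's actual API; the real interface lives in JetStream's engine code and the MaxText implementation.

```python
from abc import ABC, abstractmethod
from typing import Any

class GenerationEngine(ABC):
    """Hypothetical adapter interface; names are illustrative, not JetStream's API."""

    @abstractmethod
    def prefill(self, token_ids: list[int]) -> Any:
        """Run the prompt through the model, returning decode state (e.g. KV cache)."""

    @abstractmethod
    def decode_step(self, state: Any) -> tuple[int, Any]:
        """Produce the next token id and the updated decode state."""

class EchoEngine(GenerationEngine):
    """Toy engine that replays the prompt, just to exercise the interface shape."""

    def prefill(self, token_ids):
        return {"tokens": list(token_ids), "pos": 0}

    def decode_step(self, state):
        token = state["tokens"][state["pos"] % len(state["tokens"])]
        state["pos"] += 1
        return token, state

engine = EchoEngine()
state = engine.prefill([1, 2, 3])
out = []
for _ in range(4):
    tok, state = engine.decode_step(state)
    out.append(tok)
print(out)  # [1, 2, 3, 1]
```

A real adapter would back `prefill` and `decode_step` with the model's forward pass and cache, and the engine would batch and schedule these calls for throughput; this sketch only shows the control flow.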