Open janjanusek opened 4 hours ago
The architecture is LlamaForCausalLM, which is supported by the model builder. You can use the model builder to export the ONNX model and then run the generation loop with ONNX Runtime GenAI.
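Concretely, the export-then-generate flow could look roughly like this. This is a sketch, not a tested recipe: the builder flags, output path, and the exact `og` calls are assumptions based on recent onnxruntime-genai releases and may differ in your version.

```python
# Step 1 (shell, illustrative): export the model with the model builder, e.g.
#   python -m onnxruntime_genai.models.builder \
#       -m OuteAI/OuteTTS-0.1-350M -o ./outetts-onnx -p fp32 -e cpu
#
# Step 2: run a token-by-token generation loop with ONNX Runtime GenAI.

def generate(model_dir: str, prompt: str, max_length: int = 256) -> str:
    # Imported lazily so this sketch is readable without the package installed.
    import onnxruntime_genai as og

    model = og.Model(model_dir)
    tokenizer = og.Tokenizer(model)

    params = og.GeneratorParams(model)
    params.set_search_options(max_length=max_length)

    generator = og.Generator(model, params)
    generator.append_tokens(tokenizer.encode(prompt))

    # Generate until EOS or max_length is reached.
    while not generator.is_done():
        generator.generate_next_token()

    return tokenizer.decode(generator.get_sequence(0))
```

If the built-in tokenizer has trouble with OuteTTS, the `tokenizer.encode`/`tokenizer.decode` calls above are the pieces you would swap for Hugging Face's tokenizer.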
The CTC alignment for the output tokens would have to be done outside of ONNX Runtime GenAI, as that support is not currently there. If you run into any ONNX Runtime GenAI tokenizer issues with OuteTTS, you can report them here and use Hugging Face's tokenizer instead. Please feel free to open a PR to contribute as well (e.g. to add support for CTC alignment in ONNX Runtime GenAI).
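Since CTC alignment has to happen outside the library for now, here is a minimal pure-Python post-processing sketch of the idea: collapsing frame-level argmax predictions into per-token frame spans. The blank id and the toy frame ids are hypothetical placeholders, not values from the OuteTTS model.

```python
def ctc_align(frame_ids, blank=0):
    """Collapse per-frame CTC argmax ids into (token_id, start_frame, end_frame)
    spans. Consecutive repeats of a token merge into one span; blanks are
    dropped and also separate genuine repeats of the same token."""
    spans = []
    prev = blank
    for i, tok in enumerate(frame_ids):
        if tok != blank and tok != prev:
            spans.append([tok, i, i + 1])  # a new token starts at frame i
        elif tok != blank:
            spans[-1][2] = i + 1           # extend the current token's span
        prev = tok
    return [tuple(s) for s in spans]

# Frame-level predictions: blank, A, A, blank, B, B, C
print(ctc_align([0, 1, 1, 0, 2, 2, 3]))  # [(1, 1, 3), (2, 4, 6), (3, 6, 7)]
```

The spans can then be converted to timestamps by multiplying frame indices by the model's frame duration.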
https://huggingface.co/OuteAI/OuteTTS-0.1-350M

Is it possible to use ONNX Runtime GenAI for this model? It's Llama 3-based but slightly different 🤷🏼♂️ Can we add support?