YuriyGavrilov opened this issue 2 weeks ago
vLLM supports encoder-decoder models, but not flan-t5 AFAICT.
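For reference, a minimal sketch of running flan-t5 directly through Hugging Face transformers as a stopgap while vLLM lacks support; the `google/flan-t5-base` checkpoint and the prompt are illustrative assumptions, not from this thread:

```python
# Minimal flan-t5 generation via Hugging Face transformers.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "google/flan-t5-base"  # illustrative checkpoint; any flan-t5 size loads the same way
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

prompt = "Translate English to German: How old are you?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```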
TBH, SLMs are a lot better now and outperform T5; small models such as Phi-3 and Gemma would suffice.
It seems T5 is supported by llama.cpp: https://github.com/ggerganov/llama.cpp/pull/8141. @aarnphm any idea?
I mean, we can add support for T5, but my feeling is that for encoder-decoder models there must be a better option?
Just about to add these models to the list.