turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License
3.53k stars 272 forks

Jamba support #392

Closed theyunt closed 5 months ago

theyunt commented 6 months ago

any eta for jamba-v0.1 support?

turboderp commented 5 months ago

Yeah, sorry, I never got around to answering. :)

It's probably not in the cards right now. It's a very big departure from the other supported architectures, and with so many transformer models coming out (Llama3 next week, apparently), I just can't make Jamba a priority.