mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: text, audio, video and image generation, voice cloning, distributed inference
https://localai.io
MIT License

feat: support gptneox and falcon in the llama.cpp backend #1009

Closed by mudler 10 months ago

mudler commented 1 year ago

Falcon and gptneox seem to be supported in llama.cpp via a separate example; it would be really cool to merge that into the binding and make it transparent to the user (loading either falcon or gptneox depending on the model): https://github.com/ggerganov/llama.cpp/tree/master/examples/gptneox-wip
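
For context, the transparency being asked for amounts to a single load path that dispatches to the right architecture-specific loader behind one API. Below is a minimal Go sketch of that dispatch pattern; the loader functions (`loadLlama`, `loadFalcon`, `loadGPTNeoX`) and the `Model` type are hypothetical placeholders, not the actual go-llama.cpp binding API:

```go
package main

import (
	"fmt"
	"strings"
)

// Model is a placeholder for whatever handle the binding would return.
type Model struct{ arch string }

// Hypothetical architecture-specific loaders; in the real binding these
// would wrap the corresponding llama.cpp model-loading code paths.
func loadLlama(path string) (*Model, error)   { return &Model{"llama"}, nil }
func loadFalcon(path string) (*Model, error)  { return &Model{"falcon"}, nil }
func loadGPTNeoX(path string) (*Model, error) { return &Model{"gptneox"}, nil }

// loaders maps an architecture name (as detected from the model file or
// supplied in the model config) to the matching loader.
var loaders = map[string]func(string) (*Model, error){
	"llama":   loadLlama,
	"falcon":  loadFalcon,
	"gptneox": loadGPTNeoX,
}

// Load picks the right loader, so callers never need to care which
// upstream example binary an architecture originally shipped with.
func Load(path, arch string) (*Model, error) {
	loader, ok := loaders[strings.ToLower(arch)]
	if !ok {
		return nil, fmt.Errorf("unsupported architecture %q", arch)
	}
	return loader(path)
}

func main() {
	m, err := Load("falcon-7b.bin", "falcon")
	if err != nil {
		panic(err)
	}
	fmt.Println("loaded model with architecture:", m.arch)
}
```

The design point is that adding a new architecture only means registering one more entry in the map; the user-facing `Load` call stays the same.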

yugeeklab commented 11 months ago

Hi there! I've been following your project with great interest, and I'd love to contribute by tackling this issue. Is it okay if I take it on?

mudler commented 11 months ago

Sure thing, please go ahead!

mudler commented 10 months ago

This should already be working; closing for now.