ollama / ollama

Get up and running with Llama 3.2, Mistral, Gemma 2, and other large language models.
https://ollama.com
MIT License

DeciLM-7B Support #1714

Open kristianpaul opened 9 months ago

kristianpaul commented 9 months ago

This was mentioned on the Last Week in AI podcast; it seems worth checking whether it can be run via Ollama.

“DeciLM-7B is a 7.04 billion parameter decoder-only text generation model, released under the Apache 2.0 license. At the time of release, DeciLM-7B is the top-performing 7B base language model on the Open LLM Leaderboard.”

https://huggingface.co/Deci/DeciLM-7B

easp commented 9 months ago

Ollama uses llama.cpp to run models. There has been some talk of adding support for DeciLM's model architecture in llama.cpp, but it doesn't seem like anything has come of it yet: https://github.com/ggerganov/llama.cpp/issues/3208

avilum commented 9 months ago

Hey guys, we uploaded GGUF versions of the model in fp32, fp16, and q8_0 to this model card: https://huggingface.co/Deci/DeciLM-7B-instruct-GGUF
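
Once llama.cpp (and therefore Ollama) supports the DeciLM architecture, importing one of these GGUF files should follow Ollama's usual Modelfile flow. A minimal sketch, assuming a locally downloaded file named decilm-7b-instruct.q8_0.gguf (the file and model names here are illustrative; use whichever quantization you grabbed from the model card):

# Modelfile — point Ollama at the local GGUF file
FROM ./decilm-7b-instruct.q8_0.gguf

# build a local model from the Modelfile, then run it
ollama create decilm-7b-instruct -f Modelfile
ollama run decilm-7b-instruct "Summarize what DeciLM-7B is."

Note that until the architecture lands in llama.cpp, the create/run step will fail to load the model even though the GGUF file itself is valid.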