mouse33333 closed this issue 3 months ago
+1. Please update the supported version of Ollama. The current version is 0.2.7, but ipex-llm only supports up to 0.1.39.
Also love this idea. I enjoy running models locally and would love to play with DeepSeek-V2 and Gemma2.
Hi all, Gemma2 is supported with ipex-llm[cpp] >= 2.1.0b20240724; you may try it again :)
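If you are unsure whether your installed build is recent enough, a minimal sketch of a version check follows. It assumes ipex-llm was installed via pip and that the `packaging` library is available; the minimum version string comes from the comment above, and the helper name `supports_gemma2` is just illustrative.

```python
# Sketch: check whether the installed ipex-llm build meets the minimum
# version reported to support Gemma2 (2.1.0b20240724, per the thread above).
from importlib.metadata import version, PackageNotFoundError
from packaging.version import Version

# Minimum build mentioned in this thread as supporting Gemma2.
MIN_GEMMA2 = Version("2.1.0b20240724")

def supports_gemma2() -> bool:
    """Return True if the installed ipex-llm version is >= the Gemma2 minimum."""
    try:
        return Version(version("ipex-llm")) >= MIN_GEMMA2
    except PackageNotFoundError:
        # ipex-llm is not installed in this environment.
        return False

if __name__ == "__main__":
    print("Gemma2-capable ipex-llm installed:", supports_gemma2())
```

If the check returns False, upgrading to a nightly build at or above `2.1.0b20240724` should resolve the `unknown model architecture: 'gemma2'` error.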
It works now! Thank you for your support!
Hi
I can run Llama3, Qwen, and other models on Ollama with ipex-llm. However, I cannot run Gemma2 on Ollama with ipex-llm. It shows unknown model architecture: 'gemma2'. Does that mean this model is not supported by ipex-llm?
Here are the details:
Thank you for your help in advance~