Open sundeepChandhoke opened 2 months ago
I do not have an Intel GPU on this machine, so maybe it is expecting a GPU? I am trying to test AMX instructions on a 4th gen Xeon processor.
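As a side note, you can verify whether the CPU actually exposes AMX before testing. This is a minimal sketch assuming a Linux system; on 4th gen Xeon (Sapphire Rapids) the relevant CPUID flags are typically `amx_tile`, `amx_bf16`, and `amx_int8`:

```shell
# List any AMX-related feature flags the kernel reports for this CPU.
# An empty result means the CPU (or kernel) does not expose AMX.
grep -oE 'amx[a-z_0-9]*' /proc/cpuinfo | sort -u
```

If the flags are missing despite the hardware supporting them, an older kernel that predates AMX enablement is a common cause.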
Hi @sundeepChandhoke, yes, Ollama here expects a GPU; it does not currently support running on a device without one.
I am getting this error after installing IPEX-LLM and oneAPI. I can successfully serve Ollama and pull models, but when I try to run any model, I get this error.