likelovewant / ollama-for-amd

Get up and running with Llama 3, Mistral, Gemma, and other large language models, by adding more AMD GPU support.
https://ollama.com
MIT License

Intel Mac AMD GPU #23

Open · TomDev234 opened this issue 3 days ago

TomDev234 commented 3 days ago

Can you add support for Intel Macs with Metal 3-capable AMD GPUs?

likelovewant commented 3 days ago

This fork builds on upstream Ollama code; it only adds a few lines to extend the supported GPU list on Windows, because the official release cannot do this due to the lack of ROCm libraries for those GPUs on Windows. Linux and Mac may already have a possible solution from upstream Ollama. If you cannot use AMD GPUs on Macs via upstream Ollama, you can refer to the solutions discussed at https://github.com/ollama/ollama/issues/1016, or simply build from source (see the sketch below). There have also been plenty of success cases with llama.cpp if you cannot get it running via the Ollama darwin release.
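
If you go the build-from-source route, the steps below follow the general developer build flow upstream Ollama has documented (a Go toolchain and the Xcode command-line tools are assumed). Treat this as a rough sketch rather than exact, version-pinned instructions, and check the upstream repo's development docs for the current steps:

```sh
# Clone the upstream Ollama repository and enter it
git clone https://github.com/ollama/ollama.git
cd ollama

# Generate the bundled llama.cpp build artifacts, then compile the Go binary.
# On macOS this builds with Metal support by default; whether an Intel Mac's
# AMD GPU is actually picked up still depends on upstream's Metal support.
go generate ./...
go build .

# Run the locally built binary
./ollama serve
```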