xNul / code-llama-for-vscode

Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.

Does this mean Code Llama only runs on the CPU with this project? #11

Closed bonuschild closed 2 months ago

bonuschild commented 11 months ago

I saw that you mock llama.cpp, but I still have GPU resources available, even though I also have enough CPU and RAM.

I just want to figure out the right scenario for deploying it.

xNul commented 2 months ago

Code Llama runs on the GPU if you set up the codellama repository correctly. You have to install PyTorch with CUDA support.
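
For reference, here's a quick way to check that your PyTorch install can actually see the GPU (the cu121 wheel index below is just an example; pick the one matching your CUDA version):

```python
# Sanity check: verify PyTorch was installed with CUDA support.
# If this prints False, reinstall PyTorch with a CUDA wheel, e.g.:
#   pip install torch --index-url https://download.pytorch.org/whl/cu121
import torch

print(torch.cuda.is_available())           # True if a CUDA-capable GPU is usable
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # name of the first visible GPU
```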

I'll close this issue since it has been so long and this probably answers your question. Let me know if you'd like to reopen it.