Hey @MyraBaba, here is how you can run local Ollama models with Phidata :)
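For reference, cookbook/llms/ollama/assistant.py boils down to something like this (a minimal sketch; the prompt and the default model tag in your checkout may differ):

```python
from phi.assistant import Assistant
from phi.llm.ollama import Ollama

# Point the assistant at a locally served Ollama model.
# The model tag must match one you have pulled, e.g. "llama3.1:8b".
assistant = Assistant(llm=Ollama(model="llama3.1:8b"))
assistant.print_response("Share a two-sentence horror story.", markdown=True)
```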
@WilliamEspegren
```
(venvPhiData) redel@RedElephant:~/Projects/phidata$ python cookbook/llms/ollama/assistant.py
⠋ Working...
Traceback (most recent call last):
  File "/home/redel/Projects/phidata/cookbook/llms/ollama/assistant.py", line 9, in <module>
```
I am serving Ollama locally with llama3.1:8b, but I get the errors above.
@MyraBaba have you pulled it by running 'ollama run llama3.1'?
ollama run llama3.1:8b is running
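Worth double-checking: the tag passed to Ollama(model=...) must match the pulled tag exactly, or Ollama will report the model as missing. A hedged example, assuming the cookbook script defaults to a different tag than the one you pulled:

```python
from phi.llm.ollama import Ollama

# If the script defaults to e.g. "llama3" but you pulled "llama3.1:8b",
# the names won't match; pass the exact tag you pulled.
llm = Ollama(model="llama3.1:8b")
```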
@MyraBaba can you share your code?
Hi,
How do I set the model to llama3.1:8b for the local RAG example?
I can't find a convenient way to do this.
Hey,
Just edit the file and add "llama3.1" to the models list (see the sketch below):
cookbook/llms/ollama/rag/app.py
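A sketch of the edit, assuming the app picks the model from a Streamlit selectbox; the exact variable name and option list in your checkout may differ:

```python
# cookbook/llms/ollama/rag/app.py (excerpt, hypothetical option list)
import streamlit as st

llm_model = st.sidebar.selectbox(
    "Select Model",
    options=["llama3.1:8b", "llama3", "openhermes"],  # add the tag you pulled
)
```

After saving, restart the Streamlit app and pick llama3.1:8b from the sidebar.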