phidatahq / phidata

Build AI Agents with memory, knowledge, tools and reasoning. Chat with them using a beautiful Agent UI.
https://docs.phidata.com
Mozilla Public License 2.0

llama3.1:8b #1103

Closed MyraBaba closed 3 weeks ago

MyraBaba commented 2 months ago

Hi,

How do I set the model to llama3.1:8b for the Local RAG example?

I can't find a convenient way to do this.

WilliamEspegren commented 2 months ago

Hey @MyraBaba, here is how you can run local Ollama models with Phidata :)

MyraBaba commented 2 months ago

@WilliamEspegren

(venvPhiData) redel@RedElephant:~/Projects/phidata$ python cookbook/llms/ollama/assistant.py
⠋ Working...
Traceback (most recent call last):
  File "/home/redel/Projects/phidata/cookbook/llms/ollama/assistant.py", line 9, in <module>
    assistant.print_response("Share a quick healthy breakfast recipe.", markdown=True)
  File "/home/redel/Projects/phidata/venvPhiData/lib/python3.10/site-packages/phi/assistant/assistant.py", line 1473, in print_response
    for resp in self.run(message=message, messages=messages, stream=True, **kwargs):
  File "/home/redel/Projects/phidata/venvPhiData/lib/python3.10/site-packages/phi/assistant/assistant.py", line 891, in _run
    for response_chunk in self.llm.response_stream(messages=llm_messages):
  File "/home/redel/Projects/phidata/venvPhiData/lib/python3.10/site-packages/phi/llm/ollama/chat.py", line 271, in response_stream
    for response in self.invoke_stream(messages=messages):
  File "/home/redel/Projects/phidata/venvPhiData/lib/python3.10/site-packages/phi/llm/ollama/chat.py", line 96, in invoke_stream
    yield from self.client.chat(
  File "/home/redel/Projects/phidata/venvPhiData/lib/python3.10/site-packages/ollama/_client.py", line 84, in _stream
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: model "llama3" not found, try pulling it first

I am serving llama3.1:8b locally with Ollama, but I get the error above.
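
The traceback is a tag-matching problem, not a server problem: the cookbook script defaults to the tag "llama3", and Ollama resolves models only by the exact tag that was pulled, so "llama3.1:8b" being available does not satisfy a request for "llama3". A pure-Python illustration of that lookup (`resolve_model` is a hypothetical helper, not phidata or ollama library code):

```python
# Hypothetical helper mimicking Ollama's exact-tag model lookup; this is
# illustration only, not phidata or ollama library code.
def resolve_model(requested: str, pulled: list[str]) -> str:
    """Return the requested tag if it was pulled, else fail like the server."""
    if requested not in pulled:
        raise ValueError(f'model "{requested}" not found, try pulling it first')
    return requested

pulled = ["llama3.1:8b"]               # what `ollama run llama3.1:8b` provides
resolve_model("llama3.1:8b", pulled)   # matches the pulled tag exactly
# resolve_model("llama3", pulled)      # would raise: the default tag was never pulled
```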

WilliamEspegren commented 2 months ago

@MyraBaba have you pulled it by running 'ollama run llama3.1'?

MyraBaba commented 2 months ago

ollama run llama3.1:8b is running


WilliamEspegren commented 2 months ago

@MyraBaba can you share your code?

reinside commented 1 month ago

> Hi,
>
> How do I set the model to llama3.1:8b for the Local RAG example?
>
> I can't find a convenient way to do this.

Hey,

Just edit the file cookbook/llms/ollama/rag/app.py, adding "llama3.1" to the models list.
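
A sketch of the kind of edit meant here (the variable names and the commented Streamlit call are assumptions for illustration, not the actual contents of cookbook/llms/ollama/rag/app.py):

```python
# Hypothetical excerpt in the spirit of cookbook/llms/ollama/rag/app.py;
# the real file's variable names may differ.
model_options = ["llama3", "openhermes", "llama3.1:8b"]  # add your full tag here

# The app would then offer it in its model picker, e.g. via Streamlit:
# llm_model = st.sidebar.selectbox("Select Model", options=model_options)
llm_model = model_options[-1]
```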