Closed nahidalam closed 1 month ago
I tried testing the basic script below, but it throws an error that model llama2 is not found:

```python
import ollama

response = ollama.chat(model='llama2', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])
```
```
raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: model 'llama2' not found, try pulling it first
```
If I use `model='llama3'`, it works fine.
Try pulling the llama2 model from the Ollama library first.
In the shell:
```shell
ollama pull llama2
```
Or in Python:
```python
import ollama

ollama.pull('llama2')
```
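If you want the script to handle this automatically, you can catch the `ResponseError` and pull the model on demand. A minimal sketch, assuming the `ollama` Python package is installed and a local Ollama server is running (the `chat_with_fallback` helper name is my own, not part of the library):

```python
def chat_with_fallback(model: str, prompt: str) -> str:
    """Chat with a model, pulling it first if it is missing locally."""
    import ollama  # assumes the ollama package and a running local server

    messages = [{'role': 'user', 'content': prompt}]
    try:
        response = ollama.chat(model=model, messages=messages)
    except ollama.ResponseError as err:
        # 404 means the model is not present locally, so pull it and retry.
        if err.status_code == 404:
            ollama.pull(model)
            response = ollama.chat(model=model, messages=messages)
        else:
            raise
    return response['message']['content']
```

Note that `ollama.pull` downloads the full model, so the first call can take a while depending on the model size and your connection.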