ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License
2.67k stars · 220 forks

ollama._types.ResponseError: model 'llama2' not found #149

Closed. nahidalam closed this 1 month ago.

nahidalam commented 1 month ago

I tried testing the basic script below, but it throws an error saying that model 'llama2' is not found:

import ollama
response = ollama.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
The call fails with:

    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: model 'llama2' not found, try pulling it first

If I use model='llama3' instead, it works fine.

n-vent commented 1 month ago

Try pulling the llama2 model first from the Ollama library:

In the shell:

ollama pull llama2

Or in Python:

import ollama
ollama.pull('llama2')
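
The pull-then-retry pattern above can also be wrapped in a small helper so a missing model is pulled automatically on first use. This is a sketch, not part of the ollama package: the `chat_with_autopull` function and its `client`/`error_cls` parameters are hypothetical, written so the client object is injectable; with the real library you would pass the `ollama` module itself and `ollama.ResponseError`.

```python
def chat_with_autopull(client, model, messages, error_cls=Exception):
    """Call client.chat; if the model is missing, pull it once and retry.

    `client` is anything exposing chat(model=..., messages=...) and
    pull(model) -- e.g. the `ollama` module. `error_cls` is the exception
    type raised for a missing model (ollama.ResponseError in practice).
    """
    try:
        return client.chat(model=model, messages=messages)
    except error_cls as e:
        # Only retry for the "model not found" case; re-raise anything else.
        if "not found" not in str(e):
            raise
        client.pull(model)  # download the model, then retry the chat once
        return client.chat(model=model, messages=messages)
```

Usage with the real library would look like `chat_with_autopull(ollama, 'llama2', messages, ollama.ResponseError)`. Note that the pull blocks until the model download finishes, which can take a while on the first call.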