ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

README doesn't mention that a running ollama server is required #29

Open jmccrosky opened 5 months ago

jmccrosky commented 5 months ago

It seems I'm not the only one who looked at the README and assumed that the library takes care of running the backend, resulting in a "Connection refused" error when trying the example code in the README. If I understand correctly, I need to run the ollama server first. This should perhaps be made clear in the README.
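The "Connection refused" failure described above can be detected up front. A minimal stdlib sketch, assuming the server's default address (11434 is Ollama's documented default port; the host, port, and function name here are illustrative, not part of the library):

```python
# Hedged sketch: probe the default Ollama address before using the client,
# so a missing server produces a clear message instead of "Connection refused".
# Host and port defaults are assumptions; 11434 is Ollama's default port.
import socket


def is_server_up(host: str = "127.0.0.1", port: int = 11434,
                 timeout: float = 1.0) -> bool:
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # connection refused, timed out, unreachable, etc.
        return False


# Possible usage before calling the library:
#   if not is_server_up():
#       raise SystemExit("No Ollama server found; start one (e.g. `ollama serve`) first.")
```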

g1ra commented 5 months ago

This must be very frustrating for most users who are just starting out.

diegodmb commented 5 months ago

How can I run the Ollama server? I'm encountering this error in my Django app: [WinError 10061] No connection could be made because the target machine actively refused it. Thank you so much for your help.

diegodmb commented 4 months ago

I solved it by running the Docker container, but it runs really slowly even with a light model like Phi.
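For reference, a sketch of the Docker route mentioned above, based on Ollama's published container image (CPU-only, which likely explains the slowness; GPU support needs extra flags such as `--gpus=all` with the NVIDIA container toolkit):

```shell
# Start the Ollama server in Docker, persisting models in a named volume
# and exposing the default port 11434 to the host.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull and chat with a small model inside the running container.
docker exec -it ollama ollama run phi
```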

connor-makowski commented 4 months ago

You need to have a local Ollama server running to be able to continue.
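A sketch of those setup steps (the install one-liner is Ollama's published Linux script; on macOS or Windows, use the installer from https://ollama.com/download instead):

```shell
# Install Ollama (Linux; script from the official site).
curl -fsSL https://ollama.com/install.sh | sh

# Start the server; it listens on 127.0.0.1:11434 by default.
ollama serve

# In another terminal, pull a model for the Python examples to use.
ollama pull llama3
```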

There is a PR to add this to the docs (mentioned above).

connor-makowski commented 4 months ago

I had to update the README while working on iterative chats (chats with history), so I ended up migrating these changes into the PR here: https://github.com/ollama/ollama-python/pull/64
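The iterative-chat pattern mentioned above can be sketched independently of the server: keep a running `messages` list and append each user turn and assistant reply. Here `send` is a placeholder; with the real library against a running server it would be something like `ollama.chat(model="llama3", messages=history)["message"]["content"]` (the helper name and stub are assumptions for illustration):

```python
# Hedged sketch of chat-with-history: the caller owns the message list,
# and each turn appends both the user message and the model's reply,
# so the full conversation is sent to the model every time.
from typing import Callable, Dict, List

Message = Dict[str, str]


def chat_turn(history: List[Message], user_text: str,
              send: Callable[[List[Message]], str]) -> str:
    """Append the user turn, fetch a reply, record it, and return it."""
    history.append({"role": "user", "content": user_text})
    reply = send(history)  # e.g. a wrapper around ollama.chat(...)
    history.append({"role": "assistant", "content": reply})
    return reply
```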