ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

[query] Is it not possible to run the ollama Python library in a Colab notebook? #67

Closed (timtensor closed this 4 months ago)

timtensor commented 4 months ago

Hi, I am getting an error while running the ollama Python library in a Colab notebook. Is it not possible to do so? I am getting a connection error.

Perhaps it is not meant to work that way?

connor-makowski commented 4 months ago

I think the intention is that Ollama runs locally on your machine. If you are using Colab, you would need to initialize a client that connects to a running Ollama instance that accepts external connections.

I assume that exposing an Ollama server to the open internet would involve some security challenges, but assuming it is running somewhere, you could do:

from ollama import Client
client = Client(host='https://YourExternalClient.com')
response = client.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
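
If the remote host responds, the model's reply can then be read from the returned message, something like the line below (a minimal sketch; the exact response shape may differ between library versions):

print(response['message']['content'])
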

mxyng commented 4 months ago

This library is intended to be the client library. It requires a running Ollama instance to function. If that's what you're interested in, there's an example notebook in the ollama/ollama repo you can check out: https://github.com/ollama/ollama/tree/main/examples/jupyter-notebook

Otherwise, it is as @connor-makowski says above.
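
For reference, a minimal sketch of that notebook-style setup, assuming a Linux runtime with shell access, the official install script at https://ollama.com/install.sh, and the llama2 model (all assumptions, adjust as needed): install Ollama, start the server in the background, pull a model, then use the default client against localhost.

import subprocess
import time

import ollama  # assumes `pip install ollama` has already been run

# Install the Ollama binary (Linux runtime with shell access assumed).
subprocess.run('curl -fsSL https://ollama.com/install.sh | sh', shell=True, check=True)

# Start the Ollama server in the background and give it a moment to come up.
server = subprocess.Popen(['ollama', 'serve'])
time.sleep(5)

# Pull a model, then chat with it through the default client (localhost:11434).
subprocess.run(['ollama', 'pull', 'llama2'], check=True)
response = ollama.chat(model='llama2', messages=[
  {'role': 'user', 'content': 'Why is the sky blue?'},
])
print(response['message']['content'])

Exposing such an instance beyond the notebook still raises the security concerns mentioned above.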