lennartpollvogt / ollama-instructor

Python library for instructing LLMs and reliably validating their structured outputs (JSON) with Ollama and Pydantic. Deterministic work with LLMs.
MIT License

Use ollama-instructor with local ollama models instead of API? #10

Open LisaGsGithub opened 1 month ago

LisaGsGithub commented 1 month ago

Thank you for your work. I am having issues with an unstable connection when using the Ollama API. Is it possible to use your code with a local Ollama model?

lennartpollvogt commented 1 month ago

Hi @LisaGsGithub,

thank you very much!

What do you mean by "Ollama API"? Are you running the Ollama application (https://ollama.com/download) on a web server, or have you installed it locally on your PC/laptop?

Let me guess what the issue is: I run the Ollama server locally on my laptop and use that local server (http://localhost:11434) to develop and test ollama-instructor. So yes, you can use ollama-instructor, or the Ollama client itself, with a local Ollama model (e.g. llama3.2).

Example using the Ollama client:

from ollama import Client

# connect to the locally running Ollama server
client = Client(host="http://localhost:11434")

response = client.chat(
    model='llama3.2',
    messages=[
        {
            'role': 'user',
            'content': 'Why is the sky blue?'
        }
    ]
)

print(response['message']['content'])
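
Regarding the unstable connection: a quick way to verify that the local server is actually reachable is to list the installed models with the official Ollama client's list() method (a minimal sketch):

from ollama import Client

client = Client(host="http://localhost:11434")

# fails with a connection error if no server is listening on localhost:11434
print(client.list())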

As ollama-instructor is just a wrapper around the Ollama client, it connects to the Ollama server in exactly the same way, whether local or remote.

Example using ollama-instructor:

from ollama_instructor.ollama_instructor_client import OllamaInstructorClient
from pydantic import BaseModel

# the model's output is validated against this schema
class Person(BaseModel):
    name: str
    age: int

# point the client at the local Ollama server
client = OllamaInstructorClient(host='http://localhost:11434')
response = client.chat_completion(
    model='phi3',
    pydantic_model=Person,
    messages=[
        {
            'role': 'user',
            'content': 'Jason is 30 years old.'
        }
    ]
)

print(response['message']['content'])
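
For this prompt, the validated content should look like {"name": "Jason", "age": 30}. A minimal sketch of loading such output back into the Pydantic model, assuming the content arrives as a JSON string (Pydantic v2's model_validate_json):

from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int

# illustrative content for the prompt above
content = '{"name": "Jason", "age": 30}'

# parse and re-validate the JSON against the schema
person = Person.model_validate_json(content)
print(person.name, person.age)  # Jason 30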

Did I answer your question, or did I misinterpret it?