guidance-ai / guidance

A guidance language for controlling large language models.

When will ollama be supported? #877

Open nurena24 opened 4 months ago

nurena24 commented 4 months ago

When will ollama be supported?

mkfischer commented 3 months ago

I didn't want to use LiteLLM (because reasons), so I used ollama cp [model] gpt-3.5-turbo to serve the local model under the name guidance's OpenAI wrapper asks for. This is just a crappy POC, but it works and it's simple.

from guidance import models, system, user, assistant, gen
from bs4 import BeautifulSoup
from termcolor import colored
from time import sleep
import subprocess

# Copy the local Ollama model so it is served under the name the OpenAI wrapper will request.
ollama_model_to_use = "wangrongsheng/openchat:latest"
command = ["ollama", "cp", ollama_model_to_use, "gpt-3.5-turbo"]

try:
    result = subprocess.run(command, check=True, capture_output=True, text=True)
    print("Command executed successfully.")
    print("Output:", result.stdout)
except subprocess.CalledProcessError as e:
    print("An error occurred while executing the command.")
    print("Error output:", e.stderr)

# Sometimes it would fail if I didn't put this in.  I have no idea why.
sleep(1)

# Point guidance's OpenAI client at Ollama's OpenAI-compatible endpoint (default port 11434).
# No real API key is needed; Ollama ignores it.
gpt35 = models.OpenAI(
    "gpt-3.5-turbo",
    echo=False,
    api_key="NO_KEY_NEEDED",
    base_url="http://127.0.0.1:11434/v1",
)

lm = gpt35

with system():
    lm += "You only speak in ALL CAPS."

with user():
    lm += "What is the capital of Greenland?"

with assistant():
    # Generate the reply and capture it under the key "answer".
    response = lm + gen(name="answer", max_tokens=256)
    generated_text = response["answer"]

# Some models wrap their replies in HTML-like tags; strip any markup before printing.
soup = BeautifulSoup(generated_text, "html.parser")
parsed_text = soup.get_text()

print(colored(parsed_text, "green"))
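
If the guidance call misbehaves, a quick sanity check is to hit the same OpenAI-compatible endpoint directly and confirm the copied model answers under the gpt-3.5-turbo name. This is a minimal sketch, assuming the openai Python package is installed and Ollama is serving on its default port; it is not part of the POC above.

# Sanity check: talk to Ollama's OpenAI-compatible endpoint directly, bypassing guidance,
# to confirm the "gpt-3.5-turbo" copy created by `ollama cp` is actually being served.
# Assumes the `openai` package is installed and Ollama is on its default port.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:11434/v1", api_key="NO_KEY_NEEDED")

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",  # the name created by the copy step above
    messages=[{"role": "user", "content": "Say hello in one word."}],
    max_tokens=16,
)
print(completion.choices[0].message.content)

If this works but the guidance script doesn't, the problem is on the guidance side rather than in the Ollama setup.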