hegelai / prompttools

Open-source tools for prompt testing and experimentation, with support for both LLMs (e.g. OpenAI, LLaMA) and vector databases (e.g. Chroma, Weaviate, LanceDB).
http://prompttools.readthedocs.io
Apache License 2.0

Add OpenAI client support to logger.py #118

Open steventkrawczyk opened 6 months ago

steventkrawczyk commented 6 months ago

Overview of Change

OpenAI's new API format uses a client object, so we need to change the way we monkey-patch logging:

from openai import OpenAI
client = OpenAI()

response = client.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
    {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
    {"role": "user", "content": "Where was it played?"}
  ]
)
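For contrast, here is a minimal sketch of the kind of module-level patch that worked with the pre-1.0 SDK, where chat completions were plain module-level calls (the print call is a placeholder for whatever the logger actually records):

import openai

# Pre-1.0 SDK: chat completions were module-level calls, so the patch
# could simply rebind openai.ChatCompletion.create.
_original_create = openai.ChatCompletion.create

def _logged_create(*args, **kwargs):
    response = _original_create(*args, **kwargs)
    print(kwargs.get("messages"), response)  # placeholder for the real log sink
    return response

openai.ChatCompletion.create = _logged_create

With the new client-based API there is no single module-level function to rebind, so the patch target has to change.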

The intention is for our code to support the following usage, where importing prompttools.logger is enough to enable logging:

from openai import OpenAI
import prompttools.logger
client = OpenAI()

response = client.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
    {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
    {"role": "user", "content": "Where was it played?"}
  ]
)

Why is this the best approach?

This is the most flexible way to patch things in, rather than having users import a wrapped object from our library or wrap clients themselves. Ideally, users are building more complex apps with some kind of dependency injection to manage the OpenAI client, and they only need to patch in one place.
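A minimal sketch of one way this could work, assuming the v1.x SDK layout where client.chat.completions is an instance of openai.resources.chat.completions.Completions (the print call again stands in for the real log sink):

import functools
from openai.resources.chat.completions import Completions

# Patch the Completions class rather than a single client instance, so
# every client, including ones constructed before the import, picks up
# the wrapper.
_original_create = Completions.create

@functools.wraps(_original_create)
def _logged_create(self, *args, **kwargs):
    response = _original_create(self, *args, **kwargs)
    print(kwargs.get("model"), kwargs.get("messages"), response)  # placeholder log sink
    return response

Completions.create = _logged_create

Because the patch lives on the class, it applies no matter where or when the client object is constructed, which is what makes the single-import, dependency-injection-friendly usage above possible.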