First things first! You can install the SDK with pip as follows:
pip install observers
Or, if you want to use other LLM providers through AISuite or LiteLLM, install the SDK with the relevant extra:
pip install observers[aisuite] # or observers[litellm]
We differentiate between observers and stores. Observers wrap generative AI APIs (like OpenAI or llama-index) and track their interactions. Stores are classes that sync these observations to different storage backends (like DuckDB or Hugging Face datasets).
To get started, run the code below. It sends a request to the OpenAI API and logs the interaction into a local DuckDB database via DuckDBStore
. If you omit the store argument, the default DatasetsStore is used instead, which pushes the interactions to a dataset in your personal workspace (http://hf.co/{your_username}). To learn how to configure stores, go to the next section.
from observers.observers import wrap_openai
from observers.stores import DuckDBStore
from openai import OpenAI

# Observations are written to a local DuckDB database (store.db by default)
store = DuckDBStore()

# Wrap the client so every request and response is recorded in the store
openai_client = OpenAI()
client = wrap_openai(openai_client, store=store)

# Use the wrapped client exactly like the original one
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Tell me a joke."}],
)
The wrap_openai
function allows you to wrap any OpenAI-compatible LLM provider. Take a look at the example doing this for Ollama for more details.
Store | Example | Annotate | Local | Free | UI filters | SQL filters |
---|---|---|---|---|---|---|
Hugging Face Datasets | example | ❌ | ❌ | ✅ | ✅ | ✅ |
DuckDB | example | ❌ | ✅ | ✅ | ❌ | ✅ |
Argilla | example | ✅ | ❌ | ✅ | ✅ | ❌ |
To view and query Hugging Face Datasets, you can use the Hugging Face Datasets Viewer. You can find example datasets on the Hugging Face Hub. From there, you can query the dataset using SQL or explore it in the UI. Take a look at the example for more details.
The DuckDB store writes to a local file (store.db by default) and can be viewed and queried using the DuckDB CLI. Take a look at the example for more details.
> duckdb store.db
> FROM openai_records LIMIT 10;
┌──────────────────────┬──────────────────────┬──────────────────────┬──────────────────────┬───┬─────────┬──────────────────────┬───────────┐
│          id          │        model         │      timestamp       │       messages       │ … │  error  │     raw_response     │ synced_at │
│       varchar        │       varchar        │      timestamp       │ struct("role" varc…  │   │ varchar │         json         │ timestamp │
├──────────────────────┼──────────────────────┼──────────────────────┼──────────────────────┼───┼─────────┼──────────────────────┼───────────┤
│ 89cb15f1-d902-4586…  │ Qwen/Qwen2.5-Coder…  │ 2024-11-19 17:12:3…  │ [{'role': user, 'c…  │ … │         │ {"id": "", "choice…  │           │
│ 415dd081-5000-4d1a…  │ Qwen/Qwen2.5-Coder…  │ 2024-11-19 17:28:5…  │ [{'role': user, 'c…  │ … │         │ {"id": "", "choice…  │           │
│ chatcmpl-926         │ llama3.1             │ 2024-11-19 17:31:5…  │ [{'role': user, 'c…  │ … │         │ {"id": "chatcmpl-9…  │           │
├──────────────────────┴──────────────────────┴──────────────────────┴──────────────────────┴───┴─────────┴──────────────────────┴───────────┤
│ 3 rows                                                                                                               16 columns (7 shown) │
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
The Argilla Store allows you to sync your observations to Argilla. To use it, you first need to create a free Argilla deployment on Hugging Face. Take a look at the example for more details.
See CONTRIBUTING.md