run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Question]: observability integration for open source models #14955

Closed JINO-ROHIT closed 1 month ago

JINO-ROHIT commented 1 month ago

Question

Is there an observability tool that works with open-source models rather than only OpenAI?

dosubot[bot] commented 1 month ago

I recommend using Langfuse or Arize AI as observability tools that work with open-source models and not OpenAI.

Langfuse

Langfuse is an open-source LLM engineering platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. With the Langfuse integration, you can seamlessly track and monitor the performance, traces, and metrics of your LlamaIndex application.

Usage Pattern

# Make sure you've installed the 'llama-index-callbacks-langfuse' integration package.

# NOTE: Set your environment variables 'LANGFUSE_SECRET_KEY', 'LANGFUSE_PUBLIC_KEY',
# and 'LANGFUSE_HOST' as shown in your langfuse.com project settings.

from llama_index.core import set_global_handler

set_global_handler("langfuse")

Guides

langfuse-tracing

Arize AI

Arize AI is another observability tool that can be used with open-source models. It provides robust monitoring and debugging capabilities for LLM applications.

Usage Pattern

%pip install --upgrade \
    openinference-instrumentation-llama-index \
    opentelemetry-sdk \
    opentelemetry-exporter-otlp \
    "opentelemetry-proto>=1.12.0" \
    arize-phoenix -q

import os
get_ipython().system = os.system

# Launch the Phoenix collector in the background, logging to arize.log.
!python -m phoenix.server.main serve > arize.log 2>&1 &
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
    OTLPSpanExporter,
)
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter(endpoint))
)

LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

Now open a web browser and navigate to http://localhost:6006/ to view the Phoenix UI.
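The exporter pipeline set up above follows one pattern: a tracer provider owns span processors, and each processor forwards finished spans to an exporter. A stdlib-only sketch of that chain, with illustrative stand-in classes rather than the real opentelemetry-sdk API:

```python
# Minimal sketch of the provider -> processor -> exporter chain.
# These classes are hypothetical stand-ins, not the opentelemetry-sdk types.
from dataclasses import dataclass

@dataclass
class Span:
    name: str

class InMemoryExporter:
    """Stands in for OTLPSpanExporter; a real exporter POSTs to an endpoint."""
    def __init__(self) -> None:
        self.exported: list[Span] = []

    def export(self, span: Span) -> None:
        self.exported.append(span)

class SimpleProcessor:
    """Like SimpleSpanProcessor: forwards each span as soon as it ends."""
    def __init__(self, exporter: InMemoryExporter) -> None:
        self.exporter = exporter

    def on_end(self, span: Span) -> None:
        self.exporter.export(span)

class TracerProvider:
    """Fans finished spans out to every registered processor."""
    def __init__(self) -> None:
        self.processors: list[SimpleProcessor] = []

    def add_span_processor(self, processor: SimpleProcessor) -> None:
        self.processors.append(processor)

    def end_span(self, span: Span) -> None:
        for processor in self.processors:
            processor.on_end(span)

exporter = InMemoryExporter()
provider = TracerProvider()
provider.add_span_processor(SimpleProcessor(exporter))
provider.end_span(Span("query"))
```

Since spans are emitted by the instrumentor around LlamaIndex calls rather than by any model provider, this pipeline is also model-agnostic.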

Both Langfuse and Arize AI offer comprehensive observability solutions for open-source LLM applications, allowing you to monitor and debug your models effectively [1][2][3].

To continue talking to Dosu, mention @dosu.

logan-markewich commented 1 month ago

Any observability tool works with any model.

I would try Arize.