Scale3-Labs / langtrace-python-sdk

Langtrace SDK for Python Applications
https://langtrace.ai
Apache License 2.0

Langtrace

Open Source & OpenTelemetry (OTEL) Observability for LLM Applications

Langtrace is open-source observability software that lets you capture, debug, and analyze traces and metrics from any application that leverages LLM APIs, vector databases, or LLM-based frameworks.

OpenTelemetry Support

The traces generated by Langtrace adhere to OpenTelemetry (OTEL) standards. We are developing semantic conventions for the traces generated by this project; you can check out the current definitions in this repository. Note: this is under active development, and we encourage you to get involved and welcome your feedback.


Langtrace Cloud ☁️

To use the managed SaaS version of Langtrace, follow the steps below:

  1. Sign up by going to this link.
  2. Create a new Project after signing up. Projects are containers for storing the traces and metrics generated by your application. If you have only one application, a single project will do.
  3. Generate an API key from inside the project.
  4. In your application, install the Langtrace SDK and initialize it with the API key you generated in step 3.
  5. The code for installing and setting up the SDK is shown below.

Getting Started

Get started by simply adding three lines to your code!

pip install langtrace-python-sdk
from langtrace_python_sdk import langtrace # Must precede any llm module imports
langtrace.init(api_key=<your_api_key>)

OR

from langtrace_python_sdk import langtrace # Must precede any llm module imports
langtrace.init() # LANGTRACE_API_KEY as an ENVIRONMENT variable

FastAPI Quick Start

Initialize a FastAPI project and add the following to the main.py file:

from fastapi import FastAPI
from langtrace_python_sdk import langtrace
from openai import OpenAI

langtrace.init()
app = FastAPI()
client = OpenAI()

@app.get("/")
def root():
    client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test three times"}],
        stream=False,
    )
    return {"Hello": "World"}

Django Quick Start

Initialize a Django project and add the following to the __init__.py file:

from langtrace_python_sdk import langtrace
from openai import OpenAI

langtrace.init()
client = OpenAI()

client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Say this is a test three times"}],
    stream=False,
)

Flask Quick Start

Initialize a Flask project and add the following to the app.py file:

from flask import Flask
from langtrace_python_sdk import langtrace
from openai import OpenAI

langtrace.init()
client = OpenAI()
app = Flask(__name__)

@app.route("/")
def main():
    client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test three times"}],
        stream=False,
    )
    return "Hello, World!"

Langtrace Self Hosted

Get started by simply adding two lines to your code and see traces being logged to the console!

pip install langtrace-python-sdk
from langtrace_python_sdk import langtrace # Must precede any llm module imports
langtrace.init(write_spans_to_console=True)

Langtrace Self Hosted Custom Exporter

Get started by simply adding three lines to your code and see traces being exported to your remote location!

pip install langtrace-python-sdk
from langtrace_python_sdk import langtrace # Must precede any llm module imports
langtrace.init(custom_remote_exporter=<your_exporter>, batch=<True or False>)
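
Any OpenTelemetry-compatible span exporter should work as the custom exporter. Below is a minimal sketch, assuming a self-hosted OTLP collector; the endpoint is hypothetical, so substitute your own, and install the exporter package first (pip install opentelemetry-exporter-otlp-proto-http).

from langtrace_python_sdk import langtrace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# hypothetical self-hosted collector endpoint; replace with your own
exporter = OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
langtrace.init(custom_remote_exporter=exporter, batch=True)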

Configure Langtrace

| Parameter | Type | Default Value | Description |
| --- | --- | --- | --- |
| api_key | str | LANGTRACE_API_KEY or None | The API key for authentication. |
| batch | bool | True | Whether to batch spans before sending them. |
| write_spans_to_console | bool | False | Whether to write spans to the console. |
| custom_remote_exporter | Optional[Exporter] | None | Custom remote exporter. If None, the default LangTraceExporter is used. |
| api_host | Optional[str] | https://langtrace.ai/ | The API host for the remote exporter. |
| disable_instrumentations | Optional[DisableInstrumentations] | None | Disable instrumentation for specific vendors, e.g. {'only': ['openai']} or {'all_except': ['openai']}. |
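
For example, a sketch combining a few of these options; per the shapes above, {'only': ['openai']} disables just the OpenAI instrumentation while everything else is still traced:

from langtrace_python_sdk import langtrace

# batch spans before export and turn off only the OpenAI instrumentation
langtrace.init(
    api_key="<your_api_key>",
    batch=True,
    disable_instrumentations={"only": ["openai"]},
)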

Additional Customization

Use the with_langtrace_root_span decorator to group the spans generated inside a function under a single root span:

from langtrace_python_sdk import with_langtrace_root_span

@with_langtrace_root_span()
def example():
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test three times"}],
        stream=False,
    )
    return response

Use inject_additional_attributes to attach custom attributes (for example, a user ID) to the spans generated within a function:

from langtrace_python_sdk import inject_additional_attributes

def do_llm_stuff(name=""):
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test three times"}],
        stream=False,
    )
    return response

def main():
  response = inject_additional_attributes(lambda: do_llm_stuff(name="llm"), {'user.id': 'userId'})

  # if the function does not take arguments, this syntax works as well
  response = inject_additional_attributes(do_llm_stuff, {'user.id': 'userId'})

Alternatively, use the with_additional_attributes decorator to set span attributes on a per-function basis:

from langtrace_python_sdk import with_langtrace_root_span, with_additional_attributes

@with_additional_attributes({"user.id": "1234"})
def api_call1():
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test three times"}],
        stream=False,
    )
    return response

@with_additional_attributes({"user.id": "5678"})
def api_call2():
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test three times"}],
        stream=False,
    )
    return response

@with_langtrace_root_span()
def chat_completion():
   api_call1()
   api_call2()

To fetch a prompt from the Langtrace prompt registry, use get_prompt_from_registry:

from langtrace_python_sdk import get_prompt_from_registry

prompt = get_prompt_from_registry(<Registry ID>, options={"prompt_version": 1, "variables": {"foo": "bar"} })
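
A hedged sketch of feeding a fetched prompt into an LLM call. The exact shape of the returned prompt object depends on your registry setup, so the str() coercion below and the assumption that the registry interpolates the variables are only illustrative:

from langtrace_python_sdk import get_prompt_from_registry
from openai import OpenAI

client = OpenAI()

# "<Registry ID>" is a placeholder; {"foo": "bar"} is assumed to be
# interpolated into the prompt text by the registry
prompt = get_prompt_from_registry("<Registry ID>", options={"prompt_version": 1, "variables": {"foo": "bar"}})

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": str(prompt)}],
)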

Opt out of tracing prompt and completion data

By default, prompt and completion data are captured. To opt out, set the following environment variable:

TRACE_PROMPT_COMPLETION_DATA=false
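
You can export the variable in your shell before launching the process, or set it from code. A minimal in-process sketch, assuming the variable is read when langtrace.init() runs and therefore must be set first:

import os

# must be set before langtrace.init() so the SDK picks it up
os.environ["TRACE_PROMPT_COMPLETION_DATA"] = "false"

from langtrace_python_sdk import langtrace
langtrace.init()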

Supported integrations

Langtrace automatically captures traces from the following vendors:

| Vendor | Type | Typescript SDK | Python SDK |
| --- | --- | --- | --- |
| OpenAI | LLM | :white_check_mark: | :white_check_mark: |
| Anthropic | LLM | :white_check_mark: | :white_check_mark: |
| Azure OpenAI | LLM | :white_check_mark: | :white_check_mark: |
| Cohere | LLM | :white_check_mark: | :white_check_mark: |
| Groq | LLM | :x: | :white_check_mark: |
| Perplexity | LLM | :white_check_mark: | :white_check_mark: |
| Gemini | LLM | :x: | :white_check_mark: |
| Langchain | Framework | :x: | :white_check_mark: |
| LlamaIndex | Framework | :white_check_mark: | :white_check_mark: |
| Langgraph | Framework | :x: | :white_check_mark: |
| DSPy | Framework | :x: | :white_check_mark: |
| CrewAI | Framework | :x: | :white_check_mark: |
| Ollama | Framework | :x: | :white_check_mark: |
| VertexAI | Framework | :x: | :white_check_mark: |
| Vercel AI SDK | Framework | :white_check_mark: | :x: |
| Pinecone | Vector Database | :white_check_mark: | :white_check_mark: |
| ChromaDB | Vector Database | :white_check_mark: | :white_check_mark: |
| QDrant | Vector Database | :white_check_mark: | :white_check_mark: |
| Weaviate | Vector Database | :white_check_mark: | :white_check_mark: |
| PGVector | Vector Database | :white_check_mark: | :white_check_mark: (SQLAlchemy) |

Feature Requests and Issues

To request a feature or report a bug, please open an issue on this repository.


Contributions

We welcome contributions to this project. To get started, fork this repository and start developing. To get involved, join our Discord workspace.


Security

To report security vulnerabilities, email us at security@scale3labs.com. You can read more about security here.


License

This project is licensed under the Apache License 2.0.