Scale3-Labs / langtrace

Langtrace πŸ” is an open-source, OpenTelemetry-based end-to-end observability tool for LLM applications, providing real-time tracing, evaluations and metrics for popular LLMs, LLM frameworks, vector DBs and more. Integrate using TypeScript or Python. πŸš€πŸ’»πŸ“Š
https://langtrace.ai
GNU Affero General Public License v3.0

How do I set the service.name of the OTLP span? #177

Closed codefromthecrypt closed 1 week ago

codefromthecrypt commented 1 week ago

I have the code below, which exports spans. However, I'm not sure how to set the service.name property when initializing. Other tools add a kwarg like app_name or application_name.

import os
from langtrace_python_sdk import langtrace
from ollama import chat
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

def initialize_langtrace():
    # TODO: how to set langtrace-python-ollama as service.name?
    # Default the standard ENV variable to localhost
    otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_TRACES_ENDPOINT", "http://localhost:4318/v1/traces")
    otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)
    # Don't batch spans, as this is a demo
    langtrace.init(custom_remote_exporter=SimpleSpanProcessor(otlp_exporter), batch=False)

# Chat with Ollama, noting OLLAMA_HOST defaults to localhost
def chat_with_ollama():
    from ollama import chat
    messages = [
      {
        'role': 'user',
        'content': "<|fim_prefix|>def hello_world():<|fim_suffix|><|fim_middle|>",
      },
    ]
    response = chat('codegemma:2b-code', messages=messages)
    print(response['message']['content'])

def main():
    initialize_langtrace()
    chat_with_ollama()

if __name__ == "__main__":
    main()

I am able to proceed, though I note I had to manually install the openai dependency for some reason (I thought it would be implicit with langtrace...)

$ pipenv install langtrace-python-sdk ollama openai
--snip--
$ pipenv run python main.py
Initializing Langtrace SDK..
Exporting spans to custom remote exporter..
print("Hello, World!")
codefromthecrypt commented 1 week ago

FYI, I do see the Exporting spans to custom remote exporter.. message, but when that happens the spans never show up in my collector (unlike the same config elsewhere). Maybe I'm doing something wrong, so a second pair of eyes (manual test/verification) or an integration test could help.

codefromthecrypt commented 1 week ago

If you have an idea of what might be wrong, much obliged. If I set write_spans_to_console=True it does print a span:
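(For reference, write_spans_to_console looks like just an extra flag on the same init call; presumably something along these lines, keeping the rest of the setup from the first snippet:)

# Assumed variant of the init call from the first snippet: the console flag prints
# each finished span to stdout even if the remote exporter drops it silently.
langtrace.init(
    custom_remote_exporter=SimpleSpanProcessor(otlp_exporter),
    batch=False,
    write_spans_to_console=True,
)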

{
    "name": "ollama.chat",
    "context": {
        "trace_id": "0x4fe7f838af4ab1751c0445f0147e7102",
        "span_id": "0x124cffab71c30fbc",
        "trace_state": "[]"
    },
    "kind": "SpanKind.CLIENT",
    "parent_id": null,
    "start_time": "2024-07-07T06:11:28.041919Z",
    "end_time": "2024-07-07T06:11:29.695826Z",
    "status": {
        "status_code": "OK"
    },
    "attributes": {
        "langtrace.service.name": "Ollama",
        "langtrace.service.type": "llm",
        "langtrace.service.version": "0.2.1",
        "langtrace.version": "2.1.28",
        "langtrace.sdk.name": "langtrace-python-sdk",
        "url.full": "http://127.0.0.1:11434",
        "llm.api": "/api/chat",
        "llm.prompts": "[{\"role\": \"user\", \"content\": \"<|fim_prefix|>def hello_world():<|fim_suffix|><|fim_middle|>\"}]",
        "llm.token.counts": "{\"input_tokens\": 24, \"output_tokens\": 7, \"total_tokens\": 31}",
        "llm.finish_reason": "stop",
        "llm.responses": "[{\"role\": \"assistant\", \"content\": \"\\n\"}]"
    },
    "events": [],
    "links": [],
    "resource": {
        "attributes": {
            "telemetry.sdk.language": "python",
            "telemetry.sdk.name": "opentelemetry",
            "telemetry.sdk.version": "1.25.0",
            "service.name": "main.py"
        },
        "schema_url": ""
    }
}
alizenhom commented 1 week ago

Hey @codefromthecrypt,

Thank you for reaching out and providing the detailed description above.

Can you kindly confirm the following:

  1. You want to export Ollama traces using the OTel exporter? Is it failing? Can you give us some more details on how you set up your collector: with Docker, or locally?
  2. Regarding service.name, that's actually a good catch. Currently, service.name is automatically set to the name of the file that initialized langtrace. We can add an argument for the service name to solve this issue; usage will look like this:
langtrace.init(service_name='langtrace-python-ollama')

Kindly let me know if there is anything else I missed; meanwhile I will work on adding the service.name fix and let you know.
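(For background, and not specific to Langtrace: in the plain OpenTelemetry Python SDK, service.name lives on the Resource attached to the TracerProvider, or comes from the standard OTEL_SERVICE_NAME environment variable. A service_name option would presumably map to something like the sketch below; the resource/provider wiring here is an assumption, not Langtrace's actual implementation.)

# Plain-OpenTelemetry sketch of how service.name is normally set (assumption,
# shown only to illustrate what a service_name option would translate to).
from opentelemetry.sdk.resources import Resource, SERVICE_NAME
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

resource = Resource.create({SERVICE_NAME: "langtrace-python-ollama"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter()))
# Every span exported through this provider now carries service.name=langtrace-python-ollama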

alizenhom commented 1 week ago

Hey @codefromthecrypt,

This fix has been released in Python SDK 2.1.29.

You can pass service_name as an option while initializing langtrace:

langtrace.init(service_name=<your service name>)
codefromthecrypt commented 1 week ago

Here is the working code. Thanks for all the help! I had a slight glitch in my exporter arg (I was passing a SimpleSpanProcessor wrapping the exporter instead of the exporter itself), which failed silently. It is working now.

import os
from langtrace_python_sdk import langtrace
from ollama import chat
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

def initialize_langtrace():
    # Set the service name such that it is different from other experiments
    service_name = "langtrace-python-ollama"
    # Default the standard ENV variable to localhost
    otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_TRACES_ENDPOINT", "http://localhost:4318/v1/traces")
    otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)
    # Don't batch spans, as this is a demo
    langtrace.init(service_name=service_name, custom_remote_exporter=otlp_exporter, batch=False)

# Chat with Ollama, noting OLLAMA_HOST defaults to localhost
def chat_with_ollama():
    from ollama import chat
    messages = [
      {
        'role': 'user',
        'content': "<|fim_prefix|>def hello_world():<|fim_suffix|><|fim_middle|>",
      },
    ]
    response = chat('codegemma:2b-code', messages=messages)
    print(response['message']['content'])

def main():
    initialize_langtrace()
    chat_with_ollama()

if __name__ == "__main__":
    main()