Closed: codefromthecrypt closed this 1 week ago
I have the code below, which exports spans. However, I'm not sure how to set the service.name property when initializing. Other tools add a kwarg like app_name or application_name.

I am able to proceed, noting I have to manually install the dependency openai for some reason (I thought it would be implicit with langtrace...).

FYI, I see the "Exporting spans to custom remote exporter.." message, but when that occurs, I don't see the spans in my exporter (unlike the same config elsewhere). Maybe I'm doing something wrong, so a second pair of eyes (manual test/verification) or an integration test could help. If you have an idea what might be wrong, much obliged.

If I set write_spans_to_console=True, it indeed prints a span:
{
  "name": "ollama.chat",
  "context": {
    "trace_id": "0x4fe7f838af4ab1751c0445f0147e7102",
    "span_id": "0x124cffab71c30fbc",
    "trace_state": "[]"
  },
  "kind": "SpanKind.CLIENT",
  "parent_id": null,
  "start_time": "2024-07-07T06:11:28.041919Z",
  "end_time": "2024-07-07T06:11:29.695826Z",
  "status": {
    "status_code": "OK"
  },
  "attributes": {
    "langtrace.service.name": "Ollama",
    "langtrace.service.type": "llm",
    "langtrace.service.version": "0.2.1",
    "langtrace.version": "2.1.28",
    "langtrace.sdk.name": "langtrace-python-sdk",
    "url.full": "http://127.0.0.1:11434",
    "llm.api": "/api/chat",
    "llm.prompts": "[{\"role\": \"user\", \"content\": \"<|fim_prefix|>def hello_world():<|fim_suffix|><|fim_middle|>\"}]",
    "llm.token.counts": "{\"input_tokens\": 24, \"output_tokens\": 7, \"total_tokens\": 31}",
    "llm.finish_reason": "stop",
    "llm.responses": "[{\"role\": \"assistant\", \"content\": \"\\n\"}]"
  },
  "events": [],
  "links": [],
  "resource": {
    "attributes": {
      "telemetry.sdk.language": "python",
      "telemetry.sdk.name": "opentelemetry",
      "telemetry.sdk.version": "1.25.0",
      "service.name": "main.py"
    },
    "schema_url": ""
  }
}
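(The init call that produced this console output isn't shown in the thread; a minimal sketch of one, assuming only the write_spans_to_console flag mentioned above, would be:

from langtrace_python_sdk import langtrace
from ollama import chat

# Print each finished span to stdout, like the "ollama.chat" span above
langtrace.init(write_spans_to_console=True)

# Any instrumented call then emits a span on the console
response = chat('codegemma:2b-code', messages=[{'role': 'user', 'content': 'def hello_world():'}])
print(response['message']['content'])
)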
Hey @codefromthecrypt,
thank you for reaching out and providing the detailed description above.

On service.name: that's actually a good catch. Currently, service.name is automatically set to the name of the file that started initialising langtrace. We can add a service-name argument to solve this issue, and the usage will look like this:

langtrace.init(service_name='langtrace-python-ollama')

Kindly let me know if there is anything else I missed; meanwhile, I will work on adding the service.name fix and let you know.
Hey @codefromthecrypt,
this fix has been released in Python SDK 2.1.29. You can now pass service_name as an option while initializing langtrace:

langtrace.init(service_name=<your service name>)
Here is the working code. Thanks for all the help! I had a slight glitch in my exporter arg, which silently failed; it is working now.
import os

from langtrace_python_sdk import langtrace
from ollama import chat
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter


def initialize_langtrace():
    # Set the service name such that it is different from other experiments
    service_name = "langtrace-python-ollama"
    # Default the standard ENV variable to localhost
    otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_TRACES_ENDPOINT", "http://localhost:4318/v1/traces")
    otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)
    # Don't batch spans, as this is a demo
    langtrace.init(service_name=service_name, custom_remote_exporter=otlp_exporter, batch=False)


# Chat with Ollama, noting OLLAMA_HOST defaults to localhost
def chat_with_ollama():
    messages = [
        {
            'role': 'user',
            'content': "<|fim_prefix|>def hello_world():<|fim_suffix|><|fim_middle|>",
        },
    ]
    response = chat('codegemma:2b-code', messages=messages)
    print(response['message']['content'])


def main():
    initialize_langtrace()
    chat_with_ollama()


if __name__ == "__main__":
    main()
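(If spans still fail to show up, one way to rule out the exporter/endpoint independently of langtrace is to push a test span through the plain OpenTelemetry SDK. A minimal sketch, assuming the same OTLPSpanExporter and endpoint as above:

import os

from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Export synchronously so any delivery failure is logged immediately
provider = TracerProvider()
endpoint = os.getenv("OTEL_EXPORTER_OTLP_TRACES_ENDPOINT", "http://localhost:4318/v1/traces")
provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint=endpoint)))

# Emit one throwaway span; it should appear at the collector if the endpoint is right
with provider.get_tracer("connectivity-check").start_as_current_span("ping"):
    pass
provider.force_flush()

If that span arrives, the exporter and endpoint are fine and the problem is in the langtrace wiring, such as the mis-named exporter argument mentioned above.)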