ax-llm / ax

Build LLM powered Agents and "Agentic workflows" based on the Stanford DSP paper.
http://axllm.dev
Apache License 2.0

Where to get tracer output? #58

Open sonicviz opened 1 month ago

sonicviz commented 1 month ago

Hi.

In your OpenTelemetry example code you show the setup and then the output, but you don't show where the trace below is being acquired from. How do you access it?

I tried some example code with cot.getTraces(), but that only returns the actual answer trace, not the statistical data from the ai call shown below. I also don't see any getTraces method on the ai object or the ai.tracer object.

{ "traceId": "ddc7405e9848c8c884e53b823e120845", "name": "Chat Request", "id": "d376daad21da7a3c", "kind": "SERVER", "timestamp": 1716622997025000, "duration": 14190456.542, "attributes": { "gen_ai.system": "Ollama", "gen_ai.request.model": "nous-hermes2", "gen_ai.request.max_tokens": 500, "gen_ai.request.temperature": 0.1, "gen_ai.request.top_p": 0.9, "gen_ai.request.frequency_penalty": 0.5, "gen_ai.request.llm_is_streaming": false, "http.request.method": "POST", "url.full": "http://localhost:11434/v1/chat/completions", "gen_ai.usage.completion_tokens": 160, "gen_ai.usage.prompt_tokens": 290 } }

dosco commented 1 month ago

In the example code in the README, a tracer provider is created and a tracer is passed in via options: { tracer }. That tracer handles all of the collection, and the output goes wherever the tracer's exporter sends it. You can get tracers/exporters for Google Cloud, AWS, other cloud platforms, LLM observability products, etc.

import { trace } from '@opentelemetry/api';
import {
  BasicTracerProvider,
  ConsoleSpanExporter,
  SimpleSpanProcessor
} from '@opentelemetry/sdk-trace-base';
import { AxAI, AxChainOfThought } from '@ax-llm/ax';

// Export every finished span to the console; this is where the
// "Chat Request" trace output ends up.
const provider = new BasicTracerProvider();
provider.addSpanProcessor(new SimpleSpanProcessor(new ConsoleSpanExporter()));
trace.setGlobalTracerProvider(provider);

const tracer = trace.getTracer('test');

// Pass the tracer to AxAI so every LLM call is recorded as a span.
const ai = new AxAI({
  name: 'ollama',
  config: { model: 'nous-hermes2' },
  options: { tracer }
});

const gen = new AxChainOfThought(
  ai,
  `text -> shortSummary "summarize in 5 to 10 words"`
);

const text = 'Some longer text to summarize.';
const res = await gen.forward({ text });
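
If you also want to read those spans back programmatically (closer to what cot.getTraces() was expected to do), one option is OpenTelemetry's in-memory exporter, queried after the call. This is only a sketch reusing the setup above; InMemorySpanExporter and getFinishedSpans() are OpenTelemetry APIs, not part of ax, and the sample text is made up.

import { trace } from '@opentelemetry/api';
import {
  BasicTracerProvider,
  InMemorySpanExporter,
  SimpleSpanProcessor
} from '@opentelemetry/sdk-trace-base';
import { AxAI, AxChainOfThought } from '@ax-llm/ax';

// Keep finished spans in memory so they can be inspected in code.
const exporter = new InMemorySpanExporter();
const provider = new BasicTracerProvider();
provider.addSpanProcessor(new SimpleSpanProcessor(exporter));
trace.setGlobalTracerProvider(provider);

const ai = new AxAI({
  name: 'ollama',
  config: { model: 'nous-hermes2' },
  options: { tracer: trace.getTracer('test') }
});

const gen = new AxChainOfThought(
  ai,
  `text -> shortSummary "summarize in 5 to 10 words"`
);

await gen.forward({ text: 'A longer passage of text to summarize.' });

// The "Chat Request" span with the gen_ai.* attributes (model, token usage,
// temperature, etc.) lives in the exporter, not on the ai object.
for (const span of exporter.getFinishedSpans()) {
  console.log(span.name, span.attributes);
}

The spans always flow from the tracer into whatever exporter the provider is configured with, so that exporter is the place to look for the output.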
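
To ship the same spans to a cloud or LLM observability backend, only the exporter changes; the ax side stays identical. A sketch assuming the OTLP/HTTP exporter package and a collector listening at the default local endpoint (both are assumptions, not part of the ax README):

import { trace } from '@opentelemetry/api';
import {
  BasicTracerProvider,
  SimpleSpanProcessor
} from '@opentelemetry/sdk-trace-base';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

// Send spans to an OTLP-compatible collector instead of the console.
// The URL below is the default local OTLP/HTTP endpoint; point it at your
// collector or vendor endpoint.
const provider = new BasicTracerProvider();
provider.addSpanProcessor(
  new SimpleSpanProcessor(
    new OTLPTraceExporter({ url: 'http://localhost:4318/v1/traces' })
  )
);
trace.setGlobalTracerProvider(provider);

const tracer = trace.getTracer('test');
// ...then pass { tracer } to AxAI exactly as in the example above.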