Arize-ai / openinference

OpenTelemetry Instrumentation for AI Observability
https://arize-ai.github.io/openinference/
Apache License 2.0

🗺️ Vercel AI SDK #785

Closed mikeldking closed 1 day ago

mikeldking commented 1 month ago

Vercel AI SDK 3.3 introduces OTEL support, which means there should be spans emitted from inside the AI SDK's execution. We should possibly introduce a span processor that maps these to OpenInference semantic conventions.

Spike
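The mapping step such a processor would perform can be sketched as a pure function over span attributes. Note this is a hypothetical illustration: the operationId prefixes and the exact span-kind labels are assumptions for the sketch, not the actual @arizeai/openinference-vercel implementation.

```typescript
// Hypothetical sketch: translate a Vercel AI SDK operation id into an
// OpenInference span kind. A real span processor would call something like
// this from its onEnd hook and rewrite the span's attributes.
function toOpenInferenceSpanKind(operationId: string): string | null {
  if (
    operationId.startsWith("ai.generateText") ||
    operationId.startsWith("ai.streamText")
  ) {
    return "LLM";
  }
  if (operationId.startsWith("ai.embed")) {
    return "EMBEDDING";
  }
  if (operationId.startsWith("ai.toolCall")) {
    return "TOOL";
  }
  // Leave non-AI-SDK spans untouched
  return null;
}
```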

mikeldking commented 1 month ago

https://vercel.com/blog/vercel-ai-sdk-3-3

Parker-Stafford commented 3 weeks ago

Dogfooding use cases:

All of these will be using this example: https://github.com/Arize-ai/openinference/tree/main/js/examples/next-openai-telemetry-app

Follow the README there and paste in the code snippet for your use case as described below.

Embeddings

Add the following to the imports in chat/route.ts:

import { streamText, embed, embedMany } from "ai";

Add to the POST function:

  await embed({
    model: openai.embedding("text-embedding-ada-002"),
    value: "hello, you are a great bot",
    experimental_telemetry: {
      isEnabled: true,
      metadata: { example: "value" },
    },
  });

  await embedMany({
    model: openai.embedding("text-embedding-ada-002"),
    values: ["hello, you are a great bot", prompt],
    experimental_telemetry: {
      isEnabled: true,
      metadata: { example: "value" },
    },
  });

Stream / GenerateObject

Run:

pnpm install zod --ignore-workspace

Add the following to the imports in chat/route.ts:

import { streamText, streamObject } from "ai"; // add streamObject here
import { z } from "zod";

Add to the POST function:

  const objectStream = await streamObject({
    model: openai("gpt-3.5-turbo"),
    maxTokens: 100,
    messages: [
      { content: "hello, you are a great bot", role: "system" },
      { content: "please create me 3 objects like this", role: "user" },
    ],
    schema: z.object({
      description: z.string().describe("The generated text"),
      number: z.number().describe("A generated number"),
    }),
    experimental_telemetry: {
      isEnabled: true,
      metadata: { example: "value" },
    },
  });
  const objectStreamResponse = await objectStream.toTextStreamResponse().text();

Tools

Run:

pnpm install zod --ignore-workspace

Add the following to the imports in chat/route.ts:

import { streamText, generateText, tool } from "ai"; // add generateText and tool here
import { z } from "zod";

Add to the POST function:

  const { text, toolResults } = await generateText({
    model: openai("gpt-3.5-turbo"),
    maxTokens: 100,
    messages: [
      { content: "hello, you are a great bot", role: "system" },
      { content: "what is the weather in kentucky", role: "user" },
    ],
    tools: {
      weather: tool({
        parameters: z.object({
          location: z.string().describe("The location to get the weather for"),
        }),
        execute: async ({ location }) => ({
          location,
          temperature: 72 + Math.floor(Math.random() * 21) - 10,
        }),
      }),
    },
    experimental_telemetry: {
      isEnabled: true,
      functionId: "example-function-id",
      metadata: { example: "value" },
    },
  });

Hosted Phoenix

Have not tested this yet; you will likely need to modify the instrumentation.ts file in the example to set headers, etc.

This snippet works for LlamaTrace:

import { registerOTel } from "@vercel/otel";
import { diag, DiagConsoleLogger, DiagLogLevel } from "@opentelemetry/api";
import { OpenInferenceSpanProcessor } from "@arizeai/openinference-vercel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { SimpleSpanProcessor } from "@opentelemetry/sdk-trace-base";

// For troubleshooting, set the log level to DiagLogLevel.DEBUG
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);

export function register() {
  registerOTel({
    serviceName: "next-app",
    spanProcessors: [
      new OpenInferenceSpanProcessor(),
      new SimpleSpanProcessor(
        new OTLPTraceExporter({
          headers: {
            OTEL_EXPORTER_OTLP_HEADERS: "redacted",
          },
          url: "https://app.phoenix.arize.com/v1/traces",
        }),
      ),
    ],
  });
}

Find and use a Vercel AI template

Can check here: https://vercel.com/templates

RogerHYang commented 3 weeks ago

Tool span attributes probably also need input.value and output.value added:

{
  "resource": {
    "name": "example-function-id"
  },
  "openinference": {
    "span": {
      "kind": "TOOL"
    }
  },
  "tool": {
    "name": "weather",
    "parameters": {
      "location": "Kentucky"
    }
  },
  "ai": {
    "operationId": "ai.toolCall",
    "toolCall": {
      "args": "{\"location\":\"Kentucky\"}",
      "name": "weather",
      "result": "{\"location\":\"Kentucky\",\"temperature\":62}",
      "id": "call_tSTYW8vjXPSeDLHhZDBWG3au"
    },
    "telemetry": {
      "functionId": "example-function-id"
    }
  },
  "operation": {
    "name": "ai.toolCall example-function-id"
  }
}
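That attribute mapping can be sketched as a small helper, assuming the ai.toolCall.args / ai.toolCall.result keys shown in the dump above and the OpenInference input.value / output.value conventions. The helper name and mime-type handling are hypothetical, not the library's API:

```typescript
// Hypothetical sketch: copy tool-call args/result into OpenInference
// input.value / output.value on a flat attribute map. The source keys
// hold JSON-serialized strings, so the mime type is set accordingly.
type Attributes = Record<string, string>;

function addToolIO(attrs: Attributes): Attributes {
  const out: Attributes = { ...attrs };
  if (attrs["ai.toolCall.args"] !== undefined) {
    out["input.value"] = attrs["ai.toolCall.args"];
    out["input.mime_type"] = "application/json";
  }
  if (attrs["ai.toolCall.result"] !== undefined) {
    out["output.value"] = attrs["ai.toolCall.result"];
    out["output.mime_type"] = "application/json";
  }
  return out;
}
```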
axiomofjoy commented 3 weeks ago
mikeldking commented 1 day ago

Completed, other than marketing tasks.