lunary-ai / lunary

The production toolkit for LLMs. Observability, prompt management and evaluations.
https://lunary.ai
Apache License 2.0
1.07k stars · 129 forks

Server-side error with vercel 'ai' npm package #126

Closed · gardner closed this issue 4 months ago

gardner commented 9 months ago

Hi there, just reporting this in case it's an easy fix. It looks like vercel's ai npm package wraps things up in a weird way that isn't compatible with Lunary.

This can be reproduced by cloning the ai-chatbot and wrapping the openai call with monitorOpenAI in ./app/api/chat/route.ts

./app/api/chat/route.ts:40:31
20:50:29.356 | Type error: Argument of type 'ChatCompletion' is not assignable to parameter of type 'Response | AsyncIterableOpenAIStreamReturnTypes'.
20:50:29.357 |
20:50:29.357 | 38 |   })
20:50:29.357 | 39 |
20:50:29.357 | > 40 |   const stream = OpenAIStream(res, {
20:50:29.357 |      |                               ^
20:50:29.357 | 41 |     async onCompletion(completion) {
20:50:29.357 | 42 |       const title = json.messages[1].content.substring(0, 100)
20:50:29.357 | 43 |       const id = json.id ?? nanoid()
20:50:29.409 | ELIFECYCLE  Command failed with exit code 1.
20:50:29.436 | Error: Command "pnpm run build" exited with 1
20:50:29.916
hughcrt commented 8 months ago

Hi,

We need to update our SDK and examples to explain how to make this work with Next.js. In the meantime, here is an example that works well with the Next.js App Router and Lunary:

import OpenAI from "openai";
import lunary from "lunary";
import { monitorOpenAI } from "lunary/openai";

lunary.init({ appId: "..." });
const openai = new OpenAI({
  apiKey: "sk-...",
});

monitorOpenAI(openai);
export const runtime = "edge";

export async function GET() {
  const result = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    temperature: 0.9,
    stream: true,
    messages: [
      { role: "system", content: "You are a helpful assistant" },
      { role: "user", content: "Print a random string" },
    ],
  });

  const stream = iteratorToStream(result);
  return new Response(stream);
}

function iteratorToStream(iterable: AsyncIterable<any>) {
  const encoder = new TextEncoder();
  // The OpenAI stream is an AsyncIterable, not a raw iterator,
  // so grab its iterator before pulling from it.
  const iterator = iterable[Symbol.asyncIterator]();
  return new ReadableStream({
    async pull(controller) {
      try {
        const { value, done } = await iterator.next();
        if (done) return controller.close();
        const bytes = encoder.encode(JSON.stringify(value) + "\n");
        controller.enqueue(bytes);
      } catch (error) {
        controller.error(error);
      }
    },
  });
}
gardner commented 8 months ago

This is still an issue:

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY
})
const res = await openai.chat.completions.create({
  model: 'gpt-4-0125-preview',
  user: userId,
  messages,
  temperature: 0.7,
  stream: true
})

res is of type Stream<OpenAI.Chat.Completions.ChatCompletionChunk>

const openai = monitorOpenAI(
  new OpenAI({
    apiKey: process.env.OPENAI_API_KEY
  })
)
const res = await openai.chat.completions.create({
  model: 'gpt-4-0125-preview',
  user: userId,
  messages,
  temperature: 0.7,
  stream: true
})

res is of type OpenAI.Chat.Completions.ChatCompletion
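A plausible explanation for this mismatch (sketched below with made-up types, not Lunary's actual internals): the OpenAI SDK selects the return type of create from the stream flag via overloads, and a wrapper that re-declares create with a single merged signature collapses those overloads down to the non-streaming one. A wrapper that forwards the declared function type verbatim keeps both:

```typescript
type Chunk = { delta: string };
type Completion = { text: string };

// Two overloads, analogous to openai.chat.completions.create:
// the `stream` flag decides the static return type.
function create(opts: { stream: true }): AsyncIterable<Chunk>;
function create(opts: { stream?: false }): Completion;
function create(opts: { stream?: boolean }): AsyncIterable<Chunk> | Completion {
  if (opts.stream) {
    return (async function* () {
      yield { delta: "hi" };
    })();
  }
  return { text: "hi" };
}

// A wrapper that forwards the full declared function type keeps both
// overloads intact; re-declaring create with one signature would not.
function monitored(fn: typeof create): typeof create {
  return ((opts: any) => fn(opts)) as typeof create;
}

const wrapped = monitored(create);
const streaming = wrapped({ stream: true });  // AsyncIterable<Chunk>
const single = wrapped({ stream: false });    // Completion
```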

gardner commented 8 months ago

As a workaround:

It seems to work in spite of the type error. Adding @ts-ignore allows the build stage to pass and then the code operates as expected. e.g.

// @ts-ignore
const stream = OpenAIStream(res, {
  async onCompletion(completion) {

It looks like there are some pretty gnarly type-gymnastics being done inside the openai npm module.
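The reason the @ts-ignore workaround behaves correctly at runtime is that the value really is an async-iterable stream; only its declared type is wrong. A minimal self-contained illustration (all names here are made up for the sketch):

```typescript
type Completion = { text: string };

// Simulate a wrapper whose declared return type is the non-streaming
// Completion, even though the value it actually returns at runtime is
// an async iterable of chunks.
function wrappedCreate(): Completion {
  const chunks = (async function* () {
    yield { delta: "a" };
    yield { delta: "b" };
  })();
  return chunks as unknown as Completion; // the lie lives only in the types
}

const res = wrappedCreate();

// At runtime the stream is still there, so iterating works despite
// the static type claiming otherwise.
async function collect(): Promise<string[]> {
  const out: string[] = [];
  // @ts-ignore -- res is statically a Completion, dynamically an AsyncIterable
  for await (const chunk of res) out.push(chunk.delta);
  return out;
}
```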

linear[bot] commented 6 months ago

LLM-769 Server-side error with vercel 'ai' npm package

hughcrt commented 5 months ago

Hi @BigStar-2024, yes you can

7HR4IZ3 commented 4 months ago

Regarding the types, everything seems to be working fine for me:

Although I did update the code to use the tee function on the OpenAI SDK's Stream object (from openai/streaming) where possible, so that the value returned by create is the same whether or not you wrap OpenAI with monitorOpenAI.

I'll write some tests for it then submit a PR
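The tee approach mentioned above can be sketched in a standalone form: split one async stream into two branches, so one copy can be consumed for monitoring while the other is handed back to the caller unchanged. (The OpenAI SDK's Stream class exposes a tee() method with this behavior; the simplified version below is only for illustration and assumes the branches are consumed sequentially, not concurrently.)

```typescript
// Duplicate a single async iterator into two independent branches.
// Each value pulled from the source is buffered for both branches,
// so consuming one branch does not starve the other.
function teeAsync<T>(source: AsyncIterator<T>): [AsyncIterable<T>, AsyncIterable<T>] {
  const buffers: T[][] = [[], []];
  let done = false;

  async function pull(): Promise<void> {
    const r = await source.next();
    if (r.done) {
      done = true;
      return;
    }
    buffers[0].push(r.value);
    buffers[1].push(r.value);
  }

  function branch(i: number): AsyncIterable<T> {
    return {
      [Symbol.asyncIterator]() {
        return {
          async next(): Promise<IteratorResult<T>> {
            while (buffers[i].length === 0 && !done) await pull();
            if (buffers[i].length > 0) {
              return { value: buffers[i].shift()!, done: false };
            }
            return { value: undefined as any, done: true };
          },
        };
      },
    };
  }

  return [branch(0), branch(1)];
}
```

With this shape, a monitoring wrapper can consume one branch internally and return the other, so the caller sees the same streaming value whether or not the client is wrapped.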

Screenshots attached for all four cases: with Lunary and streaming, with Lunary without streaming, without Lunary with streaming, and without Lunary without streaming.