lukeed / fetch-event-stream

A tiny (736b) utility for Server-Sent Event (SSE) streaming via `fetch` and the Web Streams API
MIT License

Error "iter is not async iterable" in mod.js #4

Closed hmmhmmhm closed 7 months ago

hmmhmmhm commented 7 months ago

Hello, I'm encountering issues with iteration when attempting to use this library with TypeScript in a Create React App (CRA) environment. Could you advise if there are any additional configurations or settings required in the tsconfig or elsewhere?

tsconfig.json

{
  "compilerOptions": {
    "target": "esnext",
    "lib": ["dom", "dom.iterable", "esnext", "esnext.asynciterable"],
    "allowJs": true,
    "skipLibCheck": true,
    "esModuleInterop": true,
    "allowSyntheticDefaultImports": true,
    "strict": true,
    "forceConsistentCasingInFileNames": true,
    "noFallthroughCasesInSwitch": true,
    "module": "esnext",
    "moduleResolution": "node",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "jsx": "react-jsx"
  },
  "include": ["src"]
}

package.json

  "dependencies": {
    "fetch-event-stream": "^0.1.3",
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "react-scripts": "5.0.1",
    "typescript": "^4.9.5"
  },

JohnRSim commented 7 months ago

Update your package to 0.1.5. I had this issue yesterday and a new update just went out.

hmmhmmhm commented 7 months ago

I'll try it now!

hmmhmmhm commented 7 months ago

@JohnRSim I upgraded to the latest version, which resolved the previous error. However, I've encountered a new issue: the Server-Sent Events (SSE) I'm receiving from the server are not looping.

import { stream } from "fetch-event-stream";

const question = async (threadId: string, message: string) => {
  const url = `${process.env.baseUrl}/v1/api/assistant/question`;
  const events = await stream(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      threadId,
      message,
    }),
  });
  return events;
};

const events = await question(chatState.threadId, question);
console.log("events", events);
for await (let event of events) {
  console.log(">>", event.data);
}

As shown in the screenshot, the `for await` loop body never executed.

Here's what I'm currently using without this module, and without any errors.

const question = async (threadId: string, message: string) => {
  const url = `${process.env.baseUrl}/v1/api/assistant/question`;
  const response = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      threadId,
      message,
    }),
  });

  if (!response.body) {
    throw new Error("ReadableStream not yet supported in this browser.");
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  async function* readChunks() {
    let done = false;
    while (!done) {
      const result = await reader.read();
      if (result.done) {
        done = true;
      } else {
        const chunk = decoder.decode(result.value, { stream: true });
        yield chunk;
      }
    }
    yield decoder.decode();
  }
  return readChunks();
};

const texts = await question(chatState.threadId, question);
console.log("events", texts);
for await (let text of texts) {
  console.log(">>", text);
}

I don't know what's wrong, is there something I need to fix?

lukeed commented 7 months ago

Can you share browser information? And can you expand the source so we can see where the syntax error is? The 1:1 in the screenshot isn't helpful.

hmmhmmhm commented 7 months ago

@lukeed Got it, I'll share it ASAP.

hmmhmmhm commented 7 months ago

@lukeed Sorry for the delay in getting to the bottom of this.

This is very hard to believe, but the issue seems to occur when a Wrangler-based Cloudflare Worker receives a stream from OpenAI and forwards it. (Luckily, it's reproducible locally.)

I've written a how-to README.md with code in a separate repo. https://github.com/hmmhmmhm/sample-sse-error

lukeed commented 7 months ago

If you’ve worked with Wrangler / cfw for long enough, weird quirks aren’t hard to believe anymore 😅 I’ll check it out when I’m at my desk next

lukeed commented 7 months ago

@hmmhmmhm You're not actually sending Server-Sent Events. There's a format to follow; see here. Instead you're just streaming text strings, and your test1 is just piping everything to the client instead of trying to parse it (like fetch-event-stream does).
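For reference, the SSE wire format is line-based: each event is one or more `data:` lines terminated by a blank line. A minimal framing sketch (the helper name `toSSE` is illustrative, not part of any library):

```typescript
// Minimal sketch of SSE framing: an event is a "data:" field
// followed by a blank line ("\n\n"). Clients like fetch-event-stream
// only emit an event once they see that terminating blank line.
function toSSE(msg: string): string {
  return `data: ${msg}\n\n`;
}

console.log(JSON.stringify(toSSE("hello"))); // "data: hello\n\n"
```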

To fix your sample app, edit the backend src/index.ts:

    new Promise<void>(async (resolve, reject) => {
      // loop over the data as it is streamed from OpenAI and write it using our writeable
      for await (const part of stream) {
        console.log(part.choices[0]?.delta?.content || "");
--        writer.write(textEncoder.encode(part.choices[0]?.delta?.content || ""));
++        let msg = part.choices[0]?.delta?.content || "";
++        if (msg) {
++          writer.write(textEncoder.encode("data: " + msg + "\n\n"));
++        }
      }
      writer.close();
      resolve();
    });

Now test2 (this lib) will work, because it's now finding & emitting valid event objects. And your test1 implementation will look like this, since it's piping/forwarding the raw bytes:

(Screenshot of the raw streamed output omitted.)

Closing as there's no library issue here. Hope that helps

hmmhmmhm commented 7 months ago

Oh... I didn't realize it had to be a valid event object. Thank you so much...!!

lukeed commented 7 months ago

No problem. In this case, you probably want to use fetch-event-stream in your backend and have it write strings to the Response. It would replace the openai package. (I have a module coming that wraps all of this up as a standalone OpenAI mini-SDK.)

// src/index.ts
import { stream } from 'fetch-event-stream';

// ...

let events = await stream('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${env.OPENAI_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    stream: true,
    messages: [
      { role: "user", content: "Tell me a story" }
    ],
  })
});

let { readable, writable } = new TransformStream();

let writer = writable.getWriter();
let encoder = new TextEncoder();

try {
  for await (let event of events) {
    if (!event.data) continue;
    if (event.data === '[DONE]') break;

    let msg = JSON.parse(event.data);
    let txt = msg.choices[0]?.delta?.content || '';
    if (txt.length > 0) writer.write(encoder.encode(txt));
  }
} finally {
  writer.close();
}

return new Response(readable, {
  // ...
});

And then you keep your original test1 on client-side, where you just forward the incoming strings.
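That client-side loop boils down to decoding raw bytes and appending them; no SSE parsing is needed since the backend now sends plain text. A condensed sketch (the `appendChunk` helper is hypothetical, shown with hard-coded chunks rather than a live fetch):

```typescript
// Sketch: with a plain-text backend, the client just decodes each
// incoming chunk and appends it. { stream: true } lets the decoder
// handle multi-byte characters split across chunk boundaries.
const decoder = new TextDecoder();

function appendChunk(current: string, chunk: Uint8Array): string {
  return current + decoder.decode(chunk, { stream: true });
}

// Simulated chunks standing in for reader.read() results:
let text = "";
text = appendChunk(text, new TextEncoder().encode("Hello"));
text = appendChunk(text, new TextEncoder().encode(" world"));
console.log(text); // "Hello world"
```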

hmmhmmhm commented 7 months ago

@lukeed Can you check one more thing?

As you said, I followed the SSE standard, and now all the leading whitespace disappears.

So I checked what OpenAI was sending me, and I saw that the data arrives with a leading space in front of it. So... I think what's happening is that the space in front of the data is being stripped.

https://github.com/lukeed/fetch-event-stream/blob/main/utils.ts#L9

Is there any way... you could improve this? Above, you showed how to call OpenAI directly as a stream from the server, but I have a complex call to the Assistant API, which requires the openai library.

lukeed commented 7 months ago

I'm not sure what you're asking. The SSE protocol allows for 0 or 1 space after the field name's colon, which is what fetch-event-stream handles. OpenAI sends most of its data chunks with a leading space, except at the start of sentences and before punctuation.
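Concretely, the field-value rule works like this (a sketch of the spec behavior, not the library's exact code): at most one space after the colon is stripped as the separator; any further characters, including more spaces, belong to the value.

```typescript
// Sketch of the SSE field-value rule: strip at most ONE optional
// space after the colon; everything else is part of the value.
function fieldValue(line: string): string {
  const idx = line.indexOf(":");
  let value = line.slice(idx + 1);
  if (value.startsWith(" ")) value = value.slice(1); // strip one space only
  return value;
}

console.log(fieldValue("data: hello"));  // "hello"
console.log(fieldValue("data:  hello")); // " hello" (leading space preserved)
```

So a chunk written as `"data: " + " Lily"` arrives on the wire as `data:  Lily`, and the parser hands back `" Lily"` with its leading space intact.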

The code I posted works fine. The leading spaces concatenate behind each previous word, which is necessary. Here's a final example output (using your FE code):

>>  Lily and the deer, and word of their friendship spread far and wide. People came from all over to see the extraordinary sight of a girl and a deer living as companions.

Lily and the deer's friendship taught the villagers the importance of kindness and compassion towards all living beings. They learned that true friendship knows no bounds and can transcend even the barriers between different species.

And so, Lily and her deer lived happily ever after, their bond forever unbreakable and their story a testament to the power of love and friendship.

The above was streamed in with leading spaces (because that's what OpenAI sent). (Screenshot of the raw chunks omitted.)
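The concatenation behavior can be illustrated with hypothetical delta tokens: OpenAI's chunks typically carry their own leading space, so simply joining them reproduces correctly spaced prose.

```typescript
// Hypothetical delta contents, shaped the way OpenAI typically
// streams them: mid-sentence words carry a leading space, while
// sentence starts and punctuation do not.
const deltas = ["Once", " upon", " a", " time", ",", " Lily", " smiled", "."];

// Forwarding the raw strings and concatenating on the client
// yields correctly spaced text:
const story = deltas.join("");
console.log(story); // "Once upon a time, Lily smiled."
```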

Full BE code (from previous message):

// src/index.ts
import { stream } from "fetch-event-stream";

export interface Env {
  OPEN_AI_KEY: string;
}

export default {
  async fetch(
    request: Request,
    env: Env,
    ctx: ExecutionContext,
  ): Promise<Response> {
    // const openai = new OpenAI({
    //   apiKey: env.OPEN_AI_KEY,
    // });

    let events = await stream("https://api.openai.com/v1/chat/completions", {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${env.OPEN_AI_KEY}`,
        "Content-Type": `application/json`,
      },
      body: JSON.stringify({
        model: "gpt-3.5-turbo",
        stream: true,
        messages: [
          { role: "user", content: "Tell me a story" },
        ],
      }),
    });

    // Using our readable and writable to handle streaming data
    let { readable, writable } = new TransformStream();

    let encoder = new TextEncoder();
    let writer = writable.getWriter();

    try {
      for await (let event of events) {
        if (!event.data) continue;
        console.log("<<", event.data);
        if (event.data === "[DONE]") break;
        let msg = JSON.parse(event.data);
        let text = msg.choices[0]?.delta?.content || "";
        if (text.length > 0) writer.write(encoder.encode(text));
      }
    } finally {
      writer.close();
    }

    // Send readable back to the browser so it can read the stream content
    return new Response(readable, {
      headers: {
        "content-type": "text/event-stream",
        "Cache-Control": "no-cache",
        "Connection": "keep-alive",
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Headers":
          "Origin, X-Requested-With, Content-Type, Accept",
      },
    });
  },
};

hmmhmmhm commented 7 months ago

I updated the example based on the code you shared, but when using the fetch-event-stream library directly on the server, there is no SSE response in the browser. (This happens only with the Test 2 button.)

Wrangler is getting good responses from Open A.I., but the values Wrangler sends to the Browser are not responding correctly. Could you please double check the updated example? https://github.com/hmmhmmhm/sample-sse-error

lukeed commented 7 months ago

Yes, the backend code I pasted doesn't send SSEs to the frontend, so only test1 works. I didn't change anything in the front end.

In your codebase, fetch-event-stream would just replace the "openai" SDK in the backend.

Whether or not you can or want to do that is up to you :)