openai / openai-node

Official JavaScript / TypeScript library for the OpenAI API
https://www.npmjs.com/package/openai
Apache License 2.0

MaxListenersExceededWarning: Possible EventEmitter memory leak detected. #279

Closed: ManInTheWind closed this issue 1 year ago

ManInTheWind commented 1 year ago

Confirm this is a Node library issue and not an underlying OpenAI API issue

Describe the bug

When I create a completion with stream: true, I get this warning. What should I do?

To Reproduce

import { Readable } from 'stream';

// Inside a service method: this.openAI is an OpenAI client instance and
// chatRequestDto carries the user's identity and prompt.
const stream: Stream<ChatCompletionChunk> =
  await this.openAI.chat.completions.create(
    {
      model: 'gpt-3.5-turbo-0613',
      max_tokens: 3000,
      stream: true,
      temperature: 1,
      user: chatRequestDto.identity,
      messages: [
        {
          role: 'user',
          content: chatRequestDto.prompt,
        },
      ],
    },
    { stream: true },
  );
const readableStream = new Readable();
readableStream._read = async () => {
  for await (const part of stream) {
    console.log('finish_reason:', part.choices[0].finish_reason);
    console.log('part:', part.choices[0].delta);
    if (part.choices[0].finish_reason === 'stop') {
      // stream end
      readableStream.push(null);
      break;
    }
    ...
    // push each chunk to the consumer as a JSON string
    readableStream.push(JSON.stringify(part));
  }
};
return readableStream;
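
For reference, a minimal sketch of the same wrapping done without overriding _read, using Node's Readable.from to consume the SDK stream as an async iterable; toJsonChunks is a hypothetical helper name, not part of the library:

import { Readable } from 'stream';

// Untested sketch: convert SDK chunks to JSON strings and stop at finish_reason 'stop'.
async function* toJsonChunks(stream: AsyncIterable<ChatCompletionChunk>) {
  for await (const part of stream) {
    if (part.choices[0].finish_reason === 'stop') return; // stream end
    yield JSON.stringify(part);
  }
}

// Readable.from wraps the async generator and handles the push/backpressure plumbing.
return Readable.from(toJsonChunks(stream));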

The warning stack trace is below:

(node:1700) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 error listeners added to [PassThrough]. Use emitter.setMaxListeners() to increase limit
(Use `node --trace-warnings ...` to show where the warning was created)
MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 error listeners added to [PassThrough]. Use emitter.setMaxListeners() to increase limit
    at _addListener (node:events:588:17)
    at PassThrough.addListener (node:events:606:10)
    at PassThrough.Readable.on (node:internal/streams/readable:887:35)
    at eos (node:internal/streams/end-of-stream:199:12)
    at createAsyncIterator (node:internal/streams/readable:1114:19)
    at createAsyncIterator.next (<anonymous>)
    at Stream.iterMessages (/develop/node_modules/openai/src/streaming.ts:26:16)
    at iterMessages.next (<anonymous>)
    at Stream.[Symbol.asyncIterator] (/develop/node_modules/openai/src/streaming.ts:40:16)

Code snippets

No response

OS

macOS

Node version

Node v16.4.2

Library version

openai v3.0.1

rattrayalex commented 1 year ago

Can you share a Stackblitz repo to help us reproduce this?

(In the short term, you can ignore this warning)
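
If you want to silence it in the meantime, a minimal sketch following the setMaxListeners() hint in the warning itself (the limit of 20 is an arbitrary example, not a recommended value):

import { EventEmitter } from 'events';

// Raise the default listener limit for emitters created afterwards,
// since the PassThrough the warning refers to lives inside the SDK.
EventEmitter.defaultMaxListeners = 20;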

rattrayalex commented 1 year ago

I'm going to go ahead and close this for now, but if you're able to share a reliable repro, we'll happily reopen!