Yaffle / EventSource

a polyfill for http://www.w3.org/TR/eventsource/
MIT License

Why does EventSourcePolyfill send requests forever? #216

Closed jiangxiaoqiang closed 1 year ago

jiangxiaoqiang commented 1 year ago

I am using EventSourcePolyfill to receive server-sent events (SSE). This is my client code in a React project:

import { chatSseAskAction } from '../../action/chat/ChatAction';
import { IChatAsk } from '../../models/chat/ChatAsk';
import { EventSourcePolyfill } from 'event-source-polyfill';
import { v4 as uuid } from 'uuid';
import store from '../../store/store';

export function doSseChatAsk(params: IChatAsk) {
  debugger
  let eventSource: EventSourcePolyfill;
  const accessToken = localStorage.getItem("x-access-token");
  // https://stackoverflow.com/questions/6623232/eventsource-and-basic-http-authentication
  eventSource = new EventSourcePolyfill('/ai/stream/chat/ask?question=hello', {
    headers: {
      'x-access-token': accessToken ?? "",
      'x-request-id': uuid(),
    }
  });
  eventSource.onmessage = e => {
    store.dispatch(chatSseAskAction(e.data));
  };

}
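
For reference, a hypothetical instrumented variant of the same call, with onopen/onerror logging added (both handlers, like readyState, are part of the standard EventSource API that the polyfill mirrors), makes the reconnect cycle visible in the console. This is only a debugging sketch, not part of the original code:

import { EventSourcePolyfill } from 'event-source-polyfill';

// Debugging sketch: log every open and error so reconnects show up in the console.
// readyState is 0 (CONNECTING), 1 (OPEN) or 2 (CLOSED), as in the EventSource spec.
export function doSseChatAskDebug(accessToken: string) {
  const eventSource = new EventSourcePolyfill('/ai/stream/chat/ask?question=hello', {
    headers: { 'x-access-token': accessToken },
  });
  eventSource.onopen = () => console.log('SSE open, readyState =', eventSource.readyState);
  eventSource.onmessage = e => console.log('SSE message:', e.data);
  eventSource.onerror = e => {
    // Fires each time the connection drops; the polyfill then reconnects
    // after its retry timeout unless close() is called.
    console.log('SSE error, readyState =', eventSource.readyState, e);
  };
  return eventSource;
}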

When the button in the UI is clicked, it invokes this function to fetch the server-side SSE messages, so as I understand it the function should only be triggered once per click. Why is the request sent again and again?

(screenshot)

Why did this happen, and what should I do to fix it? I tried using an onclose callback, but the EventSourcePolyfill library does not seem to provide one. I also checked the onerror callback:

(screenshot)

It did not show any useful information. I have already tested the server side with this command:


➜  ~ curl -X GET -H 'Content-Type: application/json' -H 'x-request-id:1' -H 'Cache-Control:no-cache'  -H 'x-access-token: eyJhbGciOiJIJ9.eyJZTDkiLC2Nzk.G9Ddi5sBMmiKNaD_vKni-gzN5kdT6426ruo1EDDV29SCFwI0CqlS5hKg6D7Q' -N https://ai.example.top/ai/stream/chat/ask\?question\=1

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": " world", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": "\n", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": "\n", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": "Hello", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": " world", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": "!", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": " It", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": "'s", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": " nice", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": " to", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": " meet", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": " you", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data:{"id": "cmpl-6xtZ5KyFfDfa97NJrGHqq2DdsXJQE", "object": "text_completion", "created": 1679732963, "choices": [{"text": ".", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}
Yaffle commented 1 year ago

Looks like the server closes the connection, and the polyfill then retries it after the retry timeout.
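
That matches standard EventSource behaviour: whenever the server ends the response, the client fires an error event and reconnects after the retry timeout (the server can adjust the delay with a retry: field in the stream), and the loop only stops once eventSource.close() is called. Below is a minimal sketch of stopping the reconnects on the client, assuming the server emits some terminal marker at the end of each answer; the [DONE] line is only an assumption here, this server may use a different marker or none at all.

import { EventSourcePolyfill } from 'event-source-polyfill';

export function askOnce(url: string, accessToken: string, onChunk: (data: string) => void) {
  const eventSource = new EventSourcePolyfill(url, {
    headers: { 'x-access-token': accessToken },
  });
  eventSource.onmessage = e => {
    if (e.data === '[DONE]') {
      // Hypothetical end-of-stream marker; close() stops the automatic
      // reconnect that otherwise follows when the server closes the response.
      eventSource.close();
      return;
    }
    onChunk(e.data);
  };
  eventSource.onerror = () => {
    // For a one-shot question/answer stream, closing here also ends the retry
    // loop; a long-lived feed would instead let the polyfill reconnect.
    eventSource.close();
  };
}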

jiangxiaoqiang commented 1 year ago

I am not sure about that, but I have already tested the server side with this command:

➜  ~ curl -X GET -H 'Content-Type: application/json' -H 'x-request-id:1' -H 'Cache-Control:no-cache'  -H 'x-access-token: eyJhbGciOiJIJ9.eyJZTDkiLC2Nzk.G9Ddi5sBMmiKNaD_vKni-gzN5kdT6426ruo1EDDV29SCFwI0CqlS5hKg6D7Q' -N https://ai.example.top/ai/stream/chat/ask\?question\=1

It returns the same stream of data: lines shown above.
Yaffle commented 1 year ago

@jiangxiaoqiang have you found the issue?