acheong08 / ChatGPTProxy

Simple Cloudflare bypass for ChatGPT

reduce message latency by introducing flushing operation #62

Closed acheong08 closed 1 year ago

acheong08 commented 1 year ago

The previous implementation used gin.Stream to copy messages to the proxy client. The response could be buffered on the server side before being sent, which added latency: the proxy client would receive a batch of messages at once, a worse experience than the official OpenAI chatbot. After this commit, output is flushed as it arrives, so proxy clients stream more smoothly.
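
For reference, a minimal sketch of what per-chunk flushing can look like with Gin is below. The package, function name, buffer size, and headers are illustrative assumptions, not the project's actual code:

```go
package proxy

import (
	"io"

	"github.com/gin-gonic/gin"
)

// streamWithFlush copies the upstream response body to the client in small
// chunks and flushes after every write, so tokens reach the proxy client as
// they arrive instead of accumulating in a server-side buffer.
func streamWithFlush(c *gin.Context, upstream io.Reader) {
	c.Header("Content-Type", "text/event-stream")

	buf := make([]byte, 4096)
	for {
		n, err := upstream.Read(buf)
		if n > 0 {
			if _, werr := c.Writer.Write(buf[:n]); werr != nil {
				return
			}
			// gin.ResponseWriter embeds http.Flusher, so this pushes the
			// chunk to the client immediately.
			c.Writer.Flush()
		}
		if err != nil {
			return // io.EOF or a transport error; stop streaming either way
		}
	}
}
```

Flushing after every write trades a little throughput for much lower perceived latency, which is what matters for token-by-token chat output.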

18870 commented 1 year ago

4096 bytes seems to be too large a flush threshold for GPT-4; sometimes it takes 3-6 seconds to flush once.
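
One way to avoid waiting on a large fixed buffer is to flush per SSE line instead of per fixed-size read. The sketch below is illustrative; the package and function name are assumed, not taken from the repository:

```go
package proxy

import (
	"bufio"
	"fmt"
	"io"

	"github.com/gin-gonic/gin"
)

// flushPerLine forwards the upstream SSE stream line by line and flushes
// after each line, instead of waiting for a fixed-size buffer (e.g. 4096
// bytes) to fill. This avoids multi-second stalls when a slow model takes
// a long time to produce enough output to fill the buffer.
func flushPerLine(c *gin.Context, upstream io.Reader) {
	c.Header("Content-Type", "text/event-stream")

	scanner := bufio.NewScanner(upstream)
	for scanner.Scan() {
		// Re-add the newline the scanner strips, then push the line out.
		if _, err := fmt.Fprintln(c.Writer, scanner.Text()); err != nil {
			return
		}
		c.Writer.Flush()
	}
}
```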