rawandahmad698 / PyChatGPT

⚡️ Python client for the unofficial ChatGPT API with auto token regeneration, conversation tracking, proxy support and more.
MIT License
4.22k stars 447 forks

[BUG] The chat.ask gives the output extremely slowly #106

Closed UluuGashim048 closed 1 year ago

UluuGashim048 commented 1 year ago

I am sending only one request, and chat.ask gives the output only after 2-3 minutes. What might the issue be? Thank you!

rawandahmad698 commented 1 year ago

Not a bug or an issue. Speed is related to the load on their backend servers, your internet connection, etc.

UluuGashim048 commented 1 year ago

But it works quickly with a different PyChatGPT package using the same credentials @rawandahmad698

gerrywastaken commented 1 year ago

@UluuGashim048 does the other package spit out the entire result all at once, or does it stream the result word by word?

rawandahmad698 commented 1 year ago

> @UluuGashim048 does the other package spit out the entire result all at once, or does it stream the result word by word?

It's not handled as a text stream in the code, so it's all at once. Speed is very good on my end, so it must be his internet or region.
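The distinction above matters for perceived speed: a client that buffers the whole reply shows nothing until the last token arrives, while a streaming client shows the first word almost immediately. A minimal sketch of that difference, using a hypothetical `fake_server_chunks` generator as a stand-in for the backend (this is not PyChatGPT's actual API):

```python
import time

def fake_server_chunks():
    # Hypothetical stand-in for a backend emitting the answer word by word.
    for word in ["The", "answer", "is", "42"]:
        time.sleep(0.05)  # simulated network / generation delay per chunk
        yield word

def buffered_ask(chunks):
    # Non-streaming: nothing is returned until every chunk has arrived.
    return " ".join(chunks)

def streaming_ask(chunks):
    # Streaming: each chunk becomes usable as soon as it arrives.
    for word in chunks:
        yield word

# Buffered: total wait before ANY output is the sum of all chunk delays.
start = time.monotonic()
full_reply = buffered_ask(fake_server_chunks())
buffered_wait = time.monotonic() - start

# Streaming: the first word is available after a single chunk delay.
start = time.monotonic()
first_word = next(streaming_ask(fake_server_chunks()))
first_word_wait = time.monotonic() - start

print(full_reply)
print(buffered_wait > first_word_wait)
```

Both consumers receive the same text, but the streaming one starts rendering roughly four times sooner here, which is why two packages hitting the same backend can feel very different in speed.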

gerrywastaken commented 1 year ago

@rawandahmad698 Fair enough, I hadn't actually checked whether the response was streamed or not and kinda assumed. I did previously notice slower results via the library while getting much faster results in my browser at the same time for the same query. However, this could just be random chance... unless it is somehow also related to the shadowban... (I think there should be a better term for that, perhaps shadow-limited.)