4as / ChatGPT-DeMod

Tampermonkey/Greasemonkey script that hides the moderation results when communicating with ChatGPT.
GNU General Public License v2.0

DeMod massively lags ChatGPT on long conversations #34

Closed virox closed 7 months ago

virox commented 1 year ago

For about a week now, some of my longest threads have barely been able to generate a reply, or rather, to display the generation properly.

The generated response appears in chunks with long wait times; sometimes nothing happens at all. After reloading the conversation, I can read the response.

When I disabled DeMod, text generation went back to normal speeds.

This happens on Firefox and Chrome, on Android and PC, using Tampermonkey.

4as commented 1 year ago

I'm going to be honest: I think you're mistaken if you think the problem lies in DeMod. I've noticed the performance problem and spent some time investigating, but all I can see is long processing times on OpenAI's side. Here is what I checked:

- Turning off DeMod and turning off Tampermonkey barely changes anything.
- Downgrading to DeMod version 3.4 (which is from August 25) doesn't change anything.
- Using the performance profiler in both Firefox and Chrome shows that almost 90% of JavaScript execution time is spent in framework.js and other ChatGPT scripts. DeMod barely shows up on the graphs.

I'm going to leave this issue open in case someone spots something that I've missed.

virox commented 1 year ago

Thank you for taking the time to reply. The thing is, the lagging immediately stops when DeMod is disabled (via the button on top). Before last week, this didn't happen. After it started, I updated the script from 3.4 to 3.6, but that didn't help.

Since it's happening on every device and browser, it's got to be something that OpenAI changed.

Could the number of removals trigger the lagging? The threads in question are pretty much 80% orange.

4as commented 1 year ago

You already mentioned in your original comment that disabling DeMod helps. I've read it. My answer won't change unless I receive new information. Also, ChatGPT's scripts are so obfuscated that I don't really want to go through them to figure out what they messed up.

ShannonW1950 commented 1 year ago

> Thank you for taking the time to reply. The thing is, the lagging immediately stops when DeMod is disabled (via the button on top). Before last week, this didn't happen. After it started, I updated the script from 3.4 to 3.6, but that didn't help.
>
> Since it's happening on every device and browser, it's got to be something that OpenAI changed.
>
> Could the number of removals trigger the lagging? The threads in question are pretty much 80% orange.

I'll have to experiment some, but my current longest-running story honestly mostly triggers false positives, because the filter is twitchy about some words being used in an innocent context. And at least on mobile, turning DeMod off hasn't seemed to help dramatically? My phone and laptop are both pretty high end. I'm considering condensing this story into a new prompt and basically making a part 2, with the prompt carrying character and plot information from the previous story, to speed things back up. I think it's something on ChatGPT's end with long threads. If you need DeMod badly enough, that could be an option.

nizrhane commented 1 year ago

Hi. I have been using DeMod since Monday, and it has worked fine. Today, however, I noticed that it's causing the lag. I tested this by first turning it off, which stopped the lagging, and confirmed it by opening ChatGPT in another browser without DeMod installed; no lagging happened there.

austin-lopez commented 1 year ago

I can also confirm that I've seen a massive increase in lag out of nowhere over the last few days, and it goes away when DeMod is disabled.

ShannonW1950 commented 1 year ago

Notably, even on long conversations that lag badly on the website no matter what (mobile or otherwise, with or without DeMod), the mobile app is extremely fast, and doesn't appear to censor messages? It flags them but doesn't seem to remove them. I'm wondering if it's something specific to the website.

4as commented 1 year ago

After thoroughly testing the site's behavior I found a way to work around the lag. ChatGPT's code now has significant problems updating text dynamically in conversations. Each time the text has to be changed, it triggers a massive cascading chain of updates that grinds the whole page to a halt. What's worse, the more updates happen at the same time, the longer the freeze lasts. And this includes ALL kinds of changes to the conversation: even simply flipping back and forth through regenerated responses (or resubmitted prompts) can cause the page to lag.
I can't do anything about already generated text, but I can affect the way it's generated.
I released a new version of DeMod that now forces a significant timeout between each text update during the response generation animation. This means the response will be updated in larger chunks rather than smoothly word by word, but at least the page should no longer freeze.
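For anyone curious, the general shape of such a throttle looks something like the minimal sketch below. It is not DeMod's actual code; the names (`queueTextUpdate`, `applyText`, `UPDATE_INTERVAL_MS`) are illustrative, and it only shows the idea of buffering streamed chunks and writing them to the page in larger, less frequent batches.

```javascript
// Hedged sketch (not DeMod's real implementation): buffer streamed text and
// flush it to the DOM at most once per UPDATE_INTERVAL_MS.
const UPDATE_INTERVAL_MS = 500; // minimum delay between DOM writes (assumed value)
let pendingText = "";           // text received since the last DOM write
let lastFlush = 0;              // timestamp of the last DOM write
let flushTimer = null;

// Called for every streamed chunk instead of writing to the DOM immediately.
// `applyText` is a hypothetical callback that appends text to the message element.
function queueTextUpdate(chunk, applyText) {
  pendingText += chunk;
  const now = Date.now();
  if (now - lastFlush >= UPDATE_INTERVAL_MS) {
    flush(applyText);
  } else if (flushTimer === null) {
    // Schedule exactly one flush for when the timeout elapses.
    flushTimer = setTimeout(() => flush(applyText), UPDATE_INTERVAL_MS - (now - lastFlush));
  }
}

function flush(applyText) {
  if (flushTimer !== null) {
    clearTimeout(flushTimer);
    flushTimer = null;
  }
  lastFlush = Date.now();
  const text = pendingText;
  pendingText = "";
  if (text) applyText(text); // one DOM update for the whole buffered chunk
}
```

A real userscript would wire `queueTextUpdate` into whatever handler receives the streamed response, so each incoming token only grows the buffer and the expensive page update happens on the timer instead of on every token.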

I'm going to leave the issue open till OpenAI fixes the performance problems on their side.

nizrhane commented 1 year ago

@4as Thank you very much for the update! I've updated it on my end and it seems to be better now. ^^

Kataphractoi commented 8 months ago

I'm gonna be honest, there's no speed difference observed with DeMod in my browser. I only notice a galloping cadence in the generation; on average, however, it's no different from the speed without DeMod.