Open Baralheia opened 5 days ago
Hi there! I'm using v0.6.2 of this script in Firefox 131.0.3 on Linux Mint 23. After removing around 7k tweets, I began seeing a ton of HTTP 429: Too Many Requests errors in the console. The script appears to still be chugging away, and the number of reported deletions continues to increase despite these errors. Could the script detect when the server is overwhelmed with requests and either pause or throttle itself, to maximize the number of tweets removed in a single pass?
I determined from the errors in the console that Twitter was responding with "local_rate_limited" when the script called the DeleteTweet endpoint. A couple of years ago, this local rate limit was found to kick in at around 600 requests per minute (https://devcommunity.x.com/t/getting-rate-limited-http-code-429-with-local-rate-limited-but-without-x-rate-limit-remaining-and-x-rate-limit-reset/169036), or roughly 10 requests per second.
As a hacky workaround (since I'm not familiar with userscript coding), I inserted "await this.sleep(110)" immediately after line 367. This slows the requests down to just over 9 per second and has let the script run without being rate limited.
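For reference, a minimal sketch of that fixed-delay approach is below; the sleep helper, the deleteTweet callback, and the surrounding loop are illustrative stand-ins, not the script's actual code (the real script already has its own this.sleep helper):

    // Sketch only: fixed delay between DeleteTweet calls.
    // "deleteTweet" stands in for whatever function the script uses to hit the endpoint.
    const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

    async function deleteAllWithFixedDelay(tweetIds, deleteTweet) {
      for (const id of tweetIds) {
        await deleteTweet(id); // call the DeleteTweet GraphQL endpoint
        await sleep(110);      // ~9 requests/second, just under the ~10/s local limit
      }
    }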
While this is a good enough solution for me, hopefully someone who actually knows what they're doing can implement something far more elegant than what I did haha.
The console output from the script every time this happened was as follows:
22:37:09.366 Response { type: "basic", url: "https://x.com/i/api/graphql/VaenaVgh5q5ih7kvyVjgtg/DeleteTweet", redirected: false, status: 429, ok: false, statusText: "", headers: Headers(6), body: ReadableStream, bodyUsed: false } debugger eval code:369:25
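For anyone who wants to implement the pause/throttle behaviour properly, one option would be a 429-aware retry with exponential backoff, roughly like the sketch below. The doDelete callback and the Retry-After handling are assumptions for illustration, not the script's actual code:

    // Sketch only: retry a DeleteTweet call when the server answers 429,
    // backing off exponentially between attempts. "doDelete" is a placeholder
    // for whatever function the script already uses to issue the request.
    const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

    async function deleteWithBackoff(doDelete, maxRetries = 5) {
      let delayMs = 1000; // start with a one-second pause
      for (let attempt = 0; attempt <= maxRetries; attempt++) {
        const response = await doDelete();
        if (response.status !== 429) return response; // success, or an unrelated error
        // The 429s here reportedly come without X-Rate-Limit-* headers, so fall
        // back to our own delay if Retry-After is missing.
        const retryAfter = response.headers.get("retry-after");
        await sleep(retryAfter ? Number(retryAfter) * 1000 : delayMs);
        delayMs *= 2; // 1s, 2s, 4s, ...
      }
      throw new Error("DeleteTweet still rate limited after retries");
    }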