Splamy / ScoreSaberEnhanced

Provides additional features to the ScoreSaber website
MIT License

Adding a user to my score cache / updating score cache not working #15

Closed: kingwayzuh closed this issue 4 years ago

kingwayzuh commented 4 years ago

Whenever I go to update my cache it starts to load, but it gets stuck at around 80 pages. I'm not sure if I did something to mess it up or not.

Splamy commented 4 years ago

It seems like they added a new rate limit recently at around 70-80 req/min (or less); I haven't had the time to test yet. I'll try to find out by how much we have to slow it down and push a fix then.

tchesket commented 4 years ago

@kingwayzuh Hey, I've created a hacky little work-around for this problem. I'm sure it's not the best way to do it (I don't really know how to program), but at least it works. If you're interested:

//these 3 should be lines 895-897

        if ((!force && has_old_entry) || recent_songs.meta.was_last_page) {
            break;
        }

//insert the following after those 3 lines in the tampermonkey script:

        // every 40 pages, pause long enough for the rate limit window to recover
        // (sleep is the helper already defined elsewhere in the script)
        if (page % 40 === 0) {
            SseEvent.StatusInfo.invoke({ text: `too many pages, sleeping for 25 seconds...` });
            await sleep(25000);
        }

Rockster34 commented 4 years ago

Even after applying this fix it still gets stuck on its own, because it tries to update a page that doesn't exist, as shown in the image below. I have no idea how this is happening though.

[image: script stuck updating a page that doesn't exist]

Splamy commented 4 years ago

Hey there, sorry for the late response, I was a bit busy and forgot about this issue. @Rockster34 it's normal that the script tries to fetch page n+1 once, since there is no indicator in the API of which page is the last, so I just fetch until I get an empty page. It shouldn't get stuck though, so I'll try to look into that today.
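
(For reference, that loop is roughly the following sketch; fetchScorePage is a hypothetical stand-in for the script's actual request helper, not its real API:)

    // Hypothetical sketch: fetchScorePage stands in for whatever request
    // helper the script actually uses, and is assumed to return an array.
    async function fetchAllScores(userId) {
        const songs = [];
        for (let page = 1; ; page++) {
            const entries = await fetchScorePage(userId, page);
            if (entries.length === 0) {
                // the API has no "last page" flag, so the first empty
                // page (page n+1) is what ends the scan
                break;
            }
            songs.push(...entries);
        }
        return songs;
    }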

@tchesket Thanks for the change suggestion (I've seen your pull request). I made some request tests a while ago and I don't think 25 sec every 40 pages is enough, because that would only be 50 sec for 80 pages, and that is still way too fast for the table I have below (I might be wrong though).

Requesting user [2538637699496776]; has 243 pages

Stopped at req no. | Time (s) | RPM | OK
80                 | 27       | 180 |
80                 | 40       | 120 |
134                | 90       | 90  |
138                | 98       | 85  |
ALL                | 183      | 80  | ✓
ALL                | 217      | 70  | ✓
ALL                | 244      | 60  | ✓

(Time appears to be in seconds: e.g. all 243 pages at 80 req/min take about 182 s, matching the 183 s row. ✓ marks the runs that completed.)

They are using some kind of token bucket rate limit, where you get a few request tokens per second while you are requesting, but once you have used up all the tokens you have to wait at least about one minute. My idea would be to burst the first <80 requests and then (or gradually) step down to a slow request pace to finish the rest. This would keep updates with fewer pages very quick, and full/new scans would not take longer than needed.
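
(A rough sketch of that idea, reusing the same hypothetical fetchScorePage helper; the burst size and steady pace are read off the table above, not taken from any actual fix:)

    // Sketch of the burst-then-throttle idea. The 80-request burst and the
    // ~80 RPM steady pace come from the measurements above; fetchScorePage
    // is a hypothetical helper as before.
    const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));
    const BURST = 80;             // roughly the bucket size before requests stall
    const STEADY_MS = 60000 / 80; // ~750 ms per request sustains ~80 RPM

    async function fetchAllScoresPaced(userId) {
        const songs = [];
        for (let page = 1; ; page++) {
            const entries = await fetchScorePage(userId, page);
            if (entries.length === 0) break;
            songs.push(...entries);
            // burst through the first ~80 pages at full speed, then slow
            // down to the sustainable pace so long scans never stall
            if (page > BURST) await sleep(STEADY_MS);
        }
        return songs;
    }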

Splamy commented 4 years ago

I just found out that the new ScoreSaber API sits behind Cloudflare, and it tells you exactly how much you can request in the response headers:

x-ratelimit-limit: 80
x-ratelimit-remaining: 79
x-ratelimit-reset: 1594669651

I'm using that data now to request while allowed and to wait exactly until the limit is reset. This should give optimal request throughput. It will also now retry a few times if it hits a rate limit, but that shouldn't normally happen.
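
(Roughly, the header-driven approach looks like the sketch below; the function name, retry count, and structure are assumptions rather than the script's exact code:)

    // Sketch of header-driven pacing: read Cloudflare's x-ratelimit-* headers
    // and wait exactly until the reset once the quota runs out.
    const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

    async function rateLimitedFetch(url, retries = 3) {
        for (let attempt = 0; attempt <= retries; attempt++) {
            const resp = await fetch(url);
            const remaining = Number(resp.headers.get("x-ratelimit-remaining"));
            const reset = Number(resp.headers.get("x-ratelimit-reset")); // unix seconds
            if (resp.status === 429 || remaining === 0) {
                // sleep exactly until the rate limit window resets
                await sleep(Math.max(0, reset * 1000 - Date.now()));
                if (resp.status === 429) continue; // retry the rejected request
            }
            if (resp.ok) return resp;
        }
        throw new Error(`rate limited fetch failed: ${url}`);
    }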

tchesket commented 4 years ago

It is possible that I made some other small changes in my script that I forgot about, which would have affected whether it worked; I was tweaking a few things here and there. I thought I had undone all of them except the lines I mentioned, but I may have missed something, because my version was working perfectly fine for users with arbitrarily large numbers of pages, e.g. I used it to successfully add Taichidesu to my score cache. But alas, it doesn't really matter now since you fixed it 'properly'. Thanks btw.