Closed: double-beep closed this issue 3 years ago.
I'm not sure GET requests are more efficient than using the API. Ideally, we should make our usage of the API more efficient instead.
GET requests are subject to SE's website rate limiting, which is much more annoying than an API backoff: if you get rate-limited on the website, you have to wait a few hours before you can access it again.
It's also more resource-intensive for SE themselves if we scrape this information with GET requests rather than use the API, which they've made precisely to serve the information we want.
Using the API also lets us cache the data more easily and in a more generic form so we can do it for all functions!
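To illustrate the caching idea, here's a minimal sketch of a generic cache layer that any feature could share. The `makeCachedGet` helper, the TTL, and the in-memory `Map` store are all assumptions for illustration, not SOX's actual implementation; the fetcher is injected so the real `$.get` (or a stub) can be passed in.

```javascript
// Hypothetical sketch: wrap a fetcher with a TTL cache keyed by URL.
// `now` is injectable so the expiry logic is easy to test.
function makeCachedGet(fetcher, ttlMs, now = () => Date.now()) {
  const store = new Map(); // url -> { expires, promise }
  return function cachedGet(url) {
    const hit = store.get(url);
    if (hit && hit.expires > now()) return hit.promise; // serve cached result
    const promise = fetcher(url);
    store.set(url, { expires: now() + ttlMs, promise });
    return promise;
  };
}
```

Any feature would then call `cachedGet` instead of `$.get` directly, so repeated requests for the same data within the TTL never hit the network.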
I think we should tackle the backoffs issue you're mentioning here rather than trying to do GET requests on the pages. How many backoffs do you get? How frequently? This is something that is definitely fixable!
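One way to tackle backoffs: the Stack Exchange API includes a `backoff` field (in seconds) in its response wrapper when the client should pause, and requests sent before that pause elapses risk a temporary ban. A minimal sketch of a gate that honors it (the `makeBackoffGate` helper and its shape are assumptions, not SOX code):

```javascript
// Hypothetical sketch: track the API's `backoff` field and refuse to
// fire new requests until the requested pause has elapsed.
function makeBackoffGate(now = () => Date.now()) {
  let blockedUntil = 0;
  return {
    // true if it is safe to fire another API request right now
    canRequest: () => now() >= blockedUntil,
    // feed every API response wrapper in here
    record(response) {
      if (response && response.backoff) {
        blockedUntil = now() + response.backoff * 1000;
      }
    },
  };
}
```

Every feature sharing one gate would stop the pattern where several features fire requests in parallel and each one triggers a fresh backoff.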
I've been using GETs for some time now: SOX is a lot faster and I've not been rate-limited!
I get backoffs when opening many tabs at once, or when skipping too much in the review queue (`sox-new-review-post-appeared` is triggered).
I'll have to look into this a bit more when I get some time, sorry for the extremely late reply!
We should probably enhance the API requests rather than directly fetching the info from the site.
As of now, 15 SOX features use the API. This makes SOX too slow and results in backoffs. While SOX has to use the API (some info can't be obtained from the site itself!), the features which use it too much should either be rewritten, or the `getFromAPI` method should be removed.

`markEmployees` should `GET` `location.hostname/users/user-info/userid` and check for the `fw-normal fs-fine` classes. For example, user 51 is staff, while user 435726 is not. Here's what I have come up with (it still needs improvement because it throws errors on CW posts):

```javascript
$('.comment-user, .user-details a').not('.sox-markEmployees-logo').each(function() {
  const $this = $(this);
  if (!$this.attr('href')) return; // deleted user
  const userid = $this.attr('href').split('/')[2];
  $.get(`https://${location.hostname}/users/user-info/${userid}`, data => {
    if (!data || !data.match('fw-normal fs-fine')) return;
    $this.append($('<span/>', {
      title: 'employee (added by SOX)',
    }).append($icon.clone()));
  });
});
```

`isQuestionHot` should `GET https://stackexchange.com/hot-questions-for-mobile` (are all HNQs there?). I've come up with:

```javascript
$.get('https://cors-anywhere.herokuapp.com/https://stackexchange.com/hot-questions-for-mobile', list => {
  isHNQ(JSON.stringify(list));
});

function isHNQ(HNQs) {
  if (sox.location.on('/questions/')) {
    if (HNQs.match(`id":${StackExchange.question.getQuestionId()},"title":"${$('h1 .question-hyperlink').html()}`)) addHotText();
  } else if ($('.question-summary').length) {
    $('.question-summary').each(function() {
      if (!$(this).attr('id')) return; // not a Teams post (those don't have ids)
      if (HNQs.match(`id":${$(this).attr('id')},"title":"${$('.question-summary a').html()}"`)) {
        $(this).find('.summary h3').prepend(getHotDiv('question-list'));
      }
    });
  }
}
```

`linkedPostsInline`: `GET /posts/ajax-load-realtime/postid?title=true`. The title is `$(html).data('title')` and the HTML is `$(html).find('.post-text').html()`. Same for `parseCrossSiteLinks` and `addTagsToHNQs` (but fetching tags this time).

`editReasonTooltip`: `GET` the post's timeline and find the last revision's comment through jQuery.

I'm in the process of rewriting these features in order to make SOX faster and stop those annoying backoffs!
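As a rough sketch of the `editReasonTooltip` idea, the revision comment could be pulled out of the timeline page's HTML like this. The `revision-comment` class, the newest-first ordering, and the regex stand-in for jQuery are all assumptions about the timeline markup, not verified against the live site:

```javascript
// Hypothetical sketch: extract the most recent revision comment from a
// post-timeline HTML string (the response of a GET on the post's timeline).
// Assumed markup: one <span class="revision-comment">…</span> per revision,
// newest first, so the first match is the latest edit reason.
function lastRevisionComment(timelineHtml) {
  const match = timelineHtml.match(/<span class="revision-comment">([^<]*)<\/span>/);
  return match ? match[1] : null;
}
```

In SOX proper this would more likely be `$(timelineHtml).find('.revision-comment').first().text()` once jQuery has parsed the response.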
Any more ideas?