uchicago-mobi / 2015-Winter-Forum


Rate Limits & Grading #98

Closed johnnypaper closed 9 years ago

johnnypaper commented 9 years ago

I have just a few questions:

1) Regarding getting all of the issues: as I mentioned in an earlier post, I had to keep querying and appending to an array until the data returned was nil. This was the only way I could figure out how to do it, because I haven't been able to get the `since` query parameter to work. (Has anyone been able to retrieve all of the issues in a single query, using `since` or any other method?) Is this the way it should be done? I ask because as the issue count grows by 30, each extra page costs another request against the rate limit, on top of all the others. With the limit at 60 queries per hour, testing becomes irritating and extremely time-consuming (get caught at the limit once and you'll see what I mean).

2) Related to the previous question: this could be a major source of irritation for the graders, since their IP addresses will be rate-limited while they grade, and they may have to spread grading over several hours or days waiting for the limit to refresh. So, that said, what is the plan?

I have looked into the rate limit increase and can't bring myself to rip off GitHub. Facebook or Twitter, all day, but not GitHub, because I care about it. So...

Could we limit this to one query returning 30 results, with commented-out code showing how to do it for all issues? I don't know what the answer is, but it seems there should be a solution so that both testing and grading can be relatively smooth.
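For anyone else hitting this: the "query, append, repeat until nil" loop from question 1 can be sketched roughly like below. This is a hedged sketch, not the assignment's actual code; GitHub's REST API pages issue lists with `page`/`per_page` query parameters (30 per page by default), and the network call is stubbed out here with a fake fetcher so the loop itself is clear.

```python
# Sketch of the "keep querying until the data returned is nil" loop.
# fetch_page stands in for the real network call (e.g. a GET against
# /repos/:owner/:repo/issues?page=N); each call costs one rate-limit hit.

def fetch_all_issues(fetch_page):
    """Aggregate paginated results until a page comes back empty."""
    issues = []
    page = 1
    while True:
        batch = fetch_page(page)   # one request -> one count against the limit
        if not batch:              # empty page means we've seen everything
            break
        issues.extend(batch)
        page += 1
    return issues

# Stubbed fetcher simulating 70 issues served 30 per page.
def fake_fetch(page, total=70, per_page=30):
    start = (page - 1) * per_page
    return list(range(start, min(start + per_page, total)))

print(len(fetch_all_issues(fake_fetch)))  # 70, at the cost of 4 requests
```

Note the cost: 70 issues takes 4 requests (3 pages of data plus 1 empty page to detect the end), which is exactly why the unauthenticated 60/hour limit gets eaten so quickly during testing.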

tabinks commented 9 years ago

It is fine to just use the results that come from a single networking call. You do not need to aggregate all the results.
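One mitigation worth mentioning for the rate-limit pain during testing: GitHub exposes a `GET /rate_limit` endpoint that reports your remaining quota and reset time, and checking it does not count against the limit. A hedged sketch of reading that response (the JSON sample below mirrors the documented shape; `reset` is a Unix timestamp):

```python
# Sketch: parse GitHub's /rate_limit response to see how many requests
# remain and how long until the hourly window resets.
import json

# Sample payload in the shape GitHub returns for this endpoint.
sample = json.dumps({
    "rate": {"limit": 60, "remaining": 0, "reset": 1422000000}
})

def quota_status(body, now):
    """Return (requests remaining, seconds until the limit resets)."""
    rate = json.loads(body)["rate"]
    wait = max(0, rate["reset"] - now)
    return rate["remaining"], wait

remaining, wait = quota_status(sample, now=1421999940)
print(remaining, wait)  # 0 remaining, 60 seconds until refresh
```

Checking this before a test run at least tells you whether you're about to hit the wall, instead of finding out via a failed request.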