Closed: schneefux closed this issue 7 years ago
Hey @schneefux You're confusing filter parameters with paging. It's two different things.
We already support and currently provide pagination links in all API responses. All good!
With that said, we consider any request with a window longer than 28 days invalid because the range is too large. This isn't weird or unusual; it's fairly common to provide both defaults and limitations on query parameters.
I think where things are going wrong for you is that you're trying to pull all data for a player, ever. If that's the case, as you noted, you'll need a loop within a loop.
I don't think this is really that much code; only a few lines of it have anything to do with pagination. The rest of it is formatting the request and dealing with errors.
PS: If it was me... I'd do this differently. I'd keep an array or queue or something of requests, and just push entries onto it. First I'd push one query for each month you want to search. Based on the response, if a pagination link was provided, I'd push that onto the queue too. Really small amount of code. I think part of the problem is the way you've broken up the various services.
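To make that concrete, here is a minimal sketch of the queue approach in Python. The URL shape, `filter[...]` parameter names, and the `links.next` field are assumptions about the API, and `fetch` stands in for whatever HTTP helper the client already has:

```python
from collections import deque
from datetime import timedelta

def crawl_player(fetch, player, start, end, window=timedelta(days=28)):
    """Pull all matches for `player` between `start` and `end`.

    `fetch` takes a URL and returns the parsed JSON body. The URL and
    field names below are illustrative, not the real API surface.
    """
    queue = deque()
    # Seed one query per 28-day window of the requested range.
    cursor = start
    while cursor < end:
        chunk_end = min(cursor + window, end)
        queue.append(
            f"/matches?filter[playerNames]={player}"
            f"&filter[createdAt-start]={cursor.isoformat()}"
            f"&filter[createdAt-end]={chunk_end.isoformat()}"
        )
        cursor = chunk_end
    matches = []
    while queue:
        response = fetch(queue.popleft())
        matches.extend(response.get("data", []))
        # If the response carried a pagination link, push it too.
        next_link = response.get("links", {}).get("next")
        if next_link:
            queue.append(next_link)
    return matches
```

The whole crawl is a single loop over one queue; the "loop within a loop" disappears because pagination links are just more queue entries.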
What else is the use case of the API if it's not getting all the data for a given set of filters in a specific time period? Dates shouldn't be a filter on your end; the entire pagination logic should be based on time series. That is how you handle your data retention, that is how you shard your data, that is what clients are interested in, and so on. In my opinion, the "pagination pagination", i.e. `page[limit]` and `page[offset]`, should be replaced by a dynamic time series cursor. So that your query does not look like this (pseudo-SQL):
```sql
SELECT * FROM matches_eu WHERE createdAt BETWEEN filterCreatedAt-start AND filterCreatedAt-end LIMIT pageLimit OFFSET pageOffset
```
but is instead
```sql
SELECT * FROM matches_eu WHERE createdAt BETWEEN createdAt-start AND createdAt-end LIMIT 50
```
with `createdAt-start` and `createdAt-end` being a moving time window between `filter[createdAt-start]` and `filter[createdAt-end]`, passed as state to the client in the `links` object.
You don't risk needing large offsets, which is better for performance and removes client-side restrictions. (I'm making assumptions about your database based on your past statements here and in the Discord chats.)
What does that mean? Should I query 4 * ~30 days if I want all data? Or do you keep 4 full month shards? Or do you keep only 3 and a half if the current month isn't over yet? Will you change the behavior one day?
How do I implement the logic in my client? How do I keep my code current?
In combination with #249, #248, #271 and #236, my code is turning into a mess. It should be KISS. For reference, here are the relevant parts of my code:
`payloads` is forwarded to a microservice that uses the payload to call `/matches`: `api.request`. That is 100 lines just to get a few matches from the API. (Granted, a lot of it is logging, comments and abstraction.)
Can you consider paginating the responses server side using the `links` attribute? That gives you control over the number of shards a single request hits, and you can prevent clients from provoking 500s and wasting requests with broken pagination logic. Instead of erroring when a request exceeds the 28 day limit, you could paginate it, which removes another quirk of the API.