Closed by m3nu 1 year ago
Makes sense. May be a bit before I can circle around to look into it. In the meantime, PRs welcome. There are a couple of other providers (don't remember which offhand) that have some retry behavior. Some problems come from client libs (e.g., I think NS1); others will be implemented in the provider itself.
Ah, took a look at the documentation you linked:
For authenticated requests you can make up to 2400 requests per hour. For unauthenticated requests, you can make up to 30 requests per hour.
Based on the per-hour nature of their rate limiting implementation, I don't think it'd make sense to try to catch and handle them. It would be easy enough, but the sync would have to hang and sit for an hour to get to the point where the rate limit resets. In that case I think it'd be preferable to abort and let the user try again later.
I’d prefer to start the import and see it finished. Even if it takes longer. Managing the limit manually is pretty tiring. I kept missing the refresh interval and it took me longer than needed.
This Twitter lib has a parameter sleep_on_rate_limit, which is similar to what I suggest. https://python-twitter.readthedocs.io/en/latest/rate_limits.html
Aborting also means downloading all entries again, which can take a few hundred calls in itself.
This is just a nice-to-have suggestion, btw. Accepting an error when there are more than ~2000 new records and running it again is fine for most one-off cases.
This Twitter lib has a parameter sleep_on_rate_limit, which is similar to what I suggest. https://python-twitter.readthedocs.io/en/latest/rate_limits.html
That seems like a reasonable approach. It can default to False, as hanging for an hour in most automated/maintenance cases wouldn't be desirable. So long as it's documented in the README, it should be easy enough to find for people who want it.
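A minimal sketch of what an opt-in flag like that could look like, modeled on python-twitter's sleep_on_rate_limit. The names here (call_with_rate_limit, RateLimitError, reset_in) are hypothetical illustrations, not part of this project or any library:

```python
import time


class RateLimitError(Exception):
    """Hypothetical stand-in for a 429 from the API client."""

    def __init__(self, reset_in):
        super().__init__("429 Too Many Requests")
        self.reset_in = reset_in  # seconds until the rate limit resets


def call_with_rate_limit(func, sleep_on_rate_limit=False, max_retries=3,
                         _sleep=time.sleep):
    """Call func(); on a rate limit, either re-raise (default) or sleep and retry."""
    for attempt in range(max_retries + 1):
        try:
            return func()
        except RateLimitError as exc:
            # Default behavior: abort immediately, as discussed above.
            if not sleep_on_rate_limit or attempt == max_retries:
                raise
            _sleep(exc.reset_in)
```

With sleep_on_rate_limit=False (the default) the first 429 propagates to the caller; with it enabled, the sync hangs until the limit resets and then continues, which matches the "start the import and see it finished" use case.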
Currently an error is thrown when hitting the rate limit. Would be nice to just sleep and keep retrying.
requests.exceptions.HTTPError: 429 Client Error: Too Many Requests for url: ...
Docs: https://developer.dnsimple.com/v2/#rate-limiting
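For reference, the linked DNSimple docs describe rate limit headers on responses (including a reset timestamp). A small sketch of turning those headers into a sleep duration; the header name and epoch-seconds format are taken from the linked docs, and the fallback value is an arbitrary assumption:

```python
import time


def seconds_until_reset(headers, now=None):
    """Compute how long to sleep from a 429 response's headers.

    X-RateLimit-Reset (epoch seconds, per the DNSimple docs) tells us
    when the window resets; fall back to a fixed wait if it's absent.
    """
    now = time.time() if now is None else now
    reset = headers.get("X-RateLimit-Reset")
    if reset is not None:
        return max(0.0, float(reset) - now)
    return 60.0  # arbitrary conservative fallback
```

A retrying sync loop could catch the 429 HTTPError shown above, pass response.headers to a helper like this, and sleep for the returned duration before retrying.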