masardee opened this issue 5 years ago
Looks like you are being throttled by the GitLab.com API endpoint for making too many requests too quickly. I believe the throttle rate is calculated per minute, so if you're hammering "reload" during development you may be hitting that limit prematurely.
However, 30 pages of commits could be triggering it on its own. If you can't get everything done in one burst, you have some options: spread the requests out over time, back off and retry whenever you get a 429, or run the sync less frequently.
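One common way to handle this, pausing and retrying when a 429 comes back, can be sketched roughly like this. This is a generic illustration, not Stitch's or GitLab's actual code; `fetch_page` is a hypothetical callable standing in for whatever makes the API request:

```python
import time

def fetch_with_backoff(fetch_page, max_retries=5, base_delay=1.0):
    """Call fetch_page() and retry with exponential backoff on HTTP 429.

    fetch_page is assumed to return a (status_code, payload) tuple;
    the name and signature are hypothetical, for illustration only.
    """
    for attempt in range(max_retries):
        status, payload = fetch_page()
        if status != 429:
            return payload
        # Throttled: wait 1s, 2s, 4s, ... before trying again, so the
        # per-minute request budget has a chance to reset.
        time.sleep(base_delay * (2 ** attempt))
    raise RuntimeError("still throttled after %d retries" % max_retries)
```

If the response includes a `Retry-After` header, honoring that value instead of a fixed exponential schedule is usually even safer.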
It sounds like you may be using a hosted service to do the API scraping, in which case you probably don't have direct access to the scraper's source code. In that case you'll need to contact support to find out the best next steps, since you either need to change the behavior of the scraper (Stitchdata) or the behavior of the API it's connecting to (GitLab.com). Good luck.
PS - I'm just a user, not especially knowledgeable about this particular codebase ... but a partially informed answer is better than no answer at all, right?
I used stitchdata.com to extract my GitLab data and store it in another DB. There are more than 100 projects defined in my GitLab instance.
But the sync process always ends with a "429 Too Many Requests" error.
This is my log:
Can anyone help?