Closed chshersh closed 1 year ago
I'm happy that the fetching and the actual downloading are structured the way they are in this change. It means actual decisions can be made based on the status of tools in the configuration.
This might also enable a prediction of how many of the configured tools can actually be downloaded, since there is a rate limit of 60 requests per hour. I think the hard limit we run into right now is 30 tools total: there is 1 API call to find the id of the tag and then 1 call to actually download it.
> It means actual decisions can be made based on the status of tools in the configuration.
Indeed, that's the plan 🙂
> This might also enable a prediction of how many of the configured tools can actually be downloaded, since there is a rate limit of 60 requests per hour. I think the hard limit we run into right now is 30 tools total: there is 1 API call to find the id of the tag and then 1 call to actually download it.
If you use a GitHub token, the limit is actually 5000 (and not 60). I believe we have some capacity until people need more than 2500 tools 👍🏻
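The arithmetic in this exchange can be sketched as a quick sanity check (assuming 2 API calls per tool, as described above):

```python
CALLS_PER_TOOL = 2  # 1 call to resolve the tag id + 1 call to download the asset

def max_tools(rate_limit: int) -> int:
    """How many tools fit within a given GitHub API rate limit."""
    return rate_limit // CALLS_PER_TOOL

print(max_tools(60))    # unauthenticated limit -> 30
print(max_tools(5000))  # limit with a GitHub token -> 2500
```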
Turns out my `GITHUB_TOKEN` had a space in it :sweat:.
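Not part of this PR, but a defensive trim would guard against exactly this mistake. A minimal shell sketch (the variable name and usage are assumptions, not code from the PR):

```shell
# Strip any whitespace accidentally pasted into the token
GITHUB_TOKEN="$(printf '%s' "$GITHUB_TOKEN" | tr -d '[:space:]')"

# The cleaned token can then be sent in the Authorization header, e.g.:
# curl -H "Authorization: Bearer $GITHUB_TOKEN" https://api.github.com/rate_limit
```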
Here is the final GIF after applying suggestions in this PR:
Resolves #33
This PR changes the fetching algorithm significantly in an attempt to improve the final output. Specifically,
A single GIF is worth a thousand words: