ghost opened this issue 5 years ago
Collect all of the API requests you need to make into manageable chunks (for example, one paginated request per follower).
Feed those requests into a queue (e.g. Bull, backed by Redis), use the queue's rate limiter to throttle the requests going to Twitter, and write a worker to process the responses as they come back. A sketch of that setup is below.
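A minimal sketch of that setup with Bull, assuming a local Redis instance and a 900-calls-per-15-minutes window (adjust to whatever limit the endpoint you call actually has); `fetchTweetsForUser` and `saveTweets` are placeholders for your own Twitter client and storage code:

```typescript
import Queue from 'bull';

// Placeholders for your own Twitter client and storage code (assumptions).
async function fetchTweetsForUser(followerId: string): Promise<unknown[]> { return []; }
async function saveTweets(followerId: string, tweets: unknown[]): Promise<void> {}

// One job per follower. Bull's built-in limiter caps how fast the worker
// drains the queue: here at most 900 jobs per 15-minute window.
const tweetQueue = new Queue('fetch-tweets', 'redis://127.0.0.1:6379', {
  limiter: { max: 900, duration: 15 * 60 * 1000 },
});

// Worker: processes each response as it comes back.
tweetQueue.process(async (job) => {
  const { followerId } = job.data;
  const tweets = await fetchTweetsForUser(followerId);
  await saveTweets(followerId, tweets);
});

// Producer: enqueue the whole follower list up front; the limiter takes
// care of pacing the actual API calls.
async function enqueueFollowers(followerIds: string[]) {
  for (const id of followerIds) {
    await tweetQueue.add({ followerId: id });
  }
}
```

Because the jobs live in Redis, the backlog survives restarts and you can run more than one worker process against the same queue.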
Hi,
I need to fetch a large amount of data: tweets from every follower of a specific user, and that user has over 1 million followers.
I have multiple Twitter apps. Is it possible to handle rate limiting by rotating through their API tokens? For example, when one token is about to hit its limit, switch the next request over to the next token.
Is there an example of this?
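Roughly, what I have in mind is something like this sketch (not an official example; the per-window limit, the window reset, and how the token is actually used against the Twitter API are all assumptions):

```typescript
// A credential plus a counter for how many calls it has made this window.
interface AppToken {
  bearerToken: string;
  callsThisWindow: number;
}

class TokenPool {
  private index = 0;

  constructor(
    private tokens: AppToken[],
    private maxCallsPerWindow = 900, // placeholder; use the endpoint's documented limit
  ) {}

  // Return a token that still has quota, advancing round-robin past any
  // token that is already exhausted. Throws if every token is used up.
  next(): AppToken {
    for (let i = 0; i < this.tokens.length; i++) {
      const pos = (this.index + i) % this.tokens.length;
      const candidate = this.tokens[pos];
      if (candidate.callsThisWindow < this.maxCallsPerWindow) {
        this.index = pos;
        candidate.callsThisWindow++;
        return candidate;
      }
    }
    throw new Error('All tokens are rate limited for this window');
  }

  // Call this on a timer (e.g. every 15 minutes) to reset the counters.
  resetWindow(): void {
    this.tokens.forEach((t) => (t.callsThisWindow = 0));
  }
}
```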