Closed: savchukoleksii closed this issue 3 years ago
The underlying API package has an in-memory (array) store by default for tracking REST/GraphQL calls and limits.
You could implement the store interface and point it at a central place to dump the data, like Redis.
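For illustration, a rough sketch of what a Redis-backed store could look like. The interface name and method signatures here are assumptions — check the underlying API package for the actual contract to implement; only the Laravel `Redis` facade calls are real:

```php
<?php

use Illuminate\Support\Facades\Redis;

/**
 * Sketch of a Redis-backed call-limit store. The method names below are
 * assumptions about the package's state-storage interface; verify against
 * the underlying API package before using.
 */
class RedisStateStorage /* implements the package's state-storage interface */
{
    public function all(string $shop): array
    {
        // Read the recorded API calls for this shop back out of Redis.
        return array_map(
            fn (string $raw) => json_decode($raw, true),
            Redis::lrange("api_calls:{$shop}", 0, -1)
        );
    }

    public function push(string $shop, array $data): void
    {
        // Append one call record and keep only recent history.
        Redis::rpush("api_calls:{$shop}", json_encode($data));
        Redis::expire("api_calls:{$shop}", 3600);
    }

    public function reset(string $shop): void
    {
        Redis::del("api_calls:{$shop}");
    }
}
```

With something like this, every worker on every server sees the same call history, so the limit tracking becomes app-wide instead of per-process.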
As a note, since this is not an issue per se, please move it to the Discussions tab if possible.
@osiset Moved into discussions https://github.com/osiset/laravel-shopify/discussions/694. Can you move your comments also? Issue closed.
Problem
A queue with multiple workers processing many jobs that talk to the Shopify API (or any other rate-limited API) must deal with API race conditions. For example: Laravel Horizon is started with 1000 workers, and we dispatch many jobs (say 500) that call the Shopify API under a single API key. Horizon will process the jobs in parallel with nothing preventing more than 1 job running per shop at a time. The desired behavior is to keep 1 job running at a time per API key (or several, but without exceeding the Shopify rate limiter), and also to avoid long-running processes that freeze in API retries.
Solutions
Keep 1 worker per shop. This is a very bad idea, because we don't know how many tenants will use our app, and it will very quickly exceed server memory.
Limit the queue to process only 1 job per API key. This is a better solution, but Laravel currently has no built-in way to do it.
A possible solution, though it needs review and improvement:
The idea of this solution is to use Laravel's `Redis::funnel` to prevent many jobs from working with the Shopify API at once. With the funnel alone, however, we cannot control how many attempts and retries a job executes. To get around this, we can combine it with Laravel's `retryUntil` feature in the job. Example solution:
Job:
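A minimal sketch of such a job, assuming Horizon with Redis queues (the class name `SyncShopifyProducts`, key names, and timings are placeholders; `Redis::funnel` and `retryUntil` are standard Laravel APIs):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Redis;

class SyncShopifyProducts implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public string $apiKey)
    {
    }

    public function handle(): void
    {
        // Allow only 1 concurrent job per API key. block(0) means "do not
        // wait for the lock": if the funnel is full, fall through to the
        // failure callback immediately instead of tying up a worker.
        Redis::funnel("shopify:{$this->apiKey}")
            ->limit(1)
            ->block(0)
            ->then(function () {
                // ... do the actual Shopify API work here ...
            }, function () {
                // Lock not obtained: put the job back on the queue with a
                // small delay rather than treating it as a failed attempt.
                $this->release(10);
            });
    }

    // With release()-based retries the attempt counter is not a reliable
    // bound, so limit the job's total lifetime with retryUntil() instead.
    public function retryUntil(): \DateTime
    {
        return now()->addHours(2);
    }
}
```

When `retryUntil` is defined, Laravel uses the timeout rather than a max-attempts count, so a job that keeps losing the funnel lock will be retried until the deadline and then marked as failed.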
So we will keep only 1 running job per API key at a time, but in this case all jobs must be atomic and must not depend on each other, because retries can reorder them.
Open for discussion.