Open bogdanrn opened 3 years ago
Hi, thanks @bogdanrn for posting this. This is the issue I wanted to post, but for some reason it wasn't possible, so I asked my friend @bogdanrn to post it. By the way, can someone explain whether there is some limit on issues per user, or am I blacklisted for some reason?
Hi @bogdanrn, @pociej thanks for posting about this! These are valid points, we are currently looking into it. @pociej, I'm not quite sure why you wouldn't be allowed to post 🤔 we definitely don't have a limit or blacklist. Let me know if this is still an issue for you!
Hi @mllemango any update on this?
Thanks Shopify for providing this SDK. It's really a life changer for someone like me who built these implementations by hand. Now, are there any updates on this matter? Thanks!
Hi @mllemango any update?
Hey everyone, we haven't had a chance to get back to this one yet. We'll keep this issue open for others to report the same problem and that will help us prioritize.
Any update on this?
same problem
Hi, could we please prioritise this issue, it is very serious & might be affecting a lot of people using the library, thanks.
This issue is stale because it has been open for 90 days with no activity. It will be closed if no further action occurs in 14 days.
not stale
This should be implemented using a shared queue across all running shopify clients using the same key. (redis for example)
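A minimal in-memory sketch of what such a shared leaky bucket could look like. All names here are illustrative and not part of this library; the default capacity of 40 and leak rate of 2/s match Shopify's documented REST limits for non-Plus shops, and in a multi-process setup the bucket state would live in Redis, keyed by the API key, so every client sharing that key shares the bucket:

```typescript
// Illustrative sketch only, not part of shopify-node-api.
class LeakyBucket {
  private level = 0;        // requests currently "in the bucket"
  private lastLeak: number; // when we last drained the bucket

  constructor(
    private readonly capacity = 40,   // Shopify REST bucket size (non-Plus)
    private readonly leakPerSec = 2,  // Shopify REST leak rate (non-Plus)
    private readonly now: () => number = Date.now, // injectable clock for tests
  ) {
    this.lastLeak = now();
  }

  // Drain the bucket according to the time elapsed since the last call.
  private leak(): void {
    const elapsedSec = (this.now() - this.lastLeak) / 1000;
    this.level = Math.max(0, this.level - elapsedSec * this.leakPerSec);
    this.lastLeak = this.now();
  }

  // True if a request may be sent right now, false if it must wait.
  tryAcquire(): boolean {
    this.leak();
    if (this.level + 1 > this.capacity) return false;
    this.level += 1;
    return true;
  }

  // Milliseconds until the next request would be admitted.
  msUntilAvailable(): number {
    this.leak();
    const overflow = this.level + 1 - this.capacity;
    return overflow <= 0 ? 0 : (overflow / this.leakPerSec) * 1000;
  }
}

// One bucket per API key, so every client created with the same key shares it.
const buckets = new Map<string, LeakyBucket>();
function bucketFor(apiKey: string): LeakyBucket {
  let bucket = buckets.get(apiKey);
  if (!bucket) {
    bucket = new LeakyBucket();
    buckets.set(apiKey, bucket);
  }
  return bucket;
}
```

In production the `level`/`lastLeak` pair would be read and written atomically in Redis (e.g. via a small Lua script) so concurrent processes can't both claim the last slot.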
In addition to this problem, it's also worth pointing out that Rest Resources don't support retries via the tries parameter, whereas all the clients do.
Rest Resources are FAR superior for DX, but a simple timeout is enough to kill a long-running, heavily rate-limited batch process...
FetchError: request to https://{{shop}}.myshopify.com/admin/api/2023-07/products/7996390768855.json failed, reason: read ETIMEDOUT
Code: ETIMEDOUT
@mllemango - if this is going to be fixed, it would be great if Rest Resources weren't left out of the party!
FYI - I've managed to work around this and the rate-limiting concerns using a combination of ts-retry-promise and p-queue, which is working quite nicely (albeit with clumsy syntax).
I would love to see these libraries baked into the library itself so that rate limits and transient network failures aren't things that I have to concern myself with as a consumer of this library!
import { retry } from 'ts-retry-promise';
import PQueue from 'p-queue';

const shopifyRateLimitQueue = new PQueue({
  concurrency: 1, // We are allowed 2 on the basic Shopify plan, but we don't want to exhaust the limit in case another script is running
  interval: 1000,
  intervalCap: 2, // without intervalCap, p-queue's interval option has no effect
  throwOnTimeout: true,
  autoStart: true,
});

await retry(
  async () =>
    shopifyRateLimitQueue.add(async () => product_listing.saveAndUpdate()),
  {
    retries: 3,
    backoff: 'EXPONENTIAL',
    logger: console.warn,
  }
);
+1 to fix this. The current internal logic of the library is useless for all but the most basic of cases and leads developers into a false sense of trust in the library.
Bumping this...
There is no leaky bucket implemented in this library. Why even have the retry?
Hi folks 👋
Thank you for your patience on this.
This is an area we do want to improve for the libraries, and you are bringing up valid concerns. Transparently this is not something we will be tackling immediately.
If this is a feature that is a top priority for you please let us know, this will help us prioritize our future work.
We're labeling this issue as stale because there hasn't been any activity on it for 60 days. While the issue will stay open and we hope to resolve it, this helps us prioritize community requests.
You can add a comment to remove the label if it's still relevant, and we can re-evaluate it.
Overview/summary
Shopify API rate limiting is based on a leaky bucket. However, the library's implementation relies on a very naive 'retry' strategy that ignores the leaky bucket and does not queue requests, so in edge cases it can end up with serious problems, even with requests that will never go through. Two cases are described below. Here is the most important part of https://github.com/Shopify/shopify-node-api/blob/main/src/clients/http_client/http_client.ts#L128-L141
RestAPI
As the leaky bucket is based on the number of requests, this is an almost-OK implementation, but only 'almost'. Since there is no queuing, just a setTimeout, it is still possible that while a failed request A is waiting for its retry, another request B is made just before A's retry, and therefore A's retry fails again; in the edge case it may never go through. The solution is simple: there should be a queue of requests. When a request fails because of the rate limit, it is added to the end of the queue of waiting requests, and the library then makes the requests from the queue one by one.
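The proposed queue could be sketched like this. The names (RetryQueue, the `{ code: 429 }` error shape) are hypothetical and not the library's actual internals; the real fix would live inside http_client.ts:

```typescript
// Hypothetical sketch: failed rate-limited requests go to the BACK of a
// FIFO queue and the queue is drained one request at a time, so a retry
// can no longer be starved by newer requests racing past it.
type RequestTask = {
  run: () => Promise<unknown>;
  resolve: (value: unknown) => void;
  reject: (err: unknown) => void;
  attempts: number;
};

class RetryQueue {
  private queue: RequestTask[] = [];
  private draining = false;

  constructor(
    private readonly maxAttempts = 3,
    private readonly retryDelayMs = 1000,
  ) {}

  // Enqueue a request; the returned promise settles when the request
  // finally succeeds or exhausts its attempts.
  enqueue<T>(run: () => Promise<T>): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      this.queue.push({
        run,
        resolve: resolve as (value: unknown) => void,
        reject,
        attempts: 0,
      });
      void this.drain();
    });
  }

  private async drain(): Promise<void> {
    if (this.draining) return;
    this.draining = true;
    while (this.queue.length > 0) {
      const task = this.queue.shift()!;
      task.attempts += 1;
      try {
        task.resolve(await task.run());
      } catch (err) {
        const throttled = (err as { code?: number }).code === 429;
        if (throttled && task.attempts < this.maxAttempts) {
          // We are throttled, so everything waits, and the failed task
          // rejoins at the end of the line instead of retrying in place.
          await new Promise((r) => setTimeout(r, this.retryDelayMs));
          this.queue.push(task);
        } else {
          task.reject(err);
        }
      }
    }
    this.draining = false;
  }
}
```

Because the drain loop is strictly sequential, a waiting retry can never be overtaken by a request that arrived after it, which is exactly the starvation scenario described above.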
GraphQL
The actual implementation ignores the leaky bucket mechanism and, if I didn't miss anything, just retries after one second. The solution would be a queue as above, but this time based on the throttle status returned by the GraphQL API.
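For GraphQL, the wait time can be derived from that throttle status instead of a fixed one-second sleep. A sketch, assuming the fields Shopify documents under `extensions.cost.throttleStatus` (maximumAvailable, currentlyAvailable, restoreRate); the helper name itself is made up:

```typescript
// Shape of extensions.cost.throttleStatus from the GraphQL Admin API.
interface ThrottleStatus {
  maximumAvailable: number;
  currentlyAvailable: number;
  restoreRate: number; // cost points restored per second
}

// Milliseconds to wait before a query of `requestedCost` points can run,
// given the bucket state reported by the previous response.
function msUntilAffordable(requestedCost: number, status: ThrottleStatus): number {
  const deficit = requestedCost - status.currentlyAvailable;
  if (deficit <= 0) return 0; // enough points available right now
  return Math.ceil((deficit / status.restoreRate) * 1000);
}
```

A throttled client would sleep for `msUntilAffordable(...)` and then re-issue the query from the front of the queue, rather than blindly retrying after one second and possibly failing again.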