Shopify / shopify-app-js

MIT License

leaky bucket rate limiting does not keep order of calls #779

Open bogdanrn opened 3 years ago

bogdanrn commented 3 years ago

Overview/summary

Shopify's API rate limiting is based on a leaky bucket algorithm. However, the library's implementation relies on a very naive retry strategy that ignores the leaky bucket and does no request queuing, so in edge cases it can run into serious problems, including requests that will never be made. Two cases are described below. Here is the most important part: https://github.com/Shopify/shopify-node-api/blob/main/src/clients/http_client/http_client.ts#L128-L141
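For illustration only, here is a minimal sketch of what order-preserving, bucket-aware scheduling could look like. `LeakyBucketQueue`, `capacity`, and `leakRatePerSec` are hypothetical names, not part of this library; the point is that a FIFO queue with explicit bucket accounting keeps call order, whereas independent per-request retries do not.

```typescript
// Hypothetical sketch, not part of @shopify/shopify-api.
// Requests enter a FIFO queue; only the head may run, so order is preserved.
class LeakyBucketQueue {
  private queue: Array<() => void> = [];
  private level = 0;

  constructor(
    private capacity: number, // bucket size, e.g. 40 for the REST bucket
    private leakRatePerSec: number, // e.g. 2 requests/sec leak out
  ) {
    // Drain the bucket at the leak rate and start waiting work in order.
    const timer: any = setInterval(() => {
      this.level = Math.max(0, this.level - this.leakRatePerSec);
      this.drain();
    }, 1000);
    timer.unref?.(); // don't keep the process alive just for the timer (Node)
  }

  enqueue<T>(fn: () => Promise<T>): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      this.queue.push(() => fn().then(resolve, reject));
      this.drain();
    });
  }

  private drain() {
    // FIFO: requests start strictly in arrival order, never skipping ahead.
    while (this.queue.length > 0 && this.level < this.capacity) {
      this.level += 1;
      this.queue.shift()!();
    }
  }
}
```

The difference from the current retry loop is that waiting requests stay queued in arrival order instead of each independently sleeping and racing to retry.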

Motivation

What inspired this enhancement?

...


pociej commented 3 years ago

Hi, thanks @bogdanrn for posting this. This is the issue I wanted to post, but for some reason it wasn't possible, so I asked my friend @bogdan to post it. By the way, can someone explain whether there is some limit on issues per user, or am I blacklisted for some reason?

mllemango commented 3 years ago

Hi @bogdanrn, @pociej thanks for posting about this! These are valid points, we are currently looking into it. @pociej, I'm not quite sure why you wouldn't be allowed to post 🤔 we definitely don't have a limit or blacklist. Let me know if this is still an issue for you!

pociej commented 3 years ago

Hi @mllemango any update on this?

mgara commented 2 years ago

Thanks Shopify for providing this SDK. It's really a life changer for someone like me who built the implementations by hand. Now, are there any updates on this matter? Thanks!

pociej commented 2 years ago

Hi @mllemango any update?

mllemango commented 2 years ago

Hey everyone, we haven't had a chance to get back to this one yet. We'll keep this issue open for others to report the same problem and that will help us prioritize.

bluedge commented 2 years ago

Any update on this?

mariusa commented 2 years ago

same problem

Volcantis commented 2 years ago

Hi, could we please prioritise this issue, it is very serious & might be affecting a lot of people using the library, thanks.

github-actions[bot] commented 1 year ago

This issue is stale because it has been open for 90 days with no activity. It will be closed if no further action occurs in 14 days.

mariusa commented 1 year ago

not stale

github-actions[bot] commented 1 year ago

This issue is stale because it has been open for 90 days with no activity. It will be closed if no further action occurs in 14 days.

Zelfapp commented 1 year ago

not stale

github-actions[bot] commented 1 year ago

This issue is stale because it has been open for 90 days with no activity. It will be closed if no further action occurs in 14 days.

mgara commented 1 year ago

This should be implemented using a shared queue across all running Shopify clients that use the same key (Redis, for example).
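To sketch that suggestion: the accounting below is roughly what a shared, per-shop counter would compute; in production the store would live in Redis (e.g. an atomic Lua script), but here an in-memory Map stands in so the sketch is self-contained. `tryAcquire`, `shopKey`, and the field names are assumptions, not a real API.

```typescript
// Sketch of shared leaky-bucket accounting keyed by shop.
// All clients for the same shopKey share one counter, so their combined
// request rate is bounded; an in-memory Map stands in for Redis here.
type Store = Map<string, { level: number; updatedAtMs: number }>;

function tryAcquire(
  store: Store,
  shopKey: string,
  capacity: number, // bucket size
  leakPerSec: number, // leak rate
  nowMs = Date.now(),
): boolean {
  const entry = store.get(shopKey) ?? { level: 0, updatedAtMs: nowMs };
  // The bucket leaks continuously since the last update.
  const leaked = ((nowMs - entry.updatedAtMs) / 1000) * leakPerSec;
  const level = Math.max(0, entry.level - leaked);
  if (level + 1 > capacity) {
    store.set(shopKey, { level, updatedAtMs: nowMs });
    return false; // bucket full: caller should requeue, preserving FIFO order
  }
  store.set(shopKey, { level: level + 1, updatedAtMs: nowMs });
  return true;
}
```

In Redis the read-compute-write above would need to run atomically (one Lua script or a transaction), since the whole point is that multiple independent processes consult the same bucket.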

github-actions[bot] commented 1 year ago

This issue is stale because it has been open for 90 days with no activity. It will be closed if no further action occurs in 14 days.

pociej commented 1 year ago

not stale

deldrid1 commented 1 year ago

In addition to this problem, it's also worth pointing out that Rest Resources don't support retries via the tries parameter, whereas all the clients do.

Rest Resources are FAR superior for DX, but a simple timeout is enough to kill a long-running, heavily rate-limited batch process...

FetchError: request to https://{{shop}}.myshopify.com/admin/api/2023-07/products/7996390768855.json failed, reason: read ETIMEDOUT
Code: ETIMEDOUT

@mllemango - if this is going to be fixed, it would be great if Rest Resources weren't left out of the party!

deldrid1 commented 1 year ago

FYI - I've managed to work around this and the rate limiting concerns using a combination of ts-retry-promise and p-queue, which is working quite nicely (albeit with clumsy syntax).

I would love to see these libraries baked into the library itself so that rate limits and transient network failures aren't things that I have to concern myself with as a consumer of this library!

import { retry } from 'ts-retry-promise';
import PQueue from 'p-queue';

const shopifyRateLimitQueue = new PQueue({
  concurrency: 1, // We are allowed 2 on the basic Shopify plan, but we don't want to exhaust the limit in case another script is running
  interval: 1000,
  intervalCap: 1, // p-queue's `interval` only takes effect together with `intervalCap`
  throwOnTimeout: true,
  autoStart: true,
});

await retry(
  async () =>
    shopifyRateLimitQueue.add(async () =>
      product_listing.saveAndUpdate()
    ),
  {
    retries: 3,
    backoff: 'EXPONENTIAL',
    logger: console.warn,
  }
);

github-actions[bot] commented 12 months ago

This issue is stale because it has been open for 90 days with no activity. It will be closed if no further action occurs in 14 days.

deldrid1 commented 12 months ago

+1 to fix this, as the current internal logic of the library is useless for all but the most basic cases and lulls developers into a false sense of trust in the library.

github-actions[bot] commented 10 months ago

This issue is stale because it has been open for 90 days with no activity. It will be closed if no further action occurs in 14 days.

deldrid1 commented 10 months ago

Bumping this...

github-actions[bot] commented 8 months ago

This issue is stale because it has been open for 90 days with no activity. It will be closed if no further action occurs in 14 days.

ChristopherCapito commented 5 months ago

There is no leaky bucket implemented in this library. Why even have the retry?

lizkenyon commented 4 months ago

Hi folks 👋

Thank you for your patience on this.

This is an area we do want to improve in the libraries, and you are bringing up valid concerns. Transparently, this is not something we will be tackling immediately.

If this feature is a top priority for you, please let us know; that will help us prioritize our future work.

github-actions[bot] commented 2 months ago

We're labeling this issue as stale because there hasn't been any activity on it for 60 days. While the issue will stay open and we hope to resolve it, this helps us prioritize community requests.

You can add a comment to remove the label if it's still relevant, and we can re-evaluate it.