amzn / selling-partner-api-models

This repository contains OpenAPI models for developers to use when developing software to call Selling Partner APIs.

[SP-API][FEEDS API | LISTING ITEMS API][JSON FEED][RANT?][HELP?][SOS?] How will we be able to update our inventory and stock in 2025? Is this a joke? #572

Closed: maximehinnekens closed this issue 3 months ago

maximehinnekens commented 5 months ago

Now that the flat file and XML feeds are being removed, how will we ever be able to manage our inventory and pricing? We have millions of listings, and even while following the best practices we would literally need a week to run just one simple inventory or pricing operation with the 'bulk' JSON listings feeds. This is the most basic and crucial operation there is for ANY marketplace API I have seen so far.
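For context, a single update in the JSON_LISTINGS_FEED format looks roughly like the sketch below (the sellerId, SKU, and productType are placeholders, and the exact attribute shapes depend on the product type schema):

```python
import json

# Minimal JSON_LISTINGS_FEED document with a single PATCH message.
# sellerId, sku, and productType are placeholders, not real values.
feed_document = {
    "header": {
        "sellerId": "A1EXAMPLESELLER",
        "version": "2.0",
        "issueLocale": "en_US",
    },
    "messages": [
        {
            "messageId": 1,
            "sku": "MY-SKU-0001",
            "operationType": "PATCH",
            "productType": "PRODUCT",
            "patches": [
                {
                    "op": "replace",
                    "path": "/attributes/fulfillment_availability",
                    "value": [
                        {"fulfillment_channel_code": "DEFAULT", "quantity": 42}
                    ],
                }
            ],
        }
    ],
}

print(json.dumps(feed_document, indent=2))
```

Each feed document may contain at most 10,000 such messages, which is the cap at the center of the complaints below.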

These are my main concerns (sorry for the silly jokes; I just think this has been a truly unacceptable and unresolved matter for an extended time now, and the deprecation announcement has made me completely lose my cool/mind):

  1. The rate limit header of the createFeed operation returns a rate limit that does not apply to the JSON feed being submitted; the real limit is hidden somewhere in the documentation, in a completely different section from that of the Feeds API. How can this still be the case after all these months? Sometimes reading the documentation feels like playing a game of Pictionary or Cluedo (and I have yet to see one post that disagrees with this).
  2. The rate limit is hardcoded at 1 feed per 5 minutes (shared across all the EU countries we have on Amazon), and a feed can contain a maximum of 10,000 messages; these are then queued for processing at a rate of 5 updates per second. A browser macro in the seller dashboard would be faster. Was there any proper assessment of these rate limits at all? How could one-size-fits-all be a good idea for an operation whose low throughput, which YOU impose on us, can literally get us punished and leave customers dissatisfied?
  3. We have quite a high dynamic rate limit for getFeaturedOfferExpectedPriceBatch (and Orders), but what am I supposed to do with that when it tells me tens of thousands of changes should happen every minute (excluding the offer-change notifications, which are even crazier) if I can only update listings at 5 per second? I think I could print the updates faster on an inkjet printer and send them by post.
  4. "The FLAT file feeds can still be uploaded via the seller dashboard after the deletion from the API". Oh great, so if the JSON feeds don't get new rate limits, I will either need to get a physical person to do a job which used to be automated because time and automation for us seems to be going backwards instead of forward, or we will need to hack our way around it with somekind of clickbot like this is RuneScape botting in 2005?
  5. In case you want to make this even more ridiculous, this is the support case I opened:

Me:

Hi,

We have one marketplace account active in 8 countries. We currently submit through the FLAT file inventory feeds, and sometimes JSON feeds when we have a small batch of updates. Stock is managed with one feed for all countries, as the SKU is the shared identifier across the stores. However, because we reprice using the "Get Featured Offer Price" endpoint, we have a different price (and shipping template) for each country for one listing. With FLAT files this is somewhat manageable, as we submit large files and only submit what has actually changed.

I have seen the deprecation announcement for these FLAT file feeds; this has an enormous impact on us. In total we have around 4 million listings with at least daily updates to price and handling time. Stock is around 500-700k SKUs that need to be updated at least once daily. We have a very high dynamic rate limit for the featured offer price endpoint, which is useless when it will take days to update even a portion of our listings just once through the JSON feeds (we have exchange rate changes once a day, which can push quite a few updates). As of now, we only use the featured offer expected price for the Germany store, to keep the ratio of system load to sales increase optimal.

Is there a possibility to increase our rate limits for the JSON listings API so we can update our prices at an acceptable rate?

Amazon Response:

Greetings from Amazon,

My name is Stacy, from the Selling Partner API Developer Support team.

Thank you for reaching out about increasing the current limit set for the Feeds API. The API operation you're looking to increase is part of our Dynamic Usage Plan limits. The limit set on this operation starts at the default amount that you will see in the documentation for the operation in question, and can increase or reset depending on the business needs of your services. Note, this does not count actual API requests being made. These rates cannot be adjusted manually by our API teams, and the rate limit will only increase if they meet the requirements for the Dynamic API Usage Plans.

Refer to the blog 'Strategies to optimize rate limits for your application workloads': https://developer-docs.amazon.com/sp-api-blog/docs/strategies-to-optimize-rate-limits-for-your-application-workloads

Try making batch calls instead of one-by-one API operations by following this blog 'Reducing API Calls with Batch Functionality': https://developer-docs.amazon.com/sp-api-blog/docs/reducing-api-calls-with-batch-functionality

Consider enabling the Notifications API: https://developer-docs.amazon.com/sp-api-blog/docs/event-driven-architecture-in-selling-partner-api

Watch our video on optimizing SP-API call patterns: https://youtu.be/vhLt9stnYYY

To learn more about SP-API visit: https://developer.amazonservices.com/

To watch demos, tutorials, and past webinars, visit our SP-API Developer University YouTube channel: https://www.youtube.com/@amazon-sp-api/

Please let us know how we did.

I sent them this reply, but I don't have high hopes for the response:

I would like to refer to this: https://developer-docs.amazon.com/sp-api/docs/building-listings-management-workflows-guide#should-i-submit-in-bulk-using-the-json_listings_feed-or-individually-with-the-listings-items-api

This documentation says that even though the x-amzn-RateLimit-Limit header reports 0.0083 for us, the rate limit is hardcoded at 1 JSON feed per 5 minutes (a limit shared across all our stores), and 1 feed can only contain 10,000 messages. I can actually include an ongoing example. I decided to wipe all our listings that have been out of stock for a while; there are around 1.7 million listings to delete. I therefore created 171 JSON feeds, of which I have been able to submit (not necessarily finished, just submitted) approximately 60% after almost 48 hours. I think about 30% of the JSON feeds are 'done'. That's for ONLY ONE operation on 1/3 of our listings, and it will take almost a WEEK to execute.

When the flat files are removed and the rate limits are still not dynamic, how will we do anything? What are our options here? I don't see how any of the currently non-deprecated endpoints can be sufficient for large volumes of listings like ours. We already follow all best practices and our updates are as lean as possible; there just seems to be a one-size-fits-all limit, and that makes no sense.

Thanks for following up on this.

  1. So they claim this is part of the dynamic plan, which it is not. I tested it again to make sure, and I can confirm that the rate limit header shows 0.0083 (about 121 seconds for a new token in the bucket), but the call fails with a 429 when submitting a JSON feed within 5 minutes of the previous one (see the probe sketch after this list). And beyond that, the enqueued processing rate is the bottleneck anyhow.
  2. The support rep claims the limits can't be overridden by the API team. Then WHY does the documentation mention, something like 5 times, that if the throughput is not enough for any endpoint, an override can be requested through a support case with developer support?
  3. I refer to the ongoing deletion as an example; in most cases the speed of that won't be so critical. However, about half a year ago I was called up by the escalation team at Amazon to remove all German books, for some reason with a very strict deadline. I had to stay up for 24 hours following up on the JSON feed submissions to delete them in time, and I made it only just before the deadline. What an actual nightmare that was.

To any Amazon developer reading this: please fix this, as it impacts our business as well as yours while affecting your own end customers the most. The impact is ridiculously high and no one seems to notice or take it seriously yet.

maximehinnekens commented 5 months ago

@chapmanjw

johnkw commented 5 months ago

This is a duplicate of bug #547. It would be good to consolidate these rate issues in that original bug.

maximehinnekens commented 5 months ago

This is a duplicate of bug #547. It would be good to consolidate these rate issues in that original bug.

Your issue is just about the mapping to JSON feeds, though; I don't see any problems with that at all. The JSON feeds have existed and been encouraged for a while now, and we already use them; they are indeed better, just too slow. The rate limit is the issue I address here, which does not seem to be the question in your report, hence not a duplicate IMO.

johnkw commented 5 months ago

Oh, perhaps related but not duplicate then. Are you saying that POST_FLAT_FILE_LISTINGS_DATA didn't have the 10,000 items per submission limit?

maximehinnekens commented 5 months ago

Oh, perhaps related but not duplicate then. Are you saying that POST_FLAT_FILE_LISTINGS_DATA didn't have the 10,000 items per submission limit?

It only has a size limit in KB; you have to really play around with how large you make your feeds and find the sweet spot. I'm no fan of these old TSV feeds, and they are sometimes a nightmare for getting updates through in the correct sequence, but they process faster. The 10k cap is just annoying, and the submission rate limit is a big question mark both documentation- and organization-wise. It seems like it was hardcoded in, something like: if (feedType == JSON) -> apply rate limit X but still return header Y. It should at least be that of the other feeds.

The big issue is the enqueued submission/processing of these JSON feeds, which only process at 5 messages per second; that's insanely slow.

maximehinnekens commented 5 months ago

@johnkw I'll give an example to show how slow it actually is with JSON feeds:

We have approximately 500k listings per country, 8 countries in total. Each country has its own price, handling time, and shipping template for a listing, so any small update (except stock) means a maximum of 4 million listings to update (sometimes 5-6 million depending on the current inventory).

So let's say we decide to change our margins, which translates to a fixed 0.10 euro price increase for each listing:
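Back-of-the-envelope, using only the limits cited in this thread (1 feed per 5 minutes, 10,000 messages per feed, processing at 5 messages per second):

```python
listings = 4_000_000          # one small update per listing
per_feed = 10_000             # max messages per JSON_LISTINGS_FEED
feeds = listings // per_feed  # 400 feeds

submit_hours = feeds * 5 / 60          # 1 feed per 5 minutes -> ~33 hours just to submit
process_days = listings / 5 / 86_400   # 5 messages/second -> ~9.3 days to process

print(f"{feeds} feeds, ~{submit_hours:.0f} h to submit, ~{process_days:.1f} days to process")
```

So a single fleet-wide price change takes on the order of 9 days to work through the queue.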

During that time I won't be able to update anything else, besides doing full patches or deleting listings.

🥲

supoman-service commented 4 months ago

Hello, is it possible to create product information if your listing uses the JSON format? The dynamically linked attributes in the jsonSchema are a nightmare to handle.

github-actions[bot] commented 3 months ago

This is a very old issue that is probably not getting as much attention as it deserves. We encourage you to check if this is still an issue after the latest release and if you find that this is still a problem, please feel free to open a new issue and make a reference to this one.

chapmanjw commented 3 months ago

@maximehinnekens (and everyone on this thread), with your migration to the JSON-based feeds and Listings Items APIs, please contact developer support to discuss your specific throughput concerns as they relate to high-volume changes (particularly pricing and inventory). We are evaluating those on a case-by-case basis to ensure effective resource utilization and to prevent unnecessary backlogs of updates that have commonly plagued high-volume feed submissions in the past.

A common pattern we see from pricing and inventory workflows is the re-assertion of the same data thousands of times a day across thousands or millions of items for a given seller. As we need to process each of these submissions in the order they were received and to assert what was given, this wasteful pattern accounts for the majority of our traffic as well as the majority of the backlogs we see in feeds processing. In these scenarios, it can take hours (and in some egregious cases days) for us to process these updates in the order they were received, when in the end they have no material change to the data on the listing. This provides a poor experience for selling partners when they do have a material change to apply that gets stuck behind hundreds or thousands of submissions for the same SKU sitting in front of it waiting to process.

The standard limits on the JSON_LISTINGS_FEED and Listings Items APIs sufficiently cover the vast majority of submission volumes. However, as we evaluate high-volume use cases on a case-by-case basis, we will take into consideration whether the use cases contribute to the overall wastefulness of re-asserting existing data or whether they actually submit material changes at high volume. It simply does not benefit sellers for developers to continually reassert data that results in no material change.

If you do have a high-volume submission use-case that does not fit within the standard limits on the JSON_LISTINGS_FEED and Listings Items API limits, please do contact developer support. We absolutely want to work with you on right sizing the limits for valid use-cases that are not the reassertion of data resulting in no material changes.
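A minimal sketch of the client-side deduplication being asked for here: keep a fingerprint of the last state asserted per SKU and only emit feed messages for SKUs whose data actually changed. The cache layer and message shape are illustrative assumptions, not an official pattern:

```python
import hashlib
import json

def fingerprint(patches: list) -> str:
    """Stable hash of the patches we intend to assert for a SKU."""
    return hashlib.sha256(json.dumps(patches, sort_keys=True).encode()).hexdigest()

def materially_changed(desired: dict[str, list], last_sent: dict[str, str]) -> list[dict]:
    """Build feed messages only for SKUs whose patches differ from the
    last submitted state, updating the cache in place."""
    messages = []
    for sku, patches in desired.items():
        digest = fingerprint(patches)
        if last_sent.get(sku) == digest:
            continue  # identical to what we already asserted: skip, don't re-assert
        last_sent[sku] = digest
        messages.append({
            "messageId": len(messages) + 1,
            "sku": sku,
            "operationType": "PATCH",
            "productType": "PRODUCT",  # placeholder product type
            "patches": patches,
        })
    return messages
```

Persisting `last_sent` between runs (in a database or key-value store) is what turns this from a per-run filter into the "only submit material changes" behaviour described above.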

franclin commented 3 months ago

Hi @maximehinnekens, @supoman-service, on a minor note: how did you handle the subscriptions across all your accounts, given that only one subscription can be registered against an AWS account? Since you have stores in 8 countries, did you create an AWS account for each store?

To handle the JSON_LISTINGS_FEED effectively, Amazon recommends an event-driven architecture, and in this case we should subscribe to the FEED_PROCESSING_FINISHED notification.

Thanks in advance for shedding some light on this.
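For reference, setting up the subscription franclin mentions looks roughly like the sketch below. Token handling and request signing are elided, the SQS ARN is a placeholder, and note that createDestination is a grantless operation requiring its own token type:

```python
import requests

# Notifications API endpoints for the EU region; credentials are placeholders.
BASE = "https://sellingpartnerapi-eu.amazon.com"
GRANTLESS = {"x-amz-access-token": "<grantless LWA token>"}
SELLER = {"x-amz-access-token": "<seller-authorized LWA token>"}

# 1) Register an SQS queue as a destination (one-off, per AWS account).
dest = requests.post(
    f"{BASE}/notifications/v1/destinations",
    headers=GRANTLESS,
    json={
        "name": "feed-events",
        "resourceSpecification": {
            "sqs": {"arn": "arn:aws:sqs:eu-west-1:123456789012:feed-events"}
        },
    },
).json()["payload"]

# 2) Subscribe this selling partner to feed-processing events.
sub = requests.post(
    f"{BASE}/notifications/v1/subscriptions/FEED_PROCESSING_FINISHED",
    headers=SELLER,
    json={"payloadVersion": "1.0", "destinationId": dest["destinationId"]},
).json()["payload"]

print("subscriptionId:", sub["subscriptionId"])
```

Polling getFeed then becomes unnecessary: the queue receives a message when each feed finishes processing, which matters when hundreds of feeds are in flight as described earlier in this thread.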