Azure / azure-functions-durable-js

JavaScript library for using the Durable Functions bindings
https://www.npmjs.com/package/durable-functions
MIT License

ERR_FR_MAX_BODY_LENGTH_EXCEEDED when passing 5MB file #360

Open mihaiLucian opened 2 years ago

mihaiLucian commented 2 years ago

Describe the bug: I want to build a durable function in TypeScript. The function takes a big JSON file (anywhere between 0 and 20 MB) and then does some processing on the data from that file.

I use Postman to send form-data with my file and start the orchestrator. The exception below is thrown:

Exception: Error [ERR_FR_MAX_BODY_LENGTH_EXCEEDED]: Request body larger than maxBodyLength limit
Stack: Error [ERR_FR_MAX_BODY_LENGTH_EXCEEDED]: Request body larger than maxBodyLength limit


codeHysteria28 commented 2 years ago

Hey, what are you actually doing with the file? Saving it to Azure Storage? If so, you will need to break the file into chunks: this error occurs when you try to write more than 4 megabytes (MB) of data to a file on Azure Storage in a single operation, or when you try to upload a file larger than 4 MB using a SAS URL (REST API) or the HTTPS PUT method.

Could you describe the whole scenario, please? Thank you.
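
For reference, a minimal sketch of such a chunked upload using the @azure/storage-blob SDK, which stages the payload as blocks of blockSize and commits them, so the single-write limit does not apply. The container name and use of the AzureWebJobsStorage connection string are assumptions:

```typescript
import { BlobServiceClient } from "@azure/storage-blob";

// Sketch only: uploads a large buffer as a block blob in 4 MB blocks.
// "uploads" and the AzureWebJobsStorage connection string are assumptions.
async function uploadLargeJson(buffer: Buffer, blobName: string): Promise<void> {
  const serviceClient = BlobServiceClient.fromConnectionString(
    process.env.AzureWebJobsStorage!
  );
  const blockBlob = serviceClient
    .getContainerClient("uploads")
    .getBlockBlobClient(blobName);

  // uploadData stages the buffer as blocks and commits the block list,
  // so payloads well beyond 4 MB are handled by the SDK.
  await blockBlob.uploadData(buffer, {
    blockSize: 4 * 1024 * 1024, // 4 MB blocks
    concurrency: 4,             // parallel block uploads
  });
}
```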

ghost commented 2 years ago

This issue has been automatically marked as stale because it has been marked as requiring author feedback but has not had any activity for 4 days. It will be closed if no further activity occurs within 3 days of this comment.

davidmrdavid commented 2 years ago

Removed label so this issue won't auto-close. This is a legitimate report, just low-priority given that large inputs are not "best practice" in DF

ankuratudemy commented 2 years ago

We are facing the same issue. It's sort of related to https://github.com/Azure/azure-functions-durable-js/issues/93

Our use case is to trigger the durable function with "type": "blobTrigger".

We have Excel uploads with thousands of rows. Can we have maxBodyLength: Infinity set along with maxContentLength: Infinity in this commit?

https://github.com/Azure/azure-functions-durable-js/commit/1dbfb2859d7c29245d82c19b1d66386b834c9f1e
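
For clarity, a hedged sketch of what that change would amount to: the library's internal HTTP client (assumed to be axios-based, per the linked commit) would be created with both limits lifted. This is illustrative, not the library's actual source:

```typescript
import axios from "axios";

// Sketch of the requested configuration, not the library's actual code:
// both axios limits lifted so large request/response bodies are not rejected.
const httpClient = axios.create({
  maxContentLength: Infinity, // response-body size cap
  maxBodyLength: Infinity,    // request-body cap (source of ERR_FR_MAX_BODY_LENGTH_EXCEEDED)
});
```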

ankuratudemy commented 2 years ago

> Removed label so this issue won't auto-close. This is a legitimate report, just low-priority given that large inputs are not "best practice" in DF

Hi, is there a workaround possible? Any suggestions on how we can pass, for example, big JSON files to be processed by an activity in a loop?

davidmrdavid commented 2 years ago

Hi @ankuratudemy: if you have this data in blob storage, you could just pass in a data identifier to your activities, and load the data in the activities themselves.

What matters is to ensure that large objects are not returned from activities, and also not used inside DF APIs (such as callActivity) within orchestrators.
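
A minimal sketch of that pattern, with the container name, activity name, and blob layout as assumptions (in the classic programming model the orchestrator and activity would live in separate function folders with their own function.json; they are shown together here for brevity). The orchestrator only ever sees the blob name, and the activity loads and processes the payload itself:

```typescript
import * as df from "durable-functions";
import { AzureFunction, Context } from "@azure/functions";
import { BlobServiceClient } from "@azure/storage-blob";

// Orchestrator: passes only a small identifier through callActivity.
export const orchestrator = df.orchestrator(function* (context) {
  const blobName = context.df.getInput() as string;
  const summary = yield context.df.callActivity("ProcessBlob", blobName);
  return summary; // keep return values small as well
});

// Activity ("ProcessBlob"): loads the large JSON from Blob storage by name.
export const processBlob: AzureFunction = async function (
  context: Context,
  blobName: string
) {
  const serviceClient = BlobServiceClient.fromConnectionString(
    process.env.AzureWebJobsStorage!
  );
  const blobClient = serviceClient
    .getContainerClient("uploads") // assumed container
    .getBlobClient(blobName);

  const buffer = await blobClient.downloadToBuffer();
  const data = JSON.parse(buffer.toString("utf-8"));

  // ...process the data here; return only a small summary.
  return { rows: Array.isArray(data) ? data.length : 0 };
};
```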

5jt commented 7 months ago

> Removed label so this issue won't auto-close. This is a legitimate report, just low-priority given that large inputs are not "best practice" in DF

In 2022 the OP intended to pass the file body straight to Orchestration. It is easy to understand that large values in DF calls are not best practice. But "large inputs" are part of the point of DF Apps, so a 5 MB (or 50 MB) POST to the DF Client seems quite reasonable.

Avoiding large inputs in the call to DF Orchestration, and working entirely through REST APIs, Workflow A would be as follows (a sketch of steps 2–4 appears after the list):

  1. Remote Client POSTs file to DF Client function through API
  2. DF Client writes to Blob storage, gets file ID
  3. DF Client passes file ID to DF Orchestration
  4. DF Client returns follow-up URLs as JSON
  5. DF Orchestration chews the file, writes results to Blob
  6. Remote client waits, downloads results, triggering Blob cleanup
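
A sketch of steps 2–4 in the DF Client (HTTP starter), with the container name, blob naming, orchestrator name ("ProcessFileOrchestrator"), and use of AzureWebJobsStorage as assumptions:

```typescript
import * as df from "durable-functions";
import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import { BlobServiceClient } from "@azure/storage-blob";

// DF Client: accepts the upload, writes it to Blob storage, passes only the
// blob name to the orchestration, and returns the standard follow-up URLs.
const httpStart: AzureFunction = async function (context: Context, req: HttpRequest) {
  const serviceClient = BlobServiceClient.fromConnectionString(
    process.env.AzureWebJobsStorage!
  );
  const containerClient = serviceClient.getContainerClient("uploads"); // assumed container
  await containerClient.createIfNotExists();

  // Step 2: write the body to Blob storage; the blob name doubles as the file ID.
  const blobName = `upload-${Date.now()}.json`;
  const body = req.rawBody ?? "";
  await containerClient
    .getBlockBlobClient(blobName)
    .upload(body, Buffer.byteLength(body));

  // Step 3: start the orchestration with only the file ID as input.
  const client = df.getClient(context);
  const instanceId = await client.startNew("ProcessFileOrchestrator", undefined, blobName);

  // Step 4: return the check-status / follow-up URLs as JSON.
  return client.createCheckStatusResponse(context.bindingData.req, instanceId);
};

export default httpStart;
```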

The workaround (Workflow B), beginning with an upload to Blob, replaces steps 1–2 with the following (a sketch of the SAS endpoint appears after the list):

  1. Remote client requests Shared Access Signature from API (new endpoint)
  2. Azure Function generates SAS and returns
  3. Remote client POSTs file to Blob with SAS, gets file ID
  4. Remote client passes file ID to DF Client through API
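
A sketch of the SAS-issuing endpoint (steps 1–2 of Workflow B), with the account variables, container, permissions, and expiry as assumptions; an account-key SAS is used here rather than a user-delegation SAS:

```typescript
import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import {
  BlobSASPermissions,
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
} from "@azure/storage-blob";

// Hypothetical endpoint: returns a short-lived, write-only SAS URL plus the
// file ID the remote client should later hand to the DF Client.
const issueSas: AzureFunction = async function (context: Context, req: HttpRequest) {
  const account = process.env.STORAGE_ACCOUNT_NAME!;
  const accountKey = process.env.STORAGE_ACCOUNT_KEY!;
  const credential = new StorageSharedKeyCredential(account, accountKey);

  const containerName = "uploads";              // assumed container
  const blobName = `upload-${Date.now()}.json`; // doubles as the file ID

  const sas = generateBlobSASQueryParameters(
    {
      containerName,
      blobName,
      permissions: BlobSASPermissions.parse("cw"),      // create + write only
      expiresOn: new Date(Date.now() + 15 * 60 * 1000), // 15-minute lifetime
    },
    credential
  ).toString();

  context.res = {
    body: {
      fileId: blobName,
      uploadUrl: `https://${account}.blob.core.windows.net/${containerName}/${blobName}?${sas}`,
    },
  };
};

export default issueSas;
```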

An extra endpoint, two more steps for the remote client, and the SAS leaves Azure. Workflow A looks the better design.

But still in 2024 the DF Client chokes on the POST, not the call to DF Orchestration. Is there an answer yet to the OP?