mihaiLucian opened this issue 2 years ago
Hey. What are you actually doing with the file? Saving it to Azure Storage? If so, then you will need to break the file into chunks: if you try to write more than 4 megabytes (MB) of data to a file on Azure Storage, or upload a file larger than 4 MB via a SAS URL (REST API) or an HTTPS PUT, the request will fail, so the writes have to be chunked.
Could you maybe describe the whole scenario, please? Thank you
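For reference, a minimal sketch of the chunked write suggested above, using block staging from @azure/storage-blob. The 4 MB chunk size mirrors the per-request limit the comment mentions; the function name and its arguments are illustrative, not anything from this repo:

```typescript
// Hedged sketch: chunked upload via block staging, honoring the 4 MB
// per-request limit mentioned above. BlockBlobClient comes from
// @azure/storage-blob.
import { BlockBlobClient } from "@azure/storage-blob";

const CHUNK = 4 * 1024 * 1024; // 4 MB per request

export async function uploadInChunks(client: BlockBlobClient, data: Buffer): Promise<void> {
  const blockIds: string[] = [];
  for (let offset = 0, i = 0; offset < data.length; offset += CHUNK, i++) {
    // Block IDs must be base64-encoded and equal in length across all blocks.
    const blockId = Buffer.from(String(i).padStart(6, "0")).toString("base64");
    const chunk = data.subarray(offset, offset + CHUNK);
    await client.stageBlock(blockId, chunk, chunk.length);
    blockIds.push(blockId);
  }
  // Commit the staged blocks into the final blob in a single metadata call.
  await client.commitBlockList(blockIds);
}
```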
This issue has been automatically marked as stale because it has been marked as requiring author feedback but has not had any activity for 4 days. It will be closed if no further activity occurs within 3 days of this comment.
Removed label so this issue won't auto-close. This is a legitimate report, just low-priority given that large inputs are not "best practice" in DF
We are facing the same issue. It's sort of related to https://github.com/Azure/azure-functions-durable-js/issues/93
Our use case is to trigger a durable function with a "type": "blobTrigger" binding.
We have Excel uploads with thousands of rows. Can we have maxBodyLength: Infinity set along with maxContentLength: Infinity in this commit?
https://github.com/Azure/azure-functions-durable-js/commit/1dbfb2859d7c29245d82c19b1d66386b834c9f1e
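To make the ask concrete, a hedged sketch of what that configuration looks like in plain axios. maxBodyLength and maxContentLength are standard axios options; the ERR_FR_ prefix in the OP's error comes from the follow-redirects library that axios uses, whose default body cap is around 10 MB. Whether durable-js actually wires these options through in the linked commit is the open question here:

```typescript
// Hedged sketch only: standard axios options, not durable-js internals.
import axios from "axios";

const client = axios.create({
  // Lift the outgoing request-body cap (follow-redirects defaults to ~10 MB,
  // which is what produces ERR_FR_MAX_BODY_LENGTH_EXCEEDED).
  maxBodyLength: Infinity,
  // Lift the response-body cap as well.
  maxContentLength: Infinity,
});
```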
Hi, is there a workaround possible? Any suggestions on how we can pass, for example, big JSON files to be processed by an activity in a loop?
Hi @ankuratudemy: if you have this data in blob storage, you could just pass in a data identifier to your activities, and load the data in the activities themselves.
What matters is to ensure that large objects are not returned from activities, and are also not used inside DF APIs (such as callActivity) within orchestrators.
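A minimal sketch of that pattern, assuming the durable-functions v2 programming model and @azure/storage-blob. The activity name, container name, and connection setting are placeholders, and the function.json wiring is omitted:

```typescript
import * as df from "durable-functions";
import { AzureFunction, Context } from "@azure/functions";
import { BlobServiceClient } from "@azure/storage-blob";

// Orchestrator: only the blob *name* (a small string) crosses the DF APIs.
const orchestrator = df.orchestrator(function* (context) {
  const blobName: string = context.df.getInput();
  return yield context.df.callActivity("ProcessBlobActivity", blobName);
});

// Activity: loads the large JSON from Blob Storage itself.
const processBlobActivity: AzureFunction = async (context: Context): Promise<number> => {
  const blobName: string = context.bindings.blobName; // activityTrigger binding (function.json not shown)
  const service = BlobServiceClient.fromConnectionString(
    process.env.AzureWebJobsStorage! // assumption: data lives in the Functions storage account
  );
  const buffer = await service
    .getContainerClient("uploads") // hypothetical container
    .getBlobClient(blobName)
    .downloadToBuffer();
  const rows = JSON.parse(buffer.toString("utf8"));
  return Array.isArray(rows) ? rows.length : 0; // keep the activity's *return* small too
};

export { orchestrator, processBlobActivity };
```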
In 2022 the OP intended to pass the file body straight to the Orchestration. It's easy to accept that large values in DF calls are not best practice. But "large inputs" are part of the point of DF apps, so a 5 MB (or 50 MB) POST to the DF Client seems quite reasonable.
Avoiding large inputs in the call to DF Orchestration, and working entirely through REST APIs, the workflow (A) would be:

1. The remote client POSTs the file to the DF Client (the HTTP starter).
2. The DF Client writes the body to Blob Storage.
3. The DF Client starts the orchestration with only the blob identifier as input.
4. Activities load the data from Blob Storage themselves.

The workaround (Workflow B), beginning with an upload to Blob, replaces steps (1–2) with:

1. The remote client requests an upload SAS URL from an extra endpoint.
2. The remote client uploads the file to Blob Storage via the SAS URL.
3. The remote client POSTs only the blob identifier to the DF Client.
An extra endpoint, two more steps for the remote client, and the SAS leaves Azure. Workflow A looks the better design.
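For concreteness, steps (1)–(3) of Workflow A as a starter function. This is a hedged sketch assuming the durable-functions v2 client API; "uploads" and "OrchestratorName" are placeholders, and multipart parsing is omitted (the raw body is stored as-is):

```typescript
import * as df from "durable-functions";
import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import { BlobServiceClient } from "@azure/storage-blob";
import { randomUUID } from "crypto";

const httpStart: AzureFunction = async (context: Context, req: HttpRequest) => {
  // (2) Persist the large body to Blob Storage before touching any DF API.
  const body: string = req.rawBody ?? JSON.stringify(req.body ?? {});
  const service = BlobServiceClient.fromConnectionString(process.env.AzureWebJobsStorage!);
  const blobName = `${randomUUID()}.json`;
  await service
    .getContainerClient("uploads")
    .getBlockBlobClient(blobName)
    .upload(body, Buffer.byteLength(body));

  // (3) Start the orchestration with only the small identifier as input.
  const client = df.getClient(context);
  const instanceId = await client.startNew("OrchestratorName", undefined, blobName);
  return client.createCheckStatusResponse(context.bindingData.req, instanceId);
};

export default httpStart;
```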
But still in 2024 the DF Client chokes on the POST, not the call to DF Orchestration. Is there an answer yet to the OP?
Describe the bug
I want to build a durable function in TypeScript. This function takes a large JSON file (anywhere from 0 to 20 MB) and then does some processing with the data from the JSON file.
I use Postman to send the file as form-data and start the orchestrator. The exception below is thrown:

Exception: Error [ERR_FR_MAX_BODY_LENGTH_EXCEEDED]: Request body larger than maxBodyLength limit
Stack: Error [ERR_FR_MAX_BODY_LENGTH_EXCEEDED]: Request body larger than maxBodyLength limit