Yes, @fabiocav and @paulbatum and I have been discussing this issue.
Can you please publish what the current limit is somewhere while you work out what it should be?
Your Azure Functions CLI starts erroring out at 65536 bytes, but regular Azure handles files much larger than that (I've tested up to 5 MB). Other posts here suggest the limit is around 15 MB, but there does not appear to be an official, published number.
Meanwhile, if you Google "AWS Lambda Limits", the summary tells you the limits and the first result has all the details you'd need to know: http://docs.aws.amazon.com/lambda/latest/dg/limits.html
I almost had a heart attack when the error message back from the Azure Functions CLI indicated there was a limit at 65536. One of the reasons I'm moving my stuff from Lambda to Azure Functions is that you appeared to have a larger file size limit. I was glad when the "REAL" limit turned out to be higher, but this is also a real pain. Local dev is a ton faster than pushing to Azure every time, especially with your uber-slow file system. This is yet another reason why I can't use the CLI for local dev: it doesn't mimic the real thing.
I needed to be able to process files up to 10 MB, and storing them in blob storage and pulling them back was kind of excessive and ran into its own data retention problems.
The Azure Functions runtime (in the cloud) currently has a 15 MB limit on the request payload size. We will likely be increasing that limit to at least match Logic Apps (which is currently at 50 MB). We'll keep this issue updated with any new information.
@fabiocav Is the limit the same on the response as it is the request?
Sorry, I'm getting a 502 error message on something and I'm trying to figure out what might be going on.
Hi @fabiocav, what advice can you offer me to circumvent this when using the VS 2015 tooling? Every so often I hit this limit, with this error message, when using a POST httpTrigger:
System.ServiceModel: The maximum message size quota for incoming messages (65536) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element.
Warmest regards,
Garrard.
@garrardkitchen the next CLI release will match the runtime limit, which was just bumped up to 100MB. You can see more information about the change here.
Hi @fabiocav, I appreciate the confirmation. I'm assuming there's no date yet for the next CLI release?
No concrete ETA, but it should be happening soon. @lindydonna might be able to share more information.
@garrardkitchen We usually do the CLI releases at the same time as the portal release, but we're a little behind in this case. You can manually build the CLI as a workaround, but we should have a new release in a week or so.
Thank you @lindydonna @fabiocav .
Any update on this? I'm still hitting the limit with a mere 20 MB file when testing locally.
I'm not aware of anything outstanding here. I think the issue was just not closed appropriately. Have you made sure you are running the latest bits?
@CKeene78 Have you tried it on the live site? It's been my experience that local Azure Functions has a MUCH lower limit than the real thing. It's frustrating that it's not consistent, but last time I checked (about a month ago) it was still different.
I believe this ticket is specific to the version actually used by Azure Functions versus the CLI.
@paulbatum Confirmed latest bits.
@securityvoid Well, it works on the live site! I did notice that the live site function is using the v1 runtime, while my local client is using v2 (due to other projects). Unsure if that is part of the problem.
@CKeene78 You'll definitely run into issues if your local tools are out of sync with the environment you're deploying to. You might want to have multiple versions of the core tools installed so you can easily switch between them (I am not an expert on this, but I would guess that you can't have multiple versions installed globally, as I don't think NPM supports this).
@fabiocav Just a stupid question, but is there any way to manually tweak the maximum request length to a size greater than 100 MB?
A client of mine is posting CSV files that may exceed this limit. At the moment I have just told him to split the large files up into multiple files, but it would be easier for us to receive each file as a whole.
I have tried to Google my way around this, and the Azure Functions documentation from Microsoft ("functions-bindings-http-webhook") states that:
The HTTP request length is limited to 100 MB (104,857,600 bytes), and the URL length is limited to 4 KB (4,096 bytes). These limits are specified by the httpRuntime element of the runtime's Web.config file.
Yet I am unable to actually locate the Web.config within my solution, so I expect this is not really applicable. Or am I just totally off here?
@two2tee You're not totally off. In the case of Functions you can't configure this (that's what this issue tracks). Our general recommendation is to switch to a flow where large files are uploaded to blob storage and then processed by your functions.
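For illustration, a minimal sketch of the processing side of that flow, assuming a Functions v2 C# class library and a hypothetical `uploads` container (names are placeholders, not part of this thread):

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessUpload
{
    [FunctionName("ProcessUpload")]
    public static void Run(
        [BlobTrigger("uploads/{name}")] Stream uploadedFile, // "uploads" is a placeholder container
        string name,
        ILogger log)
    {
        // The large file arrives from storage, not through an HTTP request,
        // so the HTTP request size limit never comes into play.
        log.LogInformation($"Processing blob '{name}' ({uploadedFile.Length} bytes)");
        // ... parse / transform the file here ...
    }
}
```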
Hi, as stated in our Azure ticket 119021324001042, the approach of moving to a blob storage upload followed by processing is not something we can do. Is there an ETA on either this feature or raising the limit beyond 100 MB? Thanks.
@juanm-pereyra can you elaborate on why the client can't upload to blob storage using a SAS token? In both cases it's an HTTP request. All I can really suggest is that if you absolutely must have an HTTP endpoint that can accept requests larger than our current limit, you'll need to use something else to implement that endpoint (such as ASP.NET or Node.js), or deploy your own custom build of Functions with a modified web.config as a private site extension (it's not possible to do this in the Consumption plan).
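For anyone going the private-site-extension route, the relevant knob is the standard ASP.NET httpRuntime element mentioned in the docs quoted above. A hedged illustration of what the modified Web.config section could look like; the values simply mirror the documented 100 MB / 4 KB limits, and the actual shipped file may differ:

```xml
<system.web>
  <!-- maxRequestLength is in KB (102400 KB = 100 MB); maxUrlLength is in characters -->
  <httpRuntime maxRequestLength="102400" maxUrlLength="4096" />
</system.web>
```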
Hi Paul, sure: we're implementing JWT authentication with claims pertinent to what can be done with the file, so in short we can't separate the request context from the file processing without doing something hacky.
Sounds like we're going to have to plan to move out of Azure Functions then, at least for the HTTP triggers (which, to tell you the truth, is kind of sad, having to ditch the serverless approach of Functions because of a platform limitation).
Thanks.
@juanm-pereyra Have you considered the following approach:
- create an HTTP endpoint that implements your JWT authentication and returns a blob SAS when claim validation is successful
- the client uses the blob SAS to upload to storage
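A rough sketch of what that first bullet could look like, assuming a C# class-library HTTP trigger, the WindowsAzure.Storage SDK, and a hypothetical ValidateJwt helper standing in for the application's existing JWT validation (the container name and claim handling are placeholders):

```csharp
using System;
using System.Net;
using System.Net.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class IssueUploadSas
{
    [FunctionName("IssueUploadSas")]
    public static HttpResponseMessage Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req)
    {
        // Hypothetical helper: validates the bearer token and returns the caller's
        // identity, or null if the claims don't allow an upload.
        string userId = ValidateJwt(req.Headers.Authorization?.Parameter);
        if (userId == null)
            return req.CreateResponse(HttpStatusCode.Unauthorized);

        // Issue a short-lived, write-only SAS for a single blob the caller may upload to.
        var account = CloudStorageAccount.Parse(
            Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
        var blob = account.CreateCloudBlobClient()
            .GetContainerReference("uploads")                    // placeholder container
            .GetBlockBlobReference($"{userId}/{Guid.NewGuid()}");
        string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Create | SharedAccessBlobPermissions.Write,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
        });

        return req.CreateResponse(HttpStatusCode.OK, new { uploadUrl = blob.Uri + sas });
    }

    // Application-specific JWT validation goes here (placeholder).
    private static string ValidateJwt(string token) => null;
}
```

The client would then PUT the file directly to the returned uploadUrl, and a blob-triggered (or Event Grid-triggered, as discussed later in this thread) function would pick it up for processing.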
Hi Paul,
Yes, but for many of our clients this is the only feature they use, so the flow you describe is not acceptable at all. Plus, all of the automated application flows we already have would need to be modified, lots of them on third parties. On a side note, I suppose one of the purposes of serverless cloud platforms is to hide the complexities of scaling and server performance from the clients, and while I agree some boundaries and thresholds have to be set, I think 100 MB of POST size is a really low one, considering how many apps would ever even get near that limit.
@juanm-pereyra Just a thought, would an HTTP-triggered Azure Durable Function + Event Grid trigger work for this scenario?
- Hit the durable endpoint
- The durable function requests and receives the JWT authentication
- The durable function takes the JWT and requests the SAS authentication
- The durable function returns the SAS token to the client
- The client uses the SAS token to upload their file to blob storage
- An Event Grid trigger (on new blob file detected) fires and processes the file.
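A minimal sketch of what that last step could look like, assuming the Event Grid extension for Functions v2 and a blob input binding on the event's data.url (all names here are illustrative, not confirmed by this thread):

```csharp
using System.IO;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;

public static class OnBlobUploaded
{
    // Subscribed to "Blob Created" events on the storage account; the blob is bound
    // via its URL, so no oversized HTTP request ever reaches the function.
    [FunctionName("OnBlobUploaded")]
    public static void Run(
        [EventGridTrigger] EventGridEvent eventGridEvent,
        [Blob("{data.url}", FileAccess.Read)] Stream uploadedBlob,
        ILogger log)
    {
        log.LogInformation($"Blob created: {eventGridEvent.Subject} ({uploadedBlob.Length} bytes)");
        // ... process the uploaded file here ...
    }
}
```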
Hi,
That suggestion would follow the same flow I described before, which is not acceptable.
EDIT: I think I initially misunderstood what you meant; reading it a second time, I realized that you no longer return the JWT and then ask for the SAS. But that would still mean that, instead of hitting our Login service, clients hit a second service that obtains the JWT (with their login credentials), validates what the client wants to do using that JWT, and then gets the SAS for it... Again, on first impression, the separation of responsibilities between services is quite compromised: you're hitting an endpoint with credentials that only proxies them to another Login service, and whose true purpose is validating the claims of a request against a file that isn't even included in said request, just to generate a SAS token. All of this to circumvent a platform limitation. I'd actually prefer moving off the platform if there's no other way.
Got it. For context, we generally recommend these approaches because they end up being:
Of course, I appreciate that these approaches have an impact on the client, and that can be problematic if you're trying to use Functions as a drop-in replacement for an existing API that a bunch of clients are already talking to.
Just reviewed a case that wanted to use a URL query parameter that resulted in a URL larger than 4 KB. I believe this is the issue tracking it. Adding here as a record and labeling to triage later (case # 119052223001602).
I have exactly the same problem as described by @juanm-pereyra above. Sad to be forced to ditch Azure Functions just because of this limit when a JWT token is used. Our clients work with 3D CAD files that may exceed 100 MB from time to time. Would love to see that addressed somehow. But OK, we will probably need to run a separate Node server for that, as from the discussion above it's not clear whether this limit will ever be lifted.
Ditto here. I'd love to be able to use Functions for my use case (a large historical calculation engine), but the 100 MB limit is a deal breaker.
Closing this issue as it is quite stale and the original issue is no longer a problem (the limitation is no longer the same).
@jabbera can you please open a separate issue with details about your scenario? We do have some guidance we might be able to provide.
Happy to! Thanks!
When input exceeds 25 MB, an HttpTrigger fails with a "Maximum request length exceeded" error. Similar question on SO.
Repro steps
1. Create an HttpTrigger-CSharp function
2. Invoke it with an input string larger than 25 MB (see the sketch below)
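For context, a minimal sketch along the lines of the default C# script (run.csx) HTTP trigger template; the exact template varies by tooling version:

```csharp
// run.csx — default-style v1 C# script HTTP trigger (illustrative sketch)
using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Reading the request body is where the "Maximum request length exceeded" error surfaces.
    string body = await req.Content.ReadAsStringAsync();
    log.Info($"Received {body.Length} characters");
    return req.CreateResponse(HttpStatusCode.OK, $"Received {body.Length} characters");
}
```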
Expected behavior
Expose an option to configure the maximum request length.
Actual behavior
Fails with error:
Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: Functions.HttpTriggerCSharp1
---> System.InvalidOperationException: Exception binding parameter 'req'
---> System.Web.HttpException: Maximum request length exceeded.
   at System.Web.HttpBufferlessInputStream.ValidateRequestEntityLength()
   at System.Web.HttpBufferlessInputStream.GetPreloadedContent(Byte[] buffer, Int32& offset, Int32& count)
   at System.Web.HttpBufferlessInputStream.BeginRead(Byte[] buffer, Int32 offset, Int32 count, AsyncCallback callback, Object state)
   at System.Web.Http.NonOwnedStream.BeginRead(Byte[] buffer, Int32 offset, Int32 count, AsyncCallback callback, Object state)
   at System.Net.Http.StreamToStreamCopy.StartRead()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Script.Binding.HttpTriggerAttributeBindingProvider.HttpTriggerBinding.d__15.MoveNext()