NginxProxyManager / nginx-proxy-manager

Docker container for managing Nginx proxy hosts with a simple, powerful interface
https://nginxproxymanager.com
MIT License

413 Request Entity Too Large #914

Open nfacha opened 3 years ago

nfacha commented 3 years ago


Describe the bug: File uploads through the proxy fail with 413 Request Entity Too Large. Setting client_max_body_size 100m; in the Advanced config takes no effect and the problem persists.

To Reproduce Steps to reproduce the behavior:

  1. Upload a large file to a website going through the proxy; observe a 413.
  2. Set an appropriate client_max_body_size in the Advanced config.
  3. Try the upload again.
  4. See a 413 again.

Expected behavior: client_max_body_size would take effect and allow the upload, as it does with plain nginx.
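For context, the setting being applied in the Advanced tab is a standard nginx directive; a minimal sketch of what the Advanced config might contain (the value is an example):

```nginx
# Allow request bodies up to 100 MB; client_max_body_size 0; disables the check entirely.
client_max_body_size 100m;
```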


Ey3scr34m commented 3 years ago

I'm having the same issue.

centralhardware commented 3 years ago

Same issue when trying to upload an ISO to Proxmox.

ichbinder commented 3 years ago

Hi, I have found a solution: you can set client_max_body_size in the custom file server_proxy.conf; then it works. You can read here where and how to create the file.
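For reference, NPM's advanced configuration docs describe optional snippet files under /data/nginx/custom/ inside the container; server_proxy.conf, if it exists, is included in every proxy-host server block. A sketch of the file (the path assumes the default /data volume mapping):

```nginx
# /data/nginx/custom/server_proxy.conf
# Picked up automatically if present; restart the container or reload nginx to apply.
client_max_body_size 0;  # 0 removes the limit entirely; prefer a finite value if possible
```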

Mano-Liaoyan commented 2 years ago

In the Edit Proxy Host window, click Custom locations, click the gear button, and set the following attributes to an appropriate value:

(screenshot not preserved)

jenny787 commented 2 years ago

> Hi, I have found a solution: you can set client_max_body_size in the custom file server_proxy.conf; then it works. You can read here where and how to create the file.

I went into the NPM docker files, added the server_proxy.conf file, and added this line: client_max_body_size 0;

It still doesn't work. I also tried adding it to the seafhttp location like the picture above, and that didn't work either. I still get the 413 error.

jeffryjdelarosa commented 2 years ago

I have the same issue

jenny787 commented 2 years ago

I have now tried this in all sorts of ways, and it never works. Nothing seems to change NPM's behavior. I thought it was something to configure on Seafile's side, but I tried different WebDAV servers and all of them return 413. There seems to be no way to fix this; it's as if something is hardcoded in NPM.

jeffryjdelarosa commented 2 years ago

> I have now tried this in all sorts of ways, and it never works. Nothing seems to change NPM's behavior. There seems to be no way to fix this; it's as if something is hardcoded in NPM.

Nothing here works for me. I'll probably have to switch to HAProxy because I have a client depending on this.

jeffryjdelarosa commented 2 years ago

> In the Edit Proxy Host window, click Custom locations, click the gear button, and set the following attributes to an appropriate value.

This does not work at all.

jk111 commented 2 years ago

Having the same issue.

jenny787 commented 2 years ago

I found out the issue: it's not nginx, it's Cloudflare. They have a 100 MB upload limit; nothing you can do.

jeffryjdelarosa commented 2 years ago

> I found out the issue: it's not nginx, it's Cloudflare. They have a 100 MB upload limit; nothing you can do.

I don't have Cloudflare; I have different clients with their own hosting, so it is nginx.

jenny787 commented 2 years ago

Oh, OK! Then please let us know if you come up with anything. I've tried everything I could think of.

yoution commented 2 years ago

It works when set in the Advanced tab. (screenshot not preserved)

jeffryjdelarosa commented 2 years ago

> It works when set in the Advanced tab.

What version do you have? It doesn't work for me, and I have the latest one.

yoution commented 2 years ago

> It works when set in the Advanced tab.
>
> What version do you have? It doesn't work for me, and I have the latest one.

I installed it in Docker; the image is jc21/nginx-proxy-manager:latest.

jeffryjdelarosa commented 2 years ago

> It works when set in the Advanced tab.
>
> I installed it in Docker; the image is jc21/nginx-proxy-manager:latest.

It didn't work for me.

JustinChasez commented 2 years ago

None of the above solutions worked for me.

ignaciochemes commented 2 years ago

const BodyParser = require('body-parser'); // import needed for the two lines below
app.use(BodyParser.json({ limit: '50mb' }));
app.use(BodyParser.urlencoded({ limit: '50mb', extended: true }));

yuri2peter commented 1 year ago

> app.use(BodyParser.json({ limit: '50mb' }))
> app.use(BodyParser.urlencoded({ limit: '50mb', extended: true }));

This solved my problem. Koa reported the same error, which led me to mistake it for nginx.

jeffryjdelarosa commented 1 year ago

Could you please let me know where I can do that?

jeffryjdelarosa commented 1 year ago

> ```js
> app.use(BodyParser.json({ limit: '50mb' }))
> app.use(BodyParser.urlencoded({ limit: '50mb', extended: true }));
> ```
>
> This solved my problem. Koa reported the same error, which led me to mistake it for nginx.

Could you explain the steps to reproduce?

ignaciochemes commented 1 year ago

> Could you explain the steps to reproduce?

```js
import express from "express";
import BodyParser from "body-parser";

const app = express();
// Raise the body-parser limits (the default is around 100 KB) so large
// JSON/form payloads are not rejected by the application itself.
app.use(BodyParser.json({ limit: '50mb' }));
app.use(BodyParser.urlencoded({ limit: '50mb', extended: true }));
```

golyalpha commented 1 year ago

> It works when set in the Advanced tab.

This absolutely worked for me. Make sure you don't have any other servers between the client and your application (Cloudflare, another nginx instance between NPM and your app for hosting static files, etc.) and that your application itself is configured to allow large requests.

dkyeremeh commented 1 year ago

> This absolutely worked for me. Make sure you don't have any other servers between the client and your application (Cloudflare, another nginx instance between NPM and your app for hosting static files, etc.) and that your application itself is configured to allow large requests.

Worked for me. I realised I had another proxy server in front of NPM, so I updated client_max_body_size for that server too.

LittleNewton commented 1 year ago

It works!

I pushed a giant (13 GiB) Docker image to Harbor through Nginx Proxy Manager as a reverse proxy. It went through.

jeffryjdelarosa commented 1 year ago

This is something that has never worked for me:

(screenshots not preserved)

ymoona commented 1 year ago

I have the same issue; setting client_max_body_size 0; in the Advanced tab does not help. How can I debug this?

golyalpha commented 11 months ago

@jeffryjdelarosa @ymoona are you sure there's nothing else between your client and the destination, including the destination itself, that would be rejecting large requests?

If you don't know and have no easy way of telling, you can try the HTTP TRACE method combined with the Max-Forwards header to get the intermediate servers to reflect requests back; that of course depends on the intermediate servers honouring TRACE and properly decrementing Max-Forwards and reflecting the request when it reaches zero.
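A sketch of that probing technique with curl (example.com stands in for your hostname; note that many servers disable TRACE for security reasons):

```shell
# Max-Forwards: 0 asks the first hop that honours TRACE to answer itself
# instead of forwarding; raise the value to probe successive hops.
curl -v -X TRACE -H "Max-Forwards: 0" https://example.com/
```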

jeffryjdelarosa commented 11 months ago

> are you sure there's nothing else between your client and the destination, including the destination itself, that would be rejecting large requests?

No, there isn't, because it's my own server. I've tested it without the proxy, going over VPN directly to the IP. nginx is blocking the large requests, and moreover some requests simply take longer and nginx stops or cancels them.

golyalpha commented 11 months ago

@jeffryjdelarosa hang on: you're not actually getting a 413 HTTP status code, just a connection-closed error? That doesn't seem related, but regardless: what is the size of the upload you're making, and how long does it take before you get the error?

ymoona commented 11 months ago

I'm quite sure there is nothing else in between, but I'll use TRACE to see if it explains anything. I did find that when I create a new proxy host with the correct settings, large payloads do work. Applying the same settings to the existing proxy host does not fix it, though; even deleting and recreating it does not help. That makes me think that either:

  1. The route is different for that proxy host, or
  2. There is some leftover config after deleting the proxy host.

I'll check more.

popel1988 commented 11 months ago

Which Docker image do you run? The linuxserver.io image? @jeffryjdelarosa @nfacha

PrieserMax commented 8 months ago

Hey, I also have this issue. In my case I'm trying to proxy Immich (https://github.com/immich-app/immich) through NPM. If I set the custom config to client_max_body_size 0; I get a 500 error after uploading ~1.5 GB. I also tried client_max_body_size 100000m; and added proxy_max_temp_file_size 100000m; too. The same error occurs.

The error shown in Immich, and the error with no custom config: (screenshots not preserved)

PrieserMax commented 8 months ago

In my case I got it working with the following settings:

proxy_request_buffering off;
proxy_buffering off;
client_max_body_size 100000m;

jhit commented 6 months ago

@PrieserMax Thank you very much for the solution. Worked like a charm for me. I use TrueNAS Scale, Nginx Proxy Manager and Immich.

I had this error while importing my photo library with immich-cli:

...
statusText: 'Request Entity Too Large',
...

When immich-cli tried to upload a very large video file, the program threw an error.

This is my custom Nginx configuration in NPM that resolved the issue:

proxy_request_buffering off;
proxy_buffering off;
client_max_body_size 0;

gehkah commented 2 months ago

This is indeed working. Thank you, sir.

> @PrieserMax Thank you very much for the solution. Worked like a charm for me. I use TrueNAS Scale, Nginx Proxy Manager and Immich.
>
> I had this error while importing my photo library with immich-cli:
>
> ...
> statusText: 'Request Entity Too Large',
> ...
>
> When immich-cli tried to upload a very large video file, the program threw an error.
>
> This is my custom Nginx configuration in NPM that resolved the issue:
>
> proxy_request_buffering off;
> proxy_buffering off;
> client_max_body_size 0;

j-norwood-young commented 2 months ago

This might help someone: I got this error when sending POST data with a GET request to nginx, through HAProxy, to a Node.js app. I suspect HAProxy was the one complaining, as I'd recently added it into the mix; that caused an automated test (where I'd forgotten some POST data in a request that had changed to a GET) to start failing. I removed the POST data and the error went away.

The following caused the error:

curl --request GET 'https://blah.com' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'test=blah'
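
For comparison, either of these avoids sending a body with a GET (blah.com is the placeholder from the snippet above):

```shell
# Plain GET with no body: nothing in the proxy chain to reject
curl --request GET 'https://blah.com'

# If the data is needed, -G/--get appends it to the URL as a query string
curl --get 'https://blah.com' --data-urlencode 'test=blah'
```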