NginxProxyManager / nginx-proxy-manager

Docker container for managing Nginx proxy hosts with a simple, powerful interface
https://nginxproxymanager.com
MIT License

Admin Interface is INSECURE #2640

Open pwfraley opened 1 year ago

pwfraley commented 1 year ago

Checklist

Describe the bug

The README.md says "Beautiful and Secure Admin Interface", but the Admin Interface, including the login, is served over HTTP and not HTTPS, so it is by definition NOT secure!

Nginx Proxy Manager Version

All versions up to and including the current version

To Reproduce Steps to reproduce the behavior:

  1. Install NPM as described in the docs
  2. Open the Admin interface

Expected behavior

Expect the Admin interface to be secured by SSL

Screenshots

Operating System All

Additional context

moninformateur commented 1 year ago

Unless I am missing something, you can secure the admin panel the same way you can secure any of the proxy hosts, by requesting an SSL certificate in the config. For extended security, you'd obviously use either the access list or an authentication service to prevent anyone from accessing your admin panel.

This may be a documentation error, or simply something to spook people away from exposing their NGINX admin panel to the internet. You know.

the1ts commented 1 year ago

This, I guess, suffers from the chicken-and-egg problem. How can NPM (a simple-to-use interface for securing things with SSL) secure itself before standing up the interface that lets the user put in the pieces required to get a valid cert? I would not call it simple to use if the user had to put the required info in a config file or docker variables. There are other nginx docker images that don't use an interface, so they have nothing to secure, if that is a real issue for you. Also, if NPM were stood up with a self-signed cert, we would just fill this tracker with browser error messages and people asking us not to use self-signed certs because their security checks now fail. I would say we are stuck with what we have: stand up NPM, use it to secure itself with SSL in the standard fashion, and use firewalls to block the HTTP admin port for security purposes. The admin interface is on a different port to make all of this easy to do.

bmmmm commented 1 year ago

I agree with @moninformateur and @the1ts - block admin panel from being accessible from outside your network.

pwfraley commented 1 year ago

Hey, thanks for the feedback. I guess I did not make this ticket clear: saying it has a secure admin interface is just plain not true. Out of the box it is extremely insecure, and without further steps it will never be secure. I guess there are two simple options:

  1. Remove "Secure Admin interface" as a feature (bad option for the project)
  2. On first start of the container, create a self-signed SSL cert and use this to secure the admin interface (not perfect, but much better than NO SSL)

The option "don't expose your admin port outside of your network" basically means you cannot use NPM on a public cloud-based host, because you cannot reach the admin interface. Or one would have to install a VPN, connect it to the cloud-based host, and then allow the admin port only over the VPN (anything but simple for non-admins).

Personally I would prefer the option of generating a self-signed certificate on first container start and then using it to secure the admin interface.

jc21 commented 1 year ago

Just so I'm clear on my statement of the admin interface being secure:

Just because it doesn't have SSL out of the box doesn't mean that my claim of the interface being secure isn't true.

I would also argue that the admin interface is only made less secure if port 81 is exposed outside of your network to the public without first setting up a Proxy Host for it.

pwfraley commented 1 year ago

I understand your reasoning, but it does not matter how secure the implemented features are: if the basis is insecure, then everything that builds on it, no matter how secure, is insecure.

I don't know if you remember the MongoDB meltdown. MongoDB could be used and installed securely, but out of the box it was configured extremely insecurely (no root password), which eventually led to most MongoDB installs being insecure. After some researchers found a ton of MongoDB instances online without root passwords and the news spread, that really hurt MongoDB for a couple of years.

I mean seriously, how hard is it to generate a self signed certificate on first container start? Isn't that just a couple of lines of code?
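
For what it's worth, a minimal sketch of what such a bootstrap step might look like, assuming a hypothetical entrypoint fragment (the paths and filenames here are placeholders, not anything NPM actually ships):

#!/bin/sh
# Hypothetical first-start check: generate a self-signed cert only if none exists yet
CERT_DIR=/data/custom_ssl/admin
if [ ! -f "$CERT_DIR/fullchain.pem" ]; then
  mkdir -p "$CERT_DIR"
  openssl req -x509 -nodes -newkey rsa:2048 -days 825 \
    -subj "/CN=npm-admin" \
    -keyout "$CERT_DIR/privkey.pem" \
    -out "$CERT_DIR/fullchain.pem"
fi
# The admin vhost would then point its "listen 81 ssl" block at these two files.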

the1ts commented 1 year ago

Yes, a self-signed cert is easy to make; getting users to recognise that the error message it produces in browsers is safe to ignore is not. Isn't getting users to click through a security warning to use NPM going to look very bad? Also, since NPM is designed to sit behind a NAT router, how does the admin port end up on the internet and therefore very vulnerable, unless you ignore the getting started guide, which says to forward only 80 and 443? You are worrying about a theoretical risk that is understood and allowed to exist for ease of use. Not sure I take the MongoDB point; you can say the same for MySQL, Elasticsearch, Kubernetes, VMware, and none of those have been damaged by poorly educated users getting hacked by putting them on the internet directly for no good reason, with zero or poor passwords, and not keeping them up to date. We can make 100% secure software; it would just not be usable by 99.9% of the world. See Windows NT being given the US DoD's highest security certification only if it has no NIC installed.

pwfraley commented 1 year ago

I agree with @moninformateur and @the1ts - block admin panel from being accessible from outside your network.

That is only secure if the network that it is attached to is secure. And as security experts say:

There is no such thing as a secure network.

If you have WiFi in your network, how do you prevent someone from connecting? With things like Kali Linux, hacking into someone else's WiFi is a kid's game nowadays. I also know that one could add a firewall rule for port 81 to only allow connections from one host, but that is not really secure either; MAC and IP spoofing is simple. Also, that does not prevent someone on your network from seeing the traffic and reading the password that is being transferred in plain text.

I don't want to step on any toes here, but security should be taken seriously. Knowingly shipping software that installs itself in an insecure manner out of the box, while telling people it is secure, basically just hurts the project, and that would be a shame.

the1ts commented 1 year ago

You do not need to allow any access to the admin port from outside the docker container, so all the problems inside a LAN are null and void from then onwards. If a person is on your network to the level you mention, they can probably man-in-the-middle HTTPS to an IP address, with the same browser errors we would need to tell users to ignore. How do you want it fixed in a way that does not break the simple-to-use nature that is reason number one for its existence?

dioxide-jazz commented 1 year ago

I agree with @moninformateur and @the1ts - block admin panel from being accessible from outside your network.

That is only secure if the network that it is attached to is secure. And as security experts say:

There is no such thing as a secure network.

If you have WiFi in your network, how do you prevent someone from connecting? With things like Kali Linux, hacking into someone else's WiFi is a kid's game nowadays. I also know that one could add a firewall rule for port 81 to only allow connections from one host, but that is not really secure either; MAC and IP spoofing is simple. Also, that does not prevent someone on your network from seeing the traffic and reading the password that is being transferred in plain text.

I don't want to step on any toes here, but security should be taken seriously. Knowingly shipping software that installs itself in an insecure manner out of the box, while telling people it is secure, basically just hurts the project, and that would be a shame.

You can make your WiFi network secure against cracking by using a very complicated password; you only need to enter it once per device. You prevent people from phishing you by educating yourself and your team/family on how to recognize the signs. The world is unsafe, but we do what we can to mitigate.

It seems like you are concerned about the fact that it's not SSL out of the box. I opened this to see if there was some sort of exploit in the components the GUI is built with, or some way to access the db by fuzzing the web portal or something, but this I am not concerned about, and here's why:

The VM or bare-metal server that you are spinning this container up on shouldn't be accessible from the outside at first. That is why you want this program, no? So you can have a safe way to proxy traffic in and out of your network? So I would hope this isn't accessible from the WAN just yet. However, if it is on a VPS somewhere, or you are setting this up over a remote connection, then I see your concern. Here is what you do: log in with the default creds, set up the first host as the NPM web console itself, get your Let's Encrypt cert, and then reconnect using the FQDN, which will now proxy to the NPM console at port 81. Then just change the password to what you want.

Since you made the connection through HTTPS when you reset the password, it should not be as liable to compromise as it would be over plain HTTP.

Coreparad0x commented 1 year ago

Yes, a self-signed cert is easy to make; getting users to recognise that the error message it produces in browsers is safe to ignore is not. Isn't getting users to click through a security warning to use NPM going to look very bad? Also, since NPM is designed to sit behind a NAT router, how does the admin port end up on the internet and therefore very vulnerable, unless you ignore the getting started guide, which says to forward only 80 and 443? You are worrying about a theoretical risk that is understood and allowed to exist for ease of use. Not sure I take the MongoDB point; you can say the same for MySQL, Elasticsearch, Kubernetes, VMware, and none of those have been damaged by poorly educated users getting hacked by putting them on the internet directly for no good reason, with zero or poor passwords, and not keeping them up to date. We can make 100% secure software; it would just not be usable by 99.9% of the world. See Windows NT being given the US DoD's highest security certification only if it has no NIC installed.

I would go as far as to say that getting end users in the habit of disregarding the invalid-cert security message is bad practice as well. Not only is it pointless at that point, because someone with that level of access could just MITM the traffic and inject their own cert, which the user will then just blindly trust (as you point out), but it could lead them to doing it in other situations when they shouldn't. At least with no cert you know exactly what you're getting and don't have some false sense of security.

However, if it is on a VPS somewhere, or you are setting this up over a remote connection, then I see your concern. Here is what you do: log in with the default creds, set up the first host as the NPM web console itself, get your Let's Encrypt cert, and then reconnect using the FQDN, which will now proxy to the NPM console at port 81. Then just change the password to what you want.

Even then, I wouldn't expose the interface. If you're using a VPS where you have the access to set this software up, then you likely have SSH, and you should just be using public-key-only auth with SSH and forwarding the port so you can access the internal admin URL locally. I have several things hosted in the cloud on services like Digital Ocean where the backend admin panels are locked behind having to SSH in and forward the remote port. I go as far as to have an entirely separate VPS which we SSH into, so that the one hosting the actual public-facing application doesn't even have to expose SSH, and I have SSH blocked in Digital Ocean's firewall on everything but the "bastion" (as we call it). For instance, ssh -L 81:<VPS Private IP>:81 <ssh server> and then I hit it with localhost:81.
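
Tidied up, that forwarding step looks something like this (placeholders kept from the comment above; key-only auth is an assumption about the sshd configuration):

# Forward the remote admin port through the SSH host that is reachable from outside
# -N: open the tunnel only, run no remote command; refuse to fall back to passwords
ssh -N -o PasswordAuthentication=no -L 81:<VPS Private IP>:81 <ssh server>
# Then browse to http://localhost:81 on the local machine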

It seems like you are concerned about the fact that it's not SSL out of the box. I opened this to see if there was some sort of exploit in the components the GUI is built with, or some way to access the db by fuzzing the web portal or something, but this I am not concerned about, and here's why:

I'm in the same boat; this is also why I clicked on this. I was expecting to find some kind of vulnerability in the software itself. This is a problem you can solve with a bit of extra configuration, and not one that's really solvable out of the box.

Maximus48p commented 1 year ago

related....

When I create an access user and set this up for a certain proxy, it will ask me, using a small pop-up, to enter a username and password before I'm able to enter the proxy URL.

This pop-up window is not secured; therefore the name and password are not sent in a secure way.

ghost commented 1 year ago

You can mitigate the insecure access to the admin panel using the following method:

  1. Create an A or AAAA record for the nginx-proxy-manager host
  2. Using the interface, create an SSL certificate for the A or AAAA record you created
  3. Create a Proxy Host for your A or AAAA record and forward it to http://localhost:81
  4. Add the following rule to your iptables (see the persistence note below): iptables -I DOCKER-USER 1 -p tcp -m conntrack --ctorigdstport 81 --ctdir ORIGINAL -j DROP
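
As a small follow-up sketch (assuming a Debian/Ubuntu Docker host; the package and commands below are an assumption, not part of the steps above), the DOCKER-USER rule is not persistent across reboots on its own:

# After adding the rule, confirm it sits at the top of the DOCKER-USER chain
iptables -L DOCKER-USER -n --line-numbers

# Persist it across reboots (Debian/Ubuntu, iptables-persistent package)
apt-get install iptables-persistent
netfilter-persistent save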

brickpop commented 11 months ago

Using a self-signed certificate for the management UI is an easy solution; I don't see the reason to dismiss such a basic idea.

1) If the intended way to manage the service is by setting up an SSH tunnel to the server itself, then at least:

# ssh -N -L <local-port>:localhost:81 <user>@<server>
ssh -N -L 8080:localhost:81 user@my-server.net

# Then navigate to localhost:8080 on your browser

2) If the intended way is to start with the default credentials via HTTP, then:

Coreparad0x commented 11 months ago

Using a self-signed certificate for the management UI is an easy solution; I don't see the reason to dismiss such a basic idea.

I don't personally have that much of an issue with self-signed certs on the admin interface. That being said, I don't really see why NPM, a tool that out of the box handles grabbing LE certs via any number of validation methods, including DNS, couldn't offer to set itself up with an LE cert configured like any of the other hosts.

The mere presence of LetsEncrypt assumes that your server will be on a public network

I don't really see how you can draw this conclusion. LetsEncrypt allows a number of validation methods, including DNS validation, which works whether the deployment is public or private. This is what I do at home, and this is what I've done at work even when using it for internal sites.
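
As a generic illustration of DNS validation not needing any public exposure (this uses certbot with its Cloudflare DNS plugin purely as an example; it is not how NPM requests its certs, and the domain and credentials path are placeholders):

# DNS-01 challenge: no inbound 80/443 required, works for purely internal hosts
# Assumes the certbot-dns-cloudflare plugin and an API token stored in cloudflare.ini
certbot certonly \
  --dns-cloudflare \
  --dns-cloudflare-credentials ~/.secrets/cloudflare.ini \
  -d npm.internal.example.com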

Portainer itself uses a self signed certificate

As a user of Portainer I wouldn't mind if it offered to configure itself with a cert using LE DNS validation as well.

All of that being said:

If the intended way to manage the service is by setting up an SSH tunnel to the server itself, then at least:

This is what I would do with all of this stuff anyway. I would personally be fine having this on a self-signed cert behind a tunnel or VPN. This is how I do all of my public deployments on places like Digital Ocean. It's easy to set up, easy to use, and far more secure than just having your NPM admin interface exposed publicly, IMO.

Edit:

Sending your admin credentials over HTTP is insecure, even within the same network, even through a VPN

Fully agree here. That being said, self-signed requires you to add an exception for the cert. It puts you in a position to be more easily MITMed, because getting an invalid-cert warning is much less suspicious at that point (if I have an LE cert and get an invalid-cert popup, it's a bit suspicious). That being said, I'm not really sure in what scenario you would be getting MITMed without the attacker already being in a position where you're screwed anyway. At your house or work, that means they're already in your network. At a hotel, well, you shouldn't be connecting to it from a hotel or other public network without at least a VPN or SSH tunnel.

vs4vijay commented 10 months ago

I concur with @pwfraley that nginx proxy manager should be served over HTTPS rather than HTTP, as this would prevent anyone between the nginx host and your internet provider from intercepting credentials. As suggested by many, I would also suggest that the Admin Portal use HTTPS by default. This should be easy to do; Caddy, for example, does it automatically.

kanevbg commented 7 months ago

+1. There is a reason browsers only use HTTP/2 over SSL. I would have named this issue "Use HTTPS for admin panel", marked it as an Improvement/Feature Request, and labelled it with security.

ReezyBoi commented 4 months ago

Chiming in to support the NPM Admin UI being served over self-signed HTTPS, either by default or via a line in the initial yaml file that can be uncommented. That way, there are options.

nikhilweee commented 3 months ago

Just came in here to say that I followed @brickpop's solution and changed my docker compose file to only expose port 81 to 127.0.0.1 instead of all interfaces (0.0.0.0). The corresponding change is highlighted below:

version: '3.8'
services:
  app:
    image: 'jc21/nginx-proxy-manager:latest'
    restart: unless-stopped
    ports:
      # These ports are in format <host-port>:<container-port>
      - '0.0.0.0:80:80' # Public HTTP Port, exposed to all interfaces
      - '0.0.0.0:443:443' # Public HTTPS Port, exposed to all interfaces
      - '127.0.0.1:8081:81' # Admin Web Port, exposed only to localhost

I can then port forward 8081 on my remote host to localhost using SSH and access the admin UI.
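
For example, something along these lines (the hostname is a placeholder, reusing the pattern from @brickpop's comment above):

# Forward local port 8081 to the remote host's loopback-only admin port
ssh -N -L 8081:127.0.0.1:8081 user@my-server.net
# Then open http://localhost:8081 in the browser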

bilogic commented 2 months ago

Isn't it odd that NPM is able to acquire SSL certs for its records but won't do so for itself?

And in order to expose NPM securely to the internet we need another tool that basically does the exact same thing that NPM is trying to do in the first place.

I mean, I doubt there would be much use for NPM if it weren't able to acquire LE SSL certs as part of its functionality.