Closed nomandera closed 4 years ago
We can request as many certs as we need, and the box is programmed to be able to do it (in case the box is serving so many domains they won't fit in a single cert), but requesting just one is a bit nicer internally - fewer moving parts, fewer opportunities for something to go wrong.
We could automatically group by zone, maybe. A zone is the highest-level domain the box knows about (so each domain and its subdomains would get a single cert).
But.... This is a nice-to-have, imo, not a crucial thing, so I don't really want to play around with this right now.
Is there a way I can manually achieve this now without breaking the auto-renewal (even if it means removing domains from the current single zone)? The reason I spotted this so quickly is that I am hosting two sites for a friend which I don't want associated with my day-job domain.
If you delete the certs and then manually issue for particular domains (see #689; you can specify more than one domain at a time on the command line) you can get separate certs, but in 60 days they'll get merged into one again. If you go the old route and issue a long-lasting cert, then you can hold off on this problem for a year.
Thanks for this. With a bit of thought I grouped all my domains into just four groups of similar content and ran the commands. Now that I know what I am doing I will probably do as you suggest and script it per zone.
I will just have to manually redo this every 60 days until such time as we get this feature natively. Appreciated.
I believe something has changed, or perhaps I have just got my timings wrong.
Since Jan 2016 https://github.com/mail-in-a-box/mailinabox/issues/690#issuecomment-177505555 I have been running /root/mailinabox/management/ssl_certificates.py
with a set of domain names, which successfully created small bundles of similar domains.
However today I received this error:
skipped: XXX.org:
The domain has a valid certificate already. (The certificate expires in 89 days on 04/15/19. Certificate:...
I understand the original advice was to delete the existing cert first but for 3 years I have not had to do that.
Any ideas? The ideal solution would be one cert per domain which would also close down this security issue.
Polite bump. Can we discuss plans to move to "one cert per domain"?
A small related note is that all publicly-trusted certificates are logged to certificate transparency. This doesn't mean this issue won't help in some way - just that there's something else to consider if your aim was also to absolutely keep all your domain names "private".
At a minimum, this means you should prefer wildcard certificates over subdomain-specific certificates.
I'm open to change, but I don't know what's involved in issuing one per domain. Maybe it's easy.
I would like to discuss this again, since as of Firefox 71 Subject Alt Names are treated as first-class citizens in the new Certificates Viewer.
That is to say, originally finding Subject Alt Names meant quite a few clicks in a non-user-friendly interface:
Old interface style (not a specific example)
But now, in Firefox at least, they are presented after a single click in a webpage view that makes them instantly accessible even to those who are not actively looking:
New interface style (not a specific example)
It used to be possible to just run ssl_certificates.py even on domains with valid certs, but now it refuses with "The domain has a valid certificate already."
Obviously you can delete all the certs and start again, but this breaks Mail-in-a-Box's automatic renewal and is a kludge.
tl;dr: it's time to do this right and issue certs per domain. Does anyone know where to start?
There's a function that assembles a list of certificates to get, each a list of domain names. That function needs to be modified. However, I really don't like the idea of going all the way to one certificate per domain. It's cosmetic --- there's no security advantage because as @zatricky noted, there are several ways to enumerate the domains that point to a server. And because we provision certificates for a lot of subdomains on each domain (I have 73 domains & subdomains on my box), that would turn into a lot of separate certificate requests, which might take a very long time.
So what I would probably accept at this point is a change to grouping certificates by DNS zone. A zone is a domain name that is not a subdomain of any other domain name hosted on the box. So a zone would contain e.g. mailinabox.email plus all of its subdomains but not ツ.life and other domains hosted on the box. We already have a function for getting a list of zones. We just need to start with that, and then group each zone with all of its subdomains. (I have 17 zones on my box --- that's a lot better than getting 73 separate certificates.)
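A minimal sketch of that grouping logic, assuming the function and variable names below (they are illustrative, not MIAB's actual internals): a zone is any hosted domain with no hosted ancestor, and every other domain attaches to its nearest zone.

```python
# Sketch: group hosted domains by zone, where a zone is a hosted domain
# that is not a subdomain of any other hosted domain. Helper names here
# are hypothetical; MIAB's real zone logic lives in its management code.

def group_domains_by_zone(domains):
    domains = set(domains)

    def ancestors(d):
        # Yield successively shorter suffixes: a.b.c -> b.c -> c
        parts = d.split(".")
        for i in range(1, len(parts)):
            yield ".".join(parts[i:])

    # A zone is a hosted domain with no hosted ancestor.
    zones = {d for d in domains if not any(a in domains for a in ancestors(d))}

    # Start each group with the zone itself, then attach each remaining
    # domain to the first (nearest) ancestor that is a zone.
    groups = {z: [z] for z in zones}
    for d in domains - zones:
        zone = next(a for a in ancestors(d) if a in zones)
        groups[zone].append(d)
    return groups
```

Each resulting group would then become one certificate request, which matches the 73-domains-into-17-zones reduction described above.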
That makes sense. Maybe we should consider staggering each zone's request over time.
There is a window between when Let's Encrypt starts to notify about expiration and when certificates actually expire. Staggering requests within this window would reduce the impact of hitting the CA too fast, avoid tripping any countermeasures they may have in place now or in the future, and make the whole thing much more scalable.
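One way to do that staggering deterministically might look like the sketch below (the 14-day window and the hash-based offset are my assumptions for illustration, not anything MIAB actually does): each zone gets a stable per-name offset inside the pre-expiry window, so renewals spread out but each zone always renews on the same relative day.

```python
import hashlib
from datetime import date, timedelta

# Sketch: spread per-zone renewal attempts across a window before expiry
# so every zone doesn't hit the CA on the same day. The window length
# (14 days) is an illustrative assumption, not MIAB's actual behavior.

def staggered_renewal_date(zone, not_after, window_days=14):
    # Deterministic offset per zone: hash of the zone name, modulo the
    # window. The same zone always lands on the same relative day.
    h = int(hashlib.sha256(zone.encode()).hexdigest(), 16)
    offset = h % window_days  # 0 .. window_days-1
    # Start the window `window_days` before expiry, shifted by the offset,
    # so the result always falls strictly before `not_after`.
    return not_after - timedelta(days=window_days) + timedelta(days=offset)
```

Because the offset is derived from the zone name rather than stored state, a daily cron job can recompute it and renew any zone whose staggered date has arrived.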
It's sort of nice to have everything expire at the same time so that there are fewer notifications about new certificates, but staggering also might be nice.
Having operated LE with custom ACME clients for commercial projects, I suggest using wildcard certificates (one cert per domain) to simplify matters in the long run. It's two validations per domain (*.example.com and example.com) compared to one validation per subdomain.
I pushed a commit (linked above) that provisions TLS certificates grouped by zone. So on the next re-issue for a domain, domains won't get combined with unrelated domains.
This is quite some months later, but it seems that now, as of v0.51 and with the timing of certs expiring, each TLD I host has its own cert by default. I haven't tested a new install but I would assume it is the same.
This ticket is closed already, but just to follow up with the thanks.
:)
When enumerating a server for potential attack vectors, one of the initial data-gathering phases is an attempt to discover all the domains hosted on the server.
Up until this point it has been an inexact science where the best you could achieve was Google data mining, mining the web for domain references, or using one of the "who is hosted on" services which try to learn about the world's DNS (and tend to do it not very well, charge you, and/or never last long).
With the way Let's Encrypt works in MIAB by default, all you need to do now is open the SSL cert and you get a complete server-wide domain list.
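To make that concrete, here is a minimal sketch of how easily the domain list falls out of a shared certificate, using only Python's standard library. The `cert` dict shape is the one documented for `ssl.SSLSocket.getpeercert()`; the function names and the example host are my own illustrations.

```python
import socket
import ssl

# Sketch: every DNS name in a certificate's Subject Alternative Name
# extension is visible to anyone who connects. With one shared cert,
# that extension IS the server-wide domain list described above.

def san_dns_names(cert):
    # `cert` is the dict returned by SSLSocket.getpeercert(): the
    # "subjectAltName" entry is a tuple of (type, value) pairs.
    return [value for kind, value in cert.get("subjectAltName", ()) if kind == "DNS"]

def fetch_sans(host, port=443):
    # Connect, complete the TLS handshake, and read the peer cert.
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return san_dns_names(tls.getpeercert())
```

A single `fetch_sans("box.example.com")` call against a box with one combined certificate would return every hosted domain at once, which is exactly the enumeration concern raised here.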
Equally, from a human standpoint, you often don't want to make it easily known which personal "hobby" sites you run on the same server as your "small business/CV/code" sites.
I don't want to go messing about with this without throwing it in here for comment. Ideally every site would have its own SSL cert, or perhaps "groups of sites" is a better fit.
Is there any way to achieve this now without hacking MIAB or breaching the Let's Encrypt AUP?