fedwiki / wiki-server

Federated Wiki client and server in Node.js

TLS/SSL Support #53

Closed: almereyda closed this issue 8 years ago

almereyda commented 10 years ago

Instead of using a TLS/SSL proxy, which breaks the current implementation of federation, I'm investigating ways to use Node's https module for integrated security support.

Apparently, public-key client authentication, plus HTTP fallbacks if needed, will have to be implemented.

This is aimed at private wikis that are directly connected to the internet. Having federation work between them is needed in corporate environments.

I believe this requires a fork and an extension of the library itself, but I wonder whether the current plugin engine could also handle it. Reading pluginsFactory in the code doesn't give me that impression.

Any additions / thoughts / concerns / ideas are highly appreciated!

WardCunningham commented 10 years ago

We still support some proxy functionality in the server code. It is used somewhere by the client; I forget where (I looked). I have tried to get rid of it because it interferes with some cross-firewall scenarios where the client has more privilege than the server.

https://github.com/fedwiki/wiki-node-server/blob/master/lib/server.coffee#L324-L331

paul90 commented 10 years ago

We really should avoid creating a separate fork for this.

The required changes are not limited to the server; the client is currently hard-wired to use http rather than https. So some intelligence will need to be added to the client so that it uses TLS/SSL where necessary.
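
Purely as an illustration of that "intelligence", a sketch (the helper name is hypothetical, not existing wiki-client code) of deriving the scheme from the page's own origin instead of hard-wiring http:

```javascript
// Hypothetical helper: pick the scheme from the page's own origin
// rather than hard-wiring "http://" into federation requests.
function siteUrl(host, slug) {
  const scheme = window.location.protocol === 'https:' ? 'https' : 'http';
  return `${scheme}://${host}/${slug}.json`;
}
```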

Ideally the server would be capable of being configured to listen for TLS/SSL, both directly and indirectly behind a proxy (with TLS/SSL terminating in the network layer, for example when using TLS/SSL with OpenShift). Internally, once a request is past the security layer, processing is no different with TLS/SSL.
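
For the proxy case, Express already has a hook for this. A minimal sketch (not fedwiki code) of how the 'trust proxy' setting makes the two cases look the same:

```javascript
// Minimal sketch: with 'trust proxy' enabled, Express derives
// req.secure / req.protocol from the proxy's X-Forwarded-Proto header,
// so direct TLS and proxy-terminated TLS look identical to handlers.
const express = require('express');
const app = express();

app.enable('trust proxy');

app.get('/', (req, res) => {
  // true both for a direct TLS listener and behind a TLS-terminating proxy
  res.send(`secure: ${req.secure}, protocol: ${req.protocol}`);
});

app.listen(3000);
```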

Something I have said previously is that it would be nice if we could modularize the security layer into "security plugins", so that different authentication/authorization models could be implemented without needing to create a new fork of the server.

Longer term, there is HTTP/2, which will only be over TLS.

paul90 commented 10 years ago

From the Express API: http://expressjs.com/api.html#app.listen

Just a matter of using something like

```javascript
const http = require('http');
const https = require('https');

http.createServer(app).listen(httpPort);
https.createServer(options, app).listen(tlsPort); // options: TLS key + cert
```

rather than app.listen

Of course, if you only want https, then http could be omitted, but it might be better to redirect to https, along the lines of the sketch below.
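
A fuller sketch of that arrangement; the certificate paths and ports are illustrative only:

```javascript
// Sketch only: serve the app over TLS and have the plain-http
// listener do nothing but redirect to the TLS one.
const fs = require('fs');
const http = require('http');
const https = require('https');
const express = require('express');

const app = express(); // the wiki app would be configured here

const options = {
  key: fs.readFileSync('/etc/wiki/server-key.pem'),   // illustrative paths
  cert: fs.readFileSync('/etc/wiki/server-cert.pem')
};

// The app itself is only served over TLS.
https.createServer(options, app).listen(443);

// Plain http answers every request with a redirect to the TLS listener.
http.createServer((req, res) => {
  const host = (req.headers.host || '').split(':')[0]; // drop any :port
  res.writeHead(301, { Location: `https://${host}${req.url}` });
  res.end();
}).listen(80);
```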

almereyda commented 10 years ago

This is why I asked.

I see four cases of Transport Modules

I see that routing requests and the expectations of the client are quite complex, especially when thinking about mixed-protocol federation between different farms.

But thinking about the proposed commenting and signing infrastructure (@rynomad), let's collect our ideas on

These could be the Security Modules.


@paul90


At last, the ongoing refactoring seems to streamline parts of the code into clusters, starting with the Storage Layer and Client Plugins, and now moving on to Security and Transport Modules.

I suppose it makes sense to use some kind of planning this time: design an abstract flow model and test it against use cases. Or would you just create a branch and code until it finally works?

rynomad commented 10 years ago

I've got a little experience with client-side keys/certificates, and yes, I for one have been working on the multi-user-per-domain use case (treating the client as its own wiki within the 'hub' of a domain).

The gotchas and potential problems with key/cert generation and signing/verifying on the client side are numerous, and honestly, until we get a native Web Crypto API in the browser, the security offerings of client-side data encryption are slim to nil. But since NDN needs it to function, this is what I've come up with:

1. Use SSL/TLS to reliably transport a JavaScript crypto library to the client.
2. Generate a key pair.
3. Encrypt the private key with a salted password (ideally in such a way that the browser understands that a password is being entered and offers to save it).
4. Save the encrypted public/private keys and any certificates in local storage.
5. Decrypt the keys after a password prompt and immediately pass them into a web worker, where they live for the session, taking great care to appropriately scope this flow to insulate it from any other JavaScript running in the UI thread.
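
With a native Web Crypto API available, steps 2 to 4 could look roughly like this sketch (the algorithm choices, iteration count, and localStorage layout are illustrative, not a vetted design):

```javascript
// Sketch of steps 2-4 with the Web Crypto API. All concrete choices
// here (ECDSA/P-256, PBKDF2 parameters, storage key) are illustrative.
async function generateAndStoreKeys(password) {
  // step 2: generate a signing key pair (extractable so it can be wrapped)
  const keyPair = await crypto.subtle.generateKey(
    { name: 'ECDSA', namedCurve: 'P-256' }, true, ['sign', 'verify']);

  // step 3: derive a wrapping key from the salted password
  const salt = crypto.getRandomValues(new Uint8Array(16));
  const passKey = await crypto.subtle.importKey(
    'raw', new TextEncoder().encode(password), 'PBKDF2', false, ['deriveKey']);
  const wrappingKey = await crypto.subtle.deriveKey(
    { name: 'PBKDF2', salt, iterations: 100000, hash: 'SHA-256' },
    passKey, { name: 'AES-GCM', length: 256 }, false, ['wrapKey', 'unwrapKey']);

  // ...and encrypt (wrap) the private key under that password-derived key
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const wrapped = await crypto.subtle.wrapKey(
    'pkcs8', keyPair.privateKey, wrappingKey, { name: 'AES-GCM', iv });

  // step 4: persist the public key and the encrypted private key
  const publicJwk = await crypto.subtle.exportKey('jwk', keyPair.publicKey);
  localStorage.setItem('wiki-keys', JSON.stringify({
    publicJwk,
    wrappedPrivateKey: btoa(String.fromCharCode(...new Uint8Array(wrapped))),
    salt: Array.from(salt),
    iv: Array.from(iv)
  }));
  // step 5 would unwrapKey after a password prompt and hand the result
  // straight to a Worker, keeping it out of the UI thread's scope.
}
```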

almereyda commented 10 years ago

Maybe it's good news that OpenID Connect has been released as a standard? Coincidentally, I stumbled across the news today and read a little. It seems to fulfill some of your authentication prerequisites.

There are some implementations of it and affiliated topics for Node:

Further readings:

Specs:

Finally, I even see the chance that wiki-client could turn into an unhosted web app, if we integrated such an authorization layer. Here, logins come with access tokens that can also be used by apps, so we'd have a distributed auth mechanism that would even allow forking pages between different secured servers. If one has credentials for both systems, one could simply request an auth token from the first and pass it to the second, which then sends its request independently of the client. That is where my investigation started, because it threw an HTTP error.
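
To make the idea concrete, here is a hypothetical sketch of that fork flow with a bearer token; the /fork endpoint and its payload are invented for illustration and are not part of the fedwiki API:

```javascript
// Hypothetical sketch: ask the target wiki to fork a page from a
// protected source wiki, handing it a token it can present itself.
async function forkWithToken(sourceWiki, targetWiki, slug, accessToken) {
  const res = await fetch(`${targetWiki}/fork`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // token issued by the source wiki's auth provider; the target
      // presents it when fetching the protected page server-to-server
      'Authorization': `Bearer ${accessToken}`
    },
    body: JSON.stringify({ source: `${sourceWiki}/${slug}.json` })
  });
  if (!res.ok) throw new Error(`fork failed: ${res.status}`);
  return res.json();
}
```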

Also, we could spare ourselves the hassle of client certificates. Persona (and many others) would still be possible. I could even imagine integrating an (optional) auth provider into wiki and making it available as a plugin.

Am I going too far here?

rynomad commented 10 years ago

@christiansmith might want to weigh in on this... he's been doing some work on an authorization server. I forget the details of his implementation, but I just pinged him, so hopefully he'll have some insight.

Jon, you're reading my mind about the unhosted web app possibility. This is exactly where my work with NDN has been taking me: pure p2p, in-browser wiki-federating goodness.

christiansmith commented 10 years ago

As @rynomad pointed out, I'm working on a standalone open source OAuth 2.0/OpenID Connect provider implementation. The notion of static web apps and token-protected services looks so much like the future of web architecture to me that I've bet the proverbial farm on it.

So @almereyda's suggestions about OpenID Connect and an unhosted client make perfect sense to me. In the context of fedwiki, the Discovery portion of the spec may be the most interesting:

This specification defines a mechanism for an OpenID Connect Relying Party to discover the End-User's OpenID Provider and obtain information needed to interact with it, including its OAuth 2.0 endpoint locations.

http://openid.net/specs/openid-connect-discovery-1_0.html
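
In practice, discovery boils down to a single well-known URL; a minimal sketch:

```javascript
// Minimal sketch of OpenID Connect Discovery: the provider's metadata
// document lives at a well-known path relative to the issuer URL.
async function discoverProvider(issuer) {
  const res = await fetch(`${issuer}/.well-known/openid-configuration`);
  const config = await res.json();
  // e.g. config.authorization_endpoint, config.token_endpoint, config.jwks_uri
  return config;
}
```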

Federated identity + federated content. It's like chocolate and peanut butter.

@WardCunningham perhaps this would be a good topic for the hangout one week?

michielbdejong commented 10 years ago

unhosted client make perfect sense

Sounds great! Would love to join that discussion; let me know if I can help/advise.

paul90 commented 10 years ago

It seems to fulfill some of your authentication prerequisites.

Not sure about this; as far as I can see, the authorization being talked about is the end-user authorizing the OpenID Provider (OP) to pass their identity to the RP (the client).

It does not directly help the RP (the fedwiki server) make authorization decisions about any actions the end-user may take (using the fedwiki client).

I could even imagine integrating an (optional) auth provider into wiki and making it available as a plugin.

Supporting different auth providers is in itself not a problem, though interop becomes an interesting question. This only gets serious when/if a neighbourhood contains nodes using different authentication methods and authorization is required. Currently, with pages only being updated on the origin(?) server and everyone having read access, it is not a problem. But add in support for an 'unhosted client', or even simply the ability to save to servers other than the origin, together with support for restricting access, and this starts to become a bigger problem.

christiansmith commented 10 years ago

I would be inclined to think of the relying party/OpenID provider relationship a little differently. A static wiki client could be the RP, and each wiki server could also be an OpenID provider. The wiki server could then make authorization decisions for requests directly to itself, or on behalf of another wiki server.

I may be mistaken, but I believe this is where "claims" encoded in the tokens come into play. One such claim is the "Issuer" of the token. This could be used by the recipient to determine which OP to validate the token against.
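
A sketch of that first step, reading the (unverified) iss claim to decide which provider's keys to check the token's signature against:

```javascript
// Sketch: pull the "iss" claim out of a JWT's payload. This only tells
// the recipient which OP to validate against; the signature must still
// be verified with that provider's published keys (jwks_uri).
function issuerOf(token) {
  const payload = token.split('.')[1]; // header.payload.signature
  // Node's base64 decoder also accepts the base64url alphabet used by JWTs
  const json = Buffer.from(payload, 'base64').toString('utf8');
  return JSON.parse(json).iss; // e.g. "https://wiki-a.example.com"
}
```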

Access control policy is another piece of the puzzle, but within my admittedly imperfect understanding, this type of scenario seems to work.

christiansmith commented 10 years ago

Dynamic Client Registration, Discovery and possibly even self-issued providers appear relevant to the concept...

WardCunningham commented 10 years ago

Yes, let's talk about this on Wednesday this week. We will have to start out at a remedial level, as I'm having trouble following the conversation in this issue.

I'm a little worried that SSL is not something easily supported by small players simply because of the cost of the certificates, especially where wildcards are involved.

Can we avoid the asymmetry we see in radio, where $100 will build a transmitter but one needs $1,000,000 to license it? (Wikipedia)

almereyda commented 10 years ago

I'm a little worried that SSL is not something easily supported by small players simply because of the cost of the certificates, especially where wildcards are involved.

Just a quick intervention: what about using CAcert?

That would free the fedwiki community from financial issues and support their movement.


Update 16 Feb 15: Let's Encrypt (on GitHub) and IndieCert (on GitHub) try to tackle @WardCunningham's concerns from right above.

WardCunningham commented 9 years ago

Do these certs allow wildcards?

paul90 commented 9 years ago

No, Let's Encrypt have said that they will not be supporting wildcards at launch. There is a closed issue (no. 66) asking that question which includes this answer (I'm sure this was something we touched on in a hangout a little while ago):

We have not decided whether or not to support wildcards yet. Support for wildcards will most likely not be part of our initial offering.

and more recently added,

we will support names with multiple Subject Alternative Names (SANs, or UCC certificates in the CA nomenclature) at launch.

But that is probably of limited help, and only for those who create new sites infrequently. It might be enough to make a start without wildcards, though, as long as there is something more concrete about when wildcard support will be added.

paul90 commented 8 years ago

Closing this issue as a lot has changed in the last 2+ years.

We now have Let's Encrypt, and certificates on demand (automatic HTTPS) in Caddy server. But we also have browsers blocking plain-http content from loading when using an https origin (mixed-content blocking), plus a host of other interesting issues/opportunities.