Keats / jsonwebtoken

JWT lib in Rust
MIT License
1.69k stars 271 forks

v7 discussion #76

Closed Keats closed 4 years ago

Keats commented 5 years ago

There are quite a few changes happening in the PRs: more input format for RSA (#69, #74), ECDSA signing & verification (#73) as well as some Validation changes.

I have already removed the iat check in the next branch since it isn't something that should be checked.

Right now, Validation::algorithms is a vec. I don't remember why, but it shouldn't be the case: it should be a single algorithm: Algorithm instead. I will accept a PR for that or do it myself later.

#69 also adds a standalone verify_rsa fn, and I'm wondering whether we can somehow streamline it with the rest of the crate.

Since Rust doesn't have default parameters, we always have to pass a Validation struct currently. Maybe we can put the decoding onto this struct instead so you get something like:

// currently
let token = decode::<Claims>(&token, "secret".as_ref(), &Validation::default())?;

// possible
// `JwtDecoder` has the same fields as the current `Validation`
let token = JwtDecoder::default().decode_hs256::<Claims>(&token, "secret".as_ref())?;

This way we don't have a single function where we try to fit all the arguments, and the user has to select explicitly which decode fn to use. This solves the vec of algorithms at the same time and allows better documentation for each. The downside is duplication of those functions/docs for the various digest sizes (decode_hs256, decode_hs384, decode_hs512, and so on for each algo).
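A minimal sketch of how that builder-style API could look, using only std. The names (`JwtDecoder`, `decode_hs256`) come from the proposal above, but the struct fields and the decode body are placeholders, not the crate's actual implementation:

```rust
// Hypothetical sketch of the proposed `JwtDecoder`: it carries the fields of
// today's `Validation`, and each algorithm gets its own decode method, so
// choosing the method chooses the algorithm (no Vec<Algorithm> needed).

#[derive(Default)]
pub struct JwtDecoder {
    pub leeway: u64,
    pub validate_exp: bool,
    pub iss: Option<String>,
}

impl JwtDecoder {
    // Real signature/verification elided; this only shows the call shape.
    pub fn decode_hs256(&self, token: &str, _secret: &[u8]) -> Result<String, String> {
        if token.is_empty() {
            return Err("empty token".into());
        }
        Ok(format!("decoded {} segments", token.split('.').count()))
    }
}

fn main() {
    let decoder = JwtDecoder { validate_exp: true, ..Default::default() };
    let claims = decoder.decode_hs256("a.b.c", b"secret").unwrap();
    assert_eq!(claims, "decoded 3 segments");
}
```

The trade-off noted above is visible here: every new algorithm means another `decode_*` method, but each one can document its own key requirements.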

Any other ideas/criticisms/things missing?

ccing the people involved in various issues/PRs recently @AaronFriel @jbg @Jake-Shadle @matthew-nichols-westtel @greglearns

Dowwie commented 4 years ago

@rib JWTs are used for all sorts of antipatterns, such as carrying state around (user profile info). Further, I'm not currently extending my JWT validation, but I have seen others do so for uses such as carrying around authorization-related info, beginning with (but not limited to) what is provided by a "scopes" claim or custom claims such as "roles" or "permissions". So, at a minimum, extended validation ought to be able to accept a range of different authorization logic. I don't think I'm ever going to use JWT for authorization, but I don't want to stop others from doing so. Give programmers a sufficiently open-ended API and let them decide how to use it.

If an extend_validation closure took the claims of the JWT as a generic type, such as serde_json::Value, the closure could deserialize the Value into whatever DeserializeOwned type it expects and then perform authorization however it likes, or whatever other validation a programmer wants to inject here rather than at another step.

Why wouldn't this be sufficient? I'm not defending this design as much as looking for issues.
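The closure idea above can be sketched with std only. Here a `HashMap<String, String>` stands in for the `serde_json::Value` claims map, and `validate_with`/`extend_validation` are illustrative names, not an existing API:

```rust
use std::collections::HashMap;

// Sketch: the library would run its standard checks, then hand the raw
// claims to a user-supplied closure for custom authorization logic.
type Claims = HashMap<String, String>;

fn validate_with<F>(claims: &Claims, extend_validation: F) -> Result<(), String>
where
    F: Fn(&Claims) -> Result<(), String>,
{
    // ...standard exp/nbf/iss checks would run here...
    extend_validation(claims)
}

fn main() {
    let mut claims = Claims::new();
    claims.insert("scope".into(), "read:messages".into());

    // The caller decides what "authorized" means, e.g. checking a scope claim.
    let result = validate_with(&claims, |c| match c.get("scope") {
        Some(s) if s.contains("read:messages") => Ok(()),
        _ => Err("missing required scope".into()),
    });
    assert!(result.is_ok());
}
```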

rib commented 4 years ago

Right, I'm using JWT tokens in an authorization setting with AWS Cognito, which includes a "scope" claim, and Cognito also lets developers add arbitrary custom claims to the tokens obtained from Cognito after authentication.

I guess 99% of those claims can be verified, either by comparing with a fixed value, matching against a regular expression, or matching against a set of either constants or regular expressions.

There could be a usecase for having closures too, so I was curious if you had a specific example where the above flexibility might not be enough.

I would still tend to think it would be good if jsonwebtoken could expose the Value/Map that's deserialized internally and also let you opt out of deserializing into a custom claims struct, so that it's essentially possible to deserialize the claims once in a non-lossy way. That would potentially also be enough to satisfy many use cases that would have to resort to a closure for validation since you can inspect everything in the claims Value.

Beyond this, though, I think it'd be useful if multiple middleware layers could be involved in configuring a Validator (such as an Azure/Cognito-specific middleware that can validate claims specific to those services) while still allowing a user to further validate custom claims. Supporting this model might imply that it should be possible to attach multiple closures to a validator.

Although it's not optimal, if you needed to run arbitrary claim validation with jsonwebtoken currently, you could ask it to deserialize your claims into a Value with the current API (i.e. pass a Value as your custom claims struct); then you have everything you need to run arbitrary custom validation too.

Keats commented 4 years ago

My idea was that custom validation would be done after jsonwebtoken, by the user checking the data in their struct, so they can use their own custom errors since it's not really a JWT error. As mentioned, we can easily decode a token into a serde_json::Value if needed, and that might be a better way than adding a closure if the fields you want to validate are not in the end struct for some reason.

Dowwie commented 4 years ago

@Keats in that case, there isn't any additional work to be done, right? That's fine with me. I'm trying to help get work done that is blocking a release.

Dowwie commented 4 years ago

@JTKBowers do you think you'll be able to commit some time to work on these items?

Dowwie commented 4 years ago

@Keats I listened to your podcast interview. You've created so much value for the community. Thank you! https://rustacean-station.org/episode/007-zola/

Dowwie commented 4 years ago

@Keats would you consider releasing v7 as-is? looks like @JTKBowers will not deliver.

Keats commented 4 years ago

I think I'll have some time tomorrow to work on it.

Keats commented 4 years ago

EncodingKey/DecodingKey are in https://github.com/Keats/jsonwebtoken/pull/113

It would be great if some people could review that and see if that fits everyone. It does look good from my pov.

Dowwie commented 4 years ago

@Keats I've commented in the PR

Dowwie commented 4 years ago

@rib I think it would be helpful if Vincent got feedback from another user of the library other than just me. Could you try out the latest EncodingKey/DecodingKey PR and see if it suits? We had a valuable exchange in the PR comments that may help you.

rib commented 4 years ago

I guess I'm kinda biased since I ended up coming up with a different design and am using that now ;-p

Taking a look though, one thing I'm wondering about is whether it would be good to have more rigorous checks that a DecodingKey constructed from an HMAC secret vs an RSA key vs an elliptic curve key can only be used to decode a token with a matching algorithm. Internally most of the keys can be coerced via as_bytes(), so it looks like you could essentially interpret an RSA key as an HMAC or EC secret. The same goes for the EncodingKeys; they all coerce internally to a [u8], so there's no type checking that a key is used with the intended kind of algorithm.
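The check being asked for can be sketched like this: tag each key with the family it was constructed for, and refuse to use it with a mismatched algorithm. All type and function names here are illustrative, not the crate's actual API:

```rust
// Sketch of constructor-based key-type safety: the constructor fixes the
// key family, and decode checks it against the algorithm before doing
// anything with the raw bytes.

#[derive(Debug, PartialEq, Clone, Copy)]
enum KeyFamily { Hmac, Rsa, Ec }

#[derive(Debug, PartialEq, Clone, Copy)]
enum Algorithm { HS256, RS256, ES256 }

struct DecodingKey { family: KeyFamily, bytes: Vec<u8> }

impl DecodingKey {
    fn from_secret(s: &[u8]) -> Self { Self { family: KeyFamily::Hmac, bytes: s.to_vec() } }
    fn from_rsa_pem(pem: &[u8]) -> Self { Self { family: KeyFamily::Rsa, bytes: pem.to_vec() } }
}

fn family_of(alg: Algorithm) -> KeyFamily {
    match alg {
        Algorithm::HS256 => KeyFamily::Hmac,
        Algorithm::RS256 => KeyFamily::Rsa,
        Algorithm::ES256 => KeyFamily::Ec,
    }
}

// An RSA public key can never be silently reinterpreted as an HMAC secret.
fn decode(alg: Algorithm, key: &DecodingKey) -> Result<(), String> {
    if family_of(alg) != key.family {
        return Err(format!("key family {:?} does not match algorithm {:?}", key.family, alg));
    }
    let _ = &key.bytes; // real verification would use the bytes here
    Ok(())
}

fn main() {
    let rsa_key = DecodingKey::from_rsa_pem(b"-----BEGIN PUBLIC KEY-----...");
    assert!(decode(Algorithm::RS256, &rsa_key).is_ok());
    // The confusion attack: trying to use an RSA public key as an HMAC secret.
    assert!(decode(Algorithm::HS256, &rsa_key).is_err());
    let secret = DecodingKey::from_secret(b"secret");
    assert!(decode(Algorithm::HS256, &secret).is_ok());
}
```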

Another key constructor that I think would also be useful is for a hmac secret that's base64 encoded - instead of requiring the user to have to decode it first themselves. I've seen a number of other jwt libraries provide that convenience and have also found myself needing that.

Keats commented 4 years ago

Both are good points and should be easy to add.

Dowwie commented 4 years ago

@Keats easy to add, you say? :) Are you going to inspect the magic bytes? If that works, it might be worth adding to the infer crate and using that, rather than crafting something custom.

Dowwie commented 4 years ago

@rib great points

rib commented 4 years ago

@Dowwie the key/secret type is defined by the constructor used (e.g. from_rsa_pem() implies type=RSA), so there shouldn't really be any need to infer from the raw byte data (and notably since a hmac secret could be anything then it's not really something that can be done reliably by looking at the data itself).

Keats commented 4 years ago

Exactly. We need to add from_rsa_der and from_ec_der rather than just a from_der, but then it is trivial.

Dowwie commented 4 years ago

@Keats what do you think-- is this the final stretch before releasing v7?

Keats commented 4 years ago

I think so, I might have time to do the patch Sunday and I'll release a rc on crates.io just to let more people easily try it.

Keats commented 4 years ago

A problem with DecodingKey and using that method is that Validation takes an array of Algorithm, so it is possible to fail when comparing the algo families if a user puts multiple unrelated algorithms for some reason. I also don't remember why it is an array, to be honest.

Dowwie commented 4 years ago

Looks like spring-security supports multiple algorithms.

rib commented 4 years ago

Based on some of the issues discussed here, I would err towards making it hard to mix up algorithms at runtime: https://auth0.com/blog/critical-vulnerabilities-in-json-web-token-libraries/

E.g. users will want to be 100% sure they can't be tricked into treating an RSA public key as a HMAC key at runtime.

This is simpler to reason about if the library has strong key-type safety and only one expected algorithm can be specified when verifying a token.

The cases where multiple algorithms might need to be selected at runtime are probably uncommon enough that it's reasonable to require applications to manually determine the algorithm first (which should almost certainly not be based on decoding the header and trusting the given "alg").

Dowwie commented 4 years ago

The none-attack mentioned in that article is addressed by resolving to one of Algorithm's supported variants. None is not one of the supported variants, so a supported algorithm must be used. Agreed?
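This argument can be sketched in miniature: parse the header's `alg` string into a closed enum with no `None` variant, so a forged `"alg": "none"` header fails before any verification runs. The variant list here is abbreviated for illustration and is not the crate's actual enum:

```rust
// Sketch: a closed Algorithm enum makes the "none" algorithm unrepresentable.
#[derive(Debug, PartialEq)]
enum Algorithm { HS256, RS256, ES256 }

fn parse_alg(s: &str) -> Result<Algorithm, String> {
    match s {
        "HS256" => Ok(Algorithm::HS256),
        "RS256" => Ok(Algorithm::RS256),
        "ES256" => Ok(Algorithm::ES256),
        other => Err(format!("unsupported algorithm: {}", other)),
    }
}

fn main() {
    assert_eq!(parse_alg("HS256").unwrap(), Algorithm::HS256);
    // A forged header with "alg": "none" is rejected at parse time.
    assert!(parse_alg("none").is_err());
}
```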

As for the "algorithm misdirection" scenario, this requires analysis and literally stepping through much of the flow of jsonwebtoken. I don't think the Auth0 author's claim is true here, but the only way to know for sure is to verify... verification.

Keats commented 4 years ago

I think the multiple algs are there for when you know the alg family but aren't sure which member, i.e. you might put [RS256, RS384, RS512], so in general a single family. BUT someone could put [RS256, ES256] and then the key check would fail if we check all members. Which is probably fine.
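The behavior described here can be sketched as a small check: every algorithm in the Validation list must match the key's family, so a mixed-family list like [RS256, ES256] can never be satisfied by one key. Names are illustrative, not the crate's internals:

```rust
// Sketch: reject a Validation whose algorithm list spans key families
// the given key cannot serve.

#[derive(Debug, PartialEq, Clone, Copy)]
enum Family { Hmac, Rsa, Ec }

#[derive(Clone, Copy)]
enum Algorithm { HS256, RS256, RS384, RS512, ES256 }

fn family(alg: Algorithm) -> Family {
    match alg {
        Algorithm::HS256 => Family::Hmac,
        Algorithm::RS256 | Algorithm::RS384 | Algorithm::RS512 => Family::Rsa,
        Algorithm::ES256 => Family::Ec,
    }
}

// The key's family must match every algorithm in the list.
fn check_key(algorithms: &[Algorithm], key_family: Family) -> Result<(), String> {
    for alg in algorithms {
        if family(*alg) != key_family {
            return Err("algorithm family does not match key".into());
        }
    }
    Ok(())
}

fn main() {
    // A single-family list works against a matching RSA key...
    assert!(check_key(&[Algorithm::RS256, Algorithm::RS384, Algorithm::RS512], Family::Rsa).is_ok());
    // ...but a mixed-family list fails against any single key.
    assert!(check_key(&[Algorithm::RS256, Algorithm::ES256], Family::Rsa).is_err());
}
```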

This crate is not vulnerable to the issue in the article.

rib commented 4 years ago

Yeah, I was mainly pointing to the article just to support the idea of erring on the side of caution and avoiding unnecessary flexibility with regard to letting multiple algorithms be accepted within one verification pass.

I suppose if someone created a Validation allowing HMAC and RSA (or EC) they could open themselves up to this vulnerability. I'm not sure why you would do that, but the api does seem to allow you to shoot yourself in the foot and maybe it would be better if it just wasn't possible to even make that mistake?

Dowwie commented 4 years ago

@rib here's a plausible scenario as to why a programmer would do that: a monolithic server that uses RS512 for internal communication among servers, whereas web clients receive HS256.

As Keats confirmed, a server that supports both algorithms is still not exposed to the "algorithm misdirection".

Keats commented 4 years ago

https://github.com/Keats/jsonwebtoken/pull/113 has been updated with encode/decode key type checking wrt the algorithm + base64 secrets

Anything else missing?

Keats commented 4 years ago

Ok this got merged, anything else missing for downstream users? cc @rib / @JTKBowers

LeviSchuck commented 4 years ago

I am quite excited and happy to have contributed.

Keats commented 4 years ago

@LeviSchuck you did the biggest bits of this release!

Dowwie commented 4 years ago

@Keats looks good to me. Can you cut another alpha release? I want to run my tests against it.

Keats commented 4 years ago

I've just pushed beta.1 to crates.io

Keats commented 4 years ago

I'll release it on Monday if nothing else comes up

Dowwie commented 4 years ago

looks good from here, so far..

Keats commented 4 years ago

I didn't forget about it, I got a new laptop yesterday and I was getting some weird rustc crash which might be resolved now.

Keats commented 4 years ago

It's on crates.io