hyperium / http

Rust HTTP types
Apache License 2.0

Question - why is there a max URI length? #462

Closed mlodato517 closed 3 years ago

mlodato517 commented 3 years ago

I'm running into an issue very similar to https://github.com/seanmonstar/reqwest/issues/668#issuecomment-744059897. To get production working for the near future I was going to fork this and change the MAX_LEN to be u32::MAX - 1 until reqwest is maybe updated to return a Result instead of panicking.

Before doing that - I wanted to make sure I wasn't opening myself up to any severe issues - is there a reason for the current URI max length being what it is?

olix0r commented 3 years ago

If a client can force hyper to allocate u32::MAX bytes per-request, this becomes a Denial-of-Service vector for internet-exposed services.

dekellum commented 3 years ago

The current u16::MAX - 1 (= 65,534) seems pretty generous, from what I recall of browser and common server behavior.
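For concreteness, here is a quick std-only sketch of the arithmetic behind that cap (the `MAX_LEN` constant below is our own mirror of the value discussed in this thread, not imported from the http crate):

```rust
// The cap discussed above: u16::MAX - 1 bytes, just under 64 KiB.
const MAX_LEN: usize = u16::MAX as usize - 1;

fn main() {
    assert_eq!(MAX_LEN, 65_534);
    // 64 KiB exactly would be 65_536 bytes; the cap sits two bytes under.
    println!("max URI length: {} bytes (~{} KiB)", MAX_LEN, MAX_LEN / 1024);
}
```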

mlodato517 commented 3 years ago

Interesting - so basically hyper is trying not to be a tool that can be used for DoSing?

dekellum commented 3 years ago

Sounds like reqwest panicking could also be a DoS vector, however.

Better to fix that in reqwest.

mlodato517 commented 3 years ago

If hyper is worried about being used as a tool for DoSing, I'm interested in how useful this guard is. If I were trying to do damage, it seems trivial to patch hyper to allow a larger limit.

Again, the "real" solution is to not .expect this in reqwest. But if, for whatever reason, your application can reasonably expect to receive a request of 100k characters that you might need to modify/proxy off to another service you manage, then there'd still be the issue of reqwest returning a Result::Err, and you wouldn't be able to send a request you wanted to.
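One hedged workaround for the panic described above is to validate URI length yourself before handing the string to the client, so an oversized URI becomes a recoverable error rather than a crash. This is a std-only sketch; `check_uri_len` is our own helper, and the limit simply mirrors the u16::MAX - 1 cap discussed in this thread:

```rust
// Mirror of the cap discussed in this thread (not imported from the http crate).
const URI_MAX_LEN: usize = u16::MAX as usize - 1;

/// Pre-flight check: reject URIs the http crate would refuse to parse,
/// returning an error the caller can handle instead of panicking later.
fn check_uri_len(uri: &str) -> Result<&str, String> {
    if uri.len() > URI_MAX_LEN {
        Err(format!(
            "URI is {} bytes, over the {}-byte cap",
            uri.len(),
            URI_MAX_LEN
        ))
    } else {
        Ok(uri)
    }
}

fn main() {
    assert!(check_uri_len("https://example.com/ok").is_ok());

    let long = format!("https://example.com/{}", "a".repeat(70_000));
    assert!(check_uri_len(&long).is_err());
}
```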

dekellum commented 3 years ago

Oh, I think the primary concern that motivates the limit here is providing a reasonable "default" maximum URI length for users implementing servers.

mlodato517 commented 3 years ago

Oh, I think the primary concern that motivates the limit here is providing a reasonable "default" maximum URI length for users implementing servers.

I'm not sure I understand. Are you clarifying the limit as more for how much data one can receive instead of send? And then is the security just ensuring the server doesn't have to spend a lot of time parsing a lot of long requests? (sorry I'm very green on the subject of security here)

dekellum commented 3 years ago

No worries.

Are you clarifying the limit as more for how much data one can receive instead of send?

yes

And then is the security just ensuring the server doesn't have to spend a lot of time parsing a lot of long requests?

That could also be a factor, but it's not the primary one: holding more than 64KiB (2^16 bytes) for a URL per pending request could be an easy DoS. For example, your proposed limit of u32::MAX is ~4GiB, which could easily kill a server in one (1) request!
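A back-of-the-envelope sketch of that concern, assuming the per-URI cap were raised to u32::MAX as proposed (the concurrency figure below is just an illustrative assumption):

```rust
fn main() {
    // Worst-case bytes a client could force the server to hold per URI.
    let current_cap: u64 = u16::MAX as u64 - 1; // 65_534 bytes, ~64 KiB
    let proposed_cap: u64 = u32::MAX as u64;    // 4_294_967_295 bytes, ~4 GiB
    assert_eq!(proposed_cap, 4_294_967_295);

    // Assume a modest 1_000 concurrent pending requests:
    let concurrent: u64 = 1_000;
    println!(
        "current worst case: ~{} MiB; proposed worst case: ~{} GiB",
        current_cap * concurrent / (1 << 20),   // ~62 MiB
        proposed_cap * concurrent / (1 << 30),  // ~4000 GiB
    );
}
```

Even one request at the proposed cap can demand ~4GiB; at any realistic concurrency the numbers become absurd, which is the DoS argument above in arithmetic form.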

mlodato517 commented 3 years ago

Great, that all makes sense. Thanks for those explanations!

mlodato517 commented 3 years ago

Actually, I wonder if I might re-open this :thinking:

If a server wanted to accept URLs that were, for instance, 100k characters, should it be able to? Currently, if it's using this library, it just can't, can it? Should this max length be configurable, maybe via a parameter with a default size, an environment variable, or something else? Maybe with a local check to warn users if they put in something REALLY big, to let them know that they maybe don't want that and are maybe inviting DoS?

dekellum commented 3 years ago

I don't want to get your hopes up (I don't have the commit bit myself), but in my opinion the best you could hope for here would be some kind of compile-time setting, where the default max continues to be 64KiB. Since cargo features don't support scalar values (they are just boolean flags), it might have to be a feature named something like "dangerously-disable-max-url-len", but even that kind of sucks because once the first dependency enables it, it can't be disabled (currently).

Forking off might be the better option for you.

mlodato517 commented 3 years ago

Huhhh interesting. Can you explain more about

the best you could hope for here would be some kind of compile time setting

It's not totally obvious to me why we couldn't add a new function that does the same parsing but with a max_length argument. It wouldn't be as ergonomic (which would maybe encourage users to generally stick to stuff like let uri: Uri = string.parse()), but it would allow those who want a different limit to still use this library.
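A purely hypothetical sketch of that suggestion (the function below does not exist in the http crate; only the length check is shown, and a real version would then defer to the crate's existing Uri parser):

```rust
/// Hypothetical: parse a URI with a caller-supplied cap instead of the
/// hard-coded 64 KiB limit. Sketch only; returns the validated string
/// where a real implementation would return a parsed Uri.
fn parse_uri_with_max_len(s: &str, max_len: usize) -> Result<String, String> {
    if s.len() > max_len {
        return Err(format!("URI length {} exceeds max {}", s.len(), max_len));
    }
    // A real implementation would continue with the crate's normal
    // parsing logic here instead of just echoing the input back.
    Ok(s.to_owned())
}

fn main() {
    // Callers opting into a bigger limit do so explicitly:
    assert!(parse_uri_with_max_len(&"a".repeat(100_000), 1 << 20).is_ok());

    // The default path could keep the current 64 KiB cap:
    assert!(parse_uri_with_max_len(&"a".repeat(100_000), 65_534).is_err());
}
```

The appeal of this shape is that the existing `string.parse()` path (and its DoS-conscious default) stays untouched, while the larger limit requires a deliberate, visible opt-in at each call site.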

dekellum commented 3 years ago

Best of luck, if you wish to propose that. I don't really have anything more to add here.