Closed franciscop closed 3 years ago
Hey Francisco,
This seems like a reasonable strategy to take, given that each browser vendor can choose to implement their own arbitrary limits. It's worth noting that RFC 6265 provides a little guidance on the minimum expectations that should be met:
6.1. Limits
Practical user agent implementations have limits on the number and size of cookies that they can store. General-use user agents SHOULD provide each of the following minimum capabilities:
o At least 4096 bytes per cookie (as measured by the sum of the length of the cookie's name, value, and attributes).
o At least 50 cookies per domain.
o At least 3000 cookies total.
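Those minimums could at least be checked client-side before writing. A minimal sketch (the helper names are my own, not part of any library) that measures a cookie the way RFC 6265 does, as the combined size of name, value, and attributes:

```javascript
// Measure a cookie as RFC 6265 does: the byte length of its name,
// value, and attribute string combined.
function cookieSize(name, value, attributes) {
  const bytes = (s) => new TextEncoder().encode(s).length;
  return bytes(name) + bytes(value) + bytes(attributes || '');
}

// 4096 bytes is only the guaranteed floor; a given browser may allow more.
function fitsMinimumLimit(name, value, attributes) {
  return cookieSize(name, value, attributes) <= 4096;
}
```

Passing this check doesn't guarantee the cookie will be stored (the per-domain and total-cookie limits can still be hit), but failing it means no conforming browser is obligated to store it.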
The real bummer is that there are multiple scenarios that could prevent a cookie from being set properly, and it's not always (if ever) possible to determine the exact cause, given the loosey-goosey limits. Cookies could be disabled in the browser, an individual cookie could be too large, there could be too many cookies for a single domain, or there could be too many cookies in total.
It occurs to me that it's also theoretically possible for the Cookies.js `enabled` property to end up `false` because one of these limits has been hit, even while browser cookies are still enabled. Perhaps the end result is equivalent to cookies being truly disabled, but it seems semantically incorrect.
In any case, it could be good to check if a cookie actually got set, and if not, throw an error. However, I don't think we can easily include why the cookie failed to be set in the error message (beyond giving possible suggestions/ideas as to what the cause might have been).
Related issue: https://github.com/ScottHamper/Cookies/issues/49
As a side note, the idea of having a cookie with 4+ KB of data in it kind of gives me the heebie-jeebies. This isn't to say we shouldn't loudly throw errors when a cookie fails to be set, but it seems a gross abuse of cookies to attempt to cram that much data into one. But I also honestly don't understand why anyone would use cookies for anything more than authentication/identification/anti-CSRF tokens (which should be HttpOnly anyways) and simple UI preference settings (grid sorts, applied filters, etc).
In addition to browsers failing silently with single large cookies, HAProxy, Apache, and other servers may reject any request or response that exceeds their configured maximum HTTP header size, and browsers may likewise reject oversized server responses.
This can leave the server unable to remove the offending cookies via a Set-Cookie header, because it is prevented from even receiving requests from the affected browser.
I have encountered this when I had to move PHP sessions into cookie storage to enable scalability and fix some problems with too many sessions stored on one server, for a legacy application at the company I work for.
As a side note, Brotli helped a lot :)
Estimating the header size was the way to go, and then destroying data inside cookies, or entire cookies, if the size got out of control (these storage cookies sometimes suffered from memory leaks).
What I mean is: the size of a single cookie is not that important, but rather the size of all the HTTP headers, which may contain multiple cookies.
This cookie library (like my own cookies.js so far) appears to fail silently when trying to set a cookie that is too large:
JSFiddle with the error | JSFiddle without the error (small cookie)
A possible solution would be to perform this check manually when setting a cookie: set it, read it back, and compare it with the value you were supposed to set. If they differ, throw an error. I would love to know your thoughts on this. I am also considering allowing an error/fallback function.
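A sketch of that set-then-verify idea (`writeJar`/`readJar` are hypothetical hooks standing in for writes and reads of `document.cookie`, so the logic can be shown without a browser):

```javascript
// Write a cookie, read the jar back, and throw if the value is missing.
function setCookieChecked(name, value, writeJar, readJar) {
  const pair = encodeURIComponent(name) + '=' + encodeURIComponent(value);
  writeJar(pair + '; path=/');
  const found = readJar().split('; ').indexOf(pair) !== -1;
  if (!found) {
    // We cannot tell *why* it failed: cookies disabled, the cookie too
    // large, too many cookies for this domain, or too many in total.
    throw new Error('Cookie "' + name + '" could not be set');
  }
}
```

In a real browser both hooks would just be `document.cookie`; the error message can only list the possible causes, not pinpoint one.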
Edit: found Browser Cookie Limits, but still it's probably better to check if it was set properly.