Closed prozacgod closed 13 years ago
Good idea. Thanks prozacgod!
NP! It caused a strange bug in our development house, and I was really confused :P for a second. I kept thinking it was the browser cache.
(also nicely written code BTW, very clean and decently commented!)
Noticed a possible DoS issue with the update: since request.headers.host is an untrusted value, it would be possible for a person to make a connection and request hundreds of host names, forcing a cache entry for each of those hostnames and eating up memory and CPU. What do you think about this?
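To make the concern concrete, here's a minimal sketch of the pattern being described (the names are illustrative, not the actual now.js source): every unseen Host header allocates a new cache entry, so a single client forging Host values can grow the cache without bound.

```javascript
// Illustrative sketch of the vulnerable pattern, not the real now.js code:
// the cache is keyed by an attacker-controlled header.
const nowFileCache = {}; // keyed by request.headers.host

function getNowFile(request, buildFile) {
  const host = request.headers.host; // untrusted, attacker-controlled value
  if (!(host in nowFileCache)) {
    nowFileCache[host] = buildFile(host); // new entry per forged Host value
  }
  return nowFileCache[host];
}

// One client, many forged Host headers: each distinct value adds an entry.
for (let i = 0; i < 1000; i++) {
  getNowFile(
    { headers: { host: `evil${i}.example:80` } },
    (h) => `/* now.js body generated for ${h} */`
  );
}
console.log(Object.keys(nowFileCache).length); // 1000 cached copies
```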
I second prozacgod's comment. Never trust headers.
I went over a few ways to handle this, and all of them end up being esoteric or require extra configuration from the library user. So my thought would be to cache the file data on library startup and just do the filter/process on each request. This is wasteful on CPU, but in a DoS situation CPU is easy to steal, and it's not a long-lasting effect like memory usage. There are no triggers built in to force a cache flush, so caching needs to be kind of jailed from the clients.
A timer for cache flushes would be useful; perhaps a 1-hour timeout on the cached data?
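Something like the following could work; this is just a sketch of the time-to-live idea with a 1-hour default, and `makeTtlCache` and its parameters are illustrative names, not anything from the now.js API:

```javascript
// Minimal TTL-cache sketch: entries expire after ttlMs, so forged-Host
// entries cannot accumulate forever. Illustrative only.
function makeTtlCache(ttlMs = 60 * 60 * 1000, now = Date.now) {
  const entries = {};
  return {
    get(key) {
      const entry = entries[key];
      if (!entry) return undefined;
      if (now() - entry.stamp > ttlMs) { // expired: evict and report a miss
        delete entries[key];
        return undefined;
      }
      return entry.value;
    },
    set(key, value) {
      entries[key] = { value, stamp: now() };
    },
    size() {
      return Object.keys(entries).length;
    },
  };
}

// Usage with a fake clock so the expiry is easy to see:
let t = 0;
const cache = makeTtlCache(1000, () => t);
cache.set('example.com:80', 'now.js body');
console.log(cache.get('example.com:80')); // "now.js body"
t = 2000; // past the TTL
console.log(cache.get('example.com:80')); // undefined; entry was evicted
console.log(cache.size()); // 0
```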
If you have any other input on this, let me know; I'll implement the changes on Monday (CST).
Better yet, does the now.js script, when run client side, have access to things like the domain name and port it was requested from? ... then we could do away with the filter altogether.
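For what it's worth, in a browser the script can read its own origin from window.location (location.host is "hostname:port"). Simulated here with the WHATWG URL API, which exposes the same fields; the URL used is just an example:

```javascript
// In a browser: window.location.host / .hostname / .port give the origin the
// page was served from. The URL class exposes the same fields, shown here
// against an example URL.
const loc = new URL('http://example.com:8080/nowjs/now.js');
console.log(loc.host);     // "example.com:8080"
console.log(loc.hostname); // "example.com"
console.log(loc.port);     // "8080"
```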
Sorry, this got lost in the other noise. It is possible to DoS the server in the current implementation. A time-to-live on the cache would be the best way to prevent a malicious person from using up all the memory.
We're hosting from multiple domain names, and found an issue where the caching assumed that only one hostname/port combo would ever access now.js - fixed it.
Switched nowFileCache from a simple object to an object keyed by [request.headers.host], containing the now.js file for that specific host.