heroku-python / dj-static

DEPRECATED - use WhiteNoise instead!
http://kennethreitz.org/introducing-dj-static/
BSD 2-Clause "Simplified" License

Static requests are slower with dj-static than without #10

Closed cool-RR closed 11 years ago

cool-RR commented 11 years ago

We've been using Django's builtin static serving, and today I gave dj-static a try. (We're on Heroku, for what it's worth.)

It made everything slower. For example, one static request that previously took 85ms, now takes 466ms. Another request that took 322ms, now takes 1,120ms.

Any idea why?

(One suspicion I have is that dj-static isn't emitting the correct cache headers, causing CloudFlare to not cache them correctly. But even so... I don't think that explains a 1,120ms response time for a static file.)
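
One way to check that suspicion is to look at the headers actually being returned for a static asset, once through CloudFlare and once against the raw herokuapp.com host. A minimal sketch, assuming Python 3 and a placeholder URL (substitute a real static file):

```python
from urllib.request import urlopen

# Placeholder URL: point this at an actual static asset, first on the
# CloudFlare-fronted domain and then on the raw *.herokuapp.com host.
url = "https://example.herokuapp.com/static/css/site.css"

with urlopen(url) as resp:
    # CF-Cache-Status only appears when the request goes through CloudFlare.
    for name in ("Cache-Control", "Expires", "Last-Modified", "ETag", "CF-Cache-Status"):
        print(name, "->", resp.headers.get(name))
```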

kcolton commented 11 years ago

FWIW - I would do the testing directly against Heroku for a real apples-to-apples comparison. CloudFlare's caching infrastructure is a giant, complicated black box; I would try to remove it as a variable.

To ensure proper cache control headers, I would test caching separately with a simpler and more predictable upstream cache like Varnish, where I can see all the logs and exactly what's going on.

I was going to run a test like this myself, so interested to see the results.
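
A rough timing harness along those lines, hitting the *.herokuapp.com host directly to keep CloudFlare out of the picture (the hostname and path below are placeholders, not the reporter's actual URLs):

```python
import time
from urllib.request import urlopen

# Placeholder URL: substitute one of the slow static assets, and run this
# once with dj-static enabled and once without it for a direct comparison.
url = "https://example.herokuapp.com/static/js/app.js"

samples = []
for _ in range(20):
    start = time.time()
    with urlopen(url) as resp:
        resp.read()
    samples.append((time.time() - start) * 1000)  # milliseconds

samples.sort()
print("median %.0f ms / p95 %.0f ms" % (samples[len(samples) // 2],
                                        samples[int(len(samples) * 0.95)]))
```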


kennethreitz commented 11 years ago

I've done thousands of requests per second out of a single 'static' process.

pydanny commented 11 years ago

I've done the same as Kenneth. Serving up requests via this method is pretty darn fast.

The real issue is that the only information you are providing is "It's running slow!".

I don't mean to be a jerk, but without being able to see any of the details of your application (Procfile, architecture, files being served, wsgi.py, settings.py, etc.), there is absolutely no way for anyone to do any sort of analysis of your issue.
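
For reference, the wsgi.py side of the setup being asked about usually looks roughly like this, following the dj-static introduction linked above (the settings module name is a placeholder; everything else about the app is unknown):

```python
import os

# Placeholder settings module; the real project name isn't given in this issue.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

from django.core.wsgi import get_wsgi_application
from dj_static import Cling

# Cling wraps the Django WSGI application and serves files from STATIC_ROOT itself.
application = Cling(get_wsgi_application())
```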