Nuanda / smogmapper

Smog Mapper
http://smogmapper.smogathon.pl/

Readings - sensible caching strategy for getting all-sensor json #14

Closed · Nuanda closed 8 years ago

Nuanda commented 8 years ago

When running the PM heatmap animation, producing the data for a single step takes 340-430 ms on the server side (it's not the DB; it's mainly AR and JSON wrapping). We could probably make it somewhat faster.

But the nice side of things is that sensor readings, once recorded, should never change. That means we can cache these JSON responses and serve them to clients at light speed. The caching needs to be smart, though: due to the temporal nature of the data, the same request URL will expect a different response when asked again in, say, 30 minutes. The strategy should take that into account.

Open question to @mkasztelnik and @kammerer - what server-side cache solutions that you know of would suit such a scenario?

mkasztelnik commented 8 years ago

I'm using Redis for plgapp, but it is not perfect. Memcached should be faster, but it is another component which needs to be configured and managed.

The more interesting issue is how to store the entries so that they are reusable over time. Maybe cache every heatmap step with the TTL set to heatmap_interval - (Time.now - step_time), so each entry expires exactly one interval after its own step.
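A minimal sketch of that idea, assuming a Rails cache store is configured (the method and variable names are illustrative placeholders, not actual app code):

```ruby
# Cache one all-sensor JSON per heatmap step. The TTL shrinks as the step
# ages, so each entry expires heatmap_interval after its own step_time.
# render_step_json is a hypothetical helper standing in for the real code.
def cached_step_json(measurement_type, step_time, heatmap_interval)
  ttl = heatmap_interval - (Time.now - step_time)
  # Steps older than one interval are not cached at all.
  return render_step_json(measurement_type, step_time) if ttl <= 0

  Rails.cache.fetch(["heatmap-step", measurement_type, step_time.to_i],
                    expires_in: ttl) do
    render_step_json(measurement_type, step_time)
  end
end
```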

Nuanda commented 8 years ago

We don't need them to be reusable at any arbitrary future moment - just in the "nearest" future (if we allow users to dig deeper into history, we can go without caching for those queries).

For instance, if we plan to show a 1-day heatmap animation, caching 100-120 all-sensor JSONs (per measurement type) is enough - people will only ever request these. So the cache should expire after around 24 hours.
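To make the "same URL, different response over time" part work with such a sliding window, the request time could be quantized to step boundaries, so every request within one step shares one cache entry. A rough sketch, where STEP and WINDOW are assumed values and render_all_sensor_json is a hypothetical helper:

```ruby
# Quantize "now" down to the most recent step boundary so all requests
# within a step map to the same cache key; entries live for one full
# animation window (~24 h -> about 120 entries per measurement type).
STEP   = 12.minutes  # assumed step length (24 h / 120 steps)
WINDOW = 24.hours    # assumed animation window

def current_step_start
  Time.at((Time.now.to_i / STEP.to_i) * STEP.to_i)
end

def cached_all_sensor_json(measurement_type)
  key = ["all-sensors", measurement_type, current_step_start.to_i]
  Rails.cache.fetch(key, expires_in: WINDOW) do
    render_all_sensor_json(measurement_type, current_step_start)
  end
end
```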