mhrivnak / radiotherm

python client library for Radio Thermostat wifi-enabled home thermostats
BSD 3-Clause "New" or "Revised" License

Caching needed #15

Open JerryWorkman opened 8 years ago

JerryWorkman commented 8 years ago

Some sort of caching of the HTTP request data would significantly improve performance.

Each variable evaluation triggers a new HTTP request in radiotherm. Each request takes 5 to 20 seconds on my thermostat. A cache of GET requests would provide much better performance.

tubaman commented 7 years ago

I think caching is explicitly not supported

TD22057 commented 7 years ago

I agree w/ Jerry. The problem is that many home automation systems poll all the data from the thermostat every interval, and they don't know that for this type of thermostat that means multiple URL requests. Making one request is much better than making n requests since the thermostat is so slow to respond. So it would be nice if caching could be turned on as an option, so that repeated requests for the same URL don't have to hit the device. A simple time check should be sufficient, I would think.
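The "simple time check" could look something like this sketch: a small URL-keyed cache that reuses each GET response for a few seconds before hitting the thermostat again. The class and parameter names here are illustrative only, not part of radiotherm.

```python
import time


class TimedCache:
    """Remember each fetched response for ttl seconds (the 'simple time
    check' idea). Illustrative sketch, not radiotherm API."""

    def __init__(self, fetch, ttl=10.0):
        self._fetch = fetch   # callable: url -> response data
        self._ttl = ttl
        self._entries = {}    # url -> (timestamp, data)

    def get(self, url):
        now = time.monotonic()
        hit = self._entries.get(url)
        if hit is not None and now - hit[0] < self._ttl:
            return hit[1]             # still fresh: skip the slow HTTP call
        data = self._fetch(url)       # stale or missing: query the thermostat
        self._entries[url] = (now, data)
        return data
```

With something like this in front of the library's GET path, n reads of the same endpoint within one polling interval collapse into a single slow request.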

tubaman commented 7 years ago

It's nearly as easy to use a pre-existing cache solution like beaker: http://beaker.readthedocs.io/en/latest/caching.html

JerryWorkman commented 7 years ago

Really?

Why depend on another caching solution?

tubaman commented 6 years ago

Because caching could potentially add a lot of complexity. For example, do you want caching in RAM or on disk? Do you want caching across multiple processes? That would probably require a cache daemon like Redis or memcached. Why build in limited caching that might not cover all the use cases when you can use a robust, full-featured cache:

```python
import radiotherm
from beaker.cache import CacheManager

# set up radiotherm and a beaker in-memory cache
tstat = radiotherm.get_thermostat('thermostat')
cache = CacheManager()
tstat_cache = cache.get_cache('tstat', expire=10)

# get the cached temp; the lambda only runs on a cache miss
temp = tstat_cache.get(key='temp', createfunc=lambda: tstat.temp)
```

tubaman commented 6 years ago

Alternatively, if you really want caching built-in, use Python-TStat instead.

craftyguy commented 5 years ago

@tubaman I'm not familiar with beaker... does it handle all of those cases you mentioned automagically?

For example, do you want caching in RAM or on disk? Do you want caching across multiple processes?

If not, then a simple approach, e.g. one where each instantiation of the thermostat class has its own cache, would be trivial to implement and would cover 99% of the use cases I'm aware of, where a user has just one instance of the class communicating with the tstat. Or are there lots of folks hitting the tstat from multiple processes/systems?
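The per-instance idea could be as small as a `__getattr__` proxy around an existing thermostat object, so repeated attribute reads within a short window reuse the last response. This is a sketch; `CachedThermostat` is hypothetical and not part of radiotherm.

```python
import time


class CachedThermostat:
    """Wrap a thermostat instance so repeated attribute reads within
    ttl seconds reuse the last value. Hypothetical helper, not
    radiotherm API."""

    def __init__(self, tstat, ttl=10.0):
        self._tstat = tstat
        self._ttl = ttl
        self._cache = {}  # attribute name -> (timestamp, value)

    def __getattr__(self, name):
        # only called for attributes not set on the wrapper itself
        now = time.monotonic()
        hit = self._cache.get(name)
        if hit is not None and now - hit[0] < self._ttl:
            return hit[1]
        value = getattr(self._tstat, name)  # the slow HTTP round trip
        self._cache[name] = (now, value)
        return value
```

A caller would then do `tstat = CachedThermostat(radiotherm.get_thermostat(host))` and read `tstat.temp` as usual; only one HTTP request per attribute per ttl window reaches the device.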

skimj commented 5 months ago

Passing in the optional model parameter helps significantly. https://github.com/mhrivnak/radiotherm/blob/614a2516b8bdb1d3c1b8cf3c641a9034c5adcf55/radiotherm/__init__.py#L63 Without it, every instantiation is preceded by an extra query to the device to discover the model (which can mean one extra query per POST/GET if the class instance isn't retained). As OP noted, these queries are slow.

In a way, it lets you cache the model information in user space and save up to half the HTTP calls to the unit. It isn't a full caching solution, but it has been good enough for me.