dgilland / cacheout

A caching library for Python
https://cacheout.readthedocs.io
MIT License

Are the default functions single-threaded? #11

Closed: Alan-R closed this issue 4 years ago

Alan-R commented 4 years ago

If you have a cache miss and you want to call the default callable to get the right value for the cache, are the calls to that function (at least for a single key) single-threaded? That is, if I have 100 calls that all hit at once for the same key and it's missing from the cache, is the callable only going to be called once?

This is a specific dimension of thread safety. The functions I want to use are thread-safe, but they're sometimes very expensive (several seconds). I don't want to start up 100 identical calls to replace the one cache value...
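
For illustration, here is a sketch of the scenario I mean, assuming `Cache.get(key, default=...)` accepts a callable default as the docs describe; `expensive_default` is just a stand-in for my real function:

```python
import threading
import time

from cacheout import Cache

cache = Cache(maxsize=256, ttl=60)
call_count = 0
count_lock = threading.Lock()

def expensive_default(key):
    # Stand-in for the slow (several-second), thread-safe function that
    # I don't want to run 100 times for the same missing key.
    global call_count
    with count_lock:
        call_count += 1
    time.sleep(2)
    return f"value-for-{key}"

def worker():
    # Every thread asks for the same key, which starts out missing.
    cache.get("hot-key", default=expensive_default)

threads = [threading.Thread(target=worker) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("default callable was invoked", call_count, "time(s)")
```

The question is whether that final count is 1 or 100.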

Alan-R commented 4 years ago

A related follow-up: if there is a single semaphore (lock) for the whole cache, then the default callable can't examine that cache for other values while it runs. The answer would help me decide how many separate caches I need to create.
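
Concretely, the layout I have in mind is roughly the following, assuming each `Cache` instance has its own lock so a default callable only ever reads from caches other than the one currently locked (names like `load_user` and `load_org` are hypothetical):

```python
from cacheout import Cache

# Hypothetical split into one cache per kind of lookup, so a default
# callable for one cache only reads from *other* caches and never
# re-enters the cache whose lock is currently held.
user_cache = Cache(ttl=300)
org_cache = Cache(ttl=300)

def load_org(key):
    # Stand-in for another slow backend call.
    return {"name": "example org"}

def load_user(user_id):
    # Only touches org_cache; user_cache is the one locked right now,
    # so nothing here depends on re-entering it.
    org = org_cache.get("default-org", default=load_org)
    return {"id": user_id, "org": org["name"]}

print(user_cache.get("alice", default=load_user))
```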

dgilland commented 4 years ago

Cache.get acquires the cache's lock and holds it until it returns a value.

https://github.com/dgilland/cacheout/blob/1225b7c9d22bc374749f5dcd177c55c28227ad46/src/cacheout/cache.py#L222-L223

So 100 threaded calls to a cache instance for a missing key would result in one call to the default callable, while the other 99 calls wait for the first call to finish.
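
A toy sketch of that behavior (not the actual cacheout code) would look like:

```python
import threading

class LockingCache:
    """Toy sketch of the pattern described above: a single cache-wide
    lock is held for the whole get() call, so the default callable for
    a missing key runs exactly once while other callers block."""

    def __init__(self):
        self._lock = threading.RLock()
        self._store = {}

    def get(self, key, default=None):
        with self._lock:
            if key in self._store:
                return self._store[key]
            if callable(default):
                # Computed while the lock is held: concurrent get()
                # calls, for this key or any other, wait here.
                value = default(key)
                self._store[key] = value
                return value
            return default
```

One consequence of the lock being cache-wide is that while the expensive default runs, get() calls for other keys on the same cache instance also wait, which ties back to the earlier question about splitting work across multiple cache instances.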