Open: igor-kremin opened this issue 6 years ago
Hi @igor-kremin, sorry for the late reply on this issue. It's taken a while for me to get the time to respond to this.
I think the issue you've brought up is valid, if I've understood it correctly. I'm not the biggest fan of the while loop, however. How do you feel about using `cache.get_or_set()` instead? I think `get_or_set()` might handle this race condition.
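For context, Django's documented `get_or_set()` does a `get()`, and on a miss calls the default and then attempts an atomic `add()`. The sketch below uses a small thread-safe stand-in for the cache backend (real code would use `django.core.cache.cache`); it illustrates why `get_or_set()` alone does not prevent duplicate execution when two callers miss at the same time.

```python
import threading
import time

# Stand-in for a Django cache backend (real code: django.core.cache.cache).
# get_or_set() here mirrors Django's documented behaviour: get(), and on a
# miss call the default, then try an atomic add().
class StubCache:
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def get(self, key, default=None):
        return self._data.get(key, default)

    def add(self, key, value):
        # Atomic "store only if absent", like Django's cache.add().
        with self._lock:
            if key in self._data:
                return False
            self._data[key] = value
            return True

    def get_or_set(self, key, default):
        val = self.get(key)
        if val is None:
            if callable(default):
                default = default()  # every racing caller runs this
            self.add(key, default)
            return self.get(key, default)
        return val

cache = StubCache()
calls = []

def expensive():
    calls.append(1)      # count how many times the function runs
    time.sleep(0.05)     # pretend this takes a while
    return "result"

# Two callers both miss the cache before either stores a value:
# with get_or_set() each of them still executes expensive().
barrier = threading.Barrier(2)

def worker():
    barrier.wait()       # line both workers up on the cache miss
    cache.get_or_set("key", expensive)

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads: t.start()
for t in threads: t.join()

print(len(calls))        # the expensive work was duplicated
```

The final value in the cache is consistent, but the function body runs once per racing caller, which is exactly the double-execution problem raised below.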
Hi, `get_or_set` will not solve the double execution; I tried it before and ran into problems with it. The while loop waits until the function running in another process finishes and stores its result in the cache. Sometimes it is better to wait 0.1 to 2 seconds than to execute the function again, especially when the function takes a long time.
Realistically I don't think we can merge a change like this, because for some applications waiting 0.1 seconds would not be acceptable, and in my opinion django-cache-utils should not be designed in a way that excludes those use cases.

Is there a way we can solve your use case while avoiding `time.sleep()`? What issue did you run into with `get_or_set()`?
Hello Lucas, sorry for the long delay.

On the first `cache_backend.get()`, the function has not been executed yet (its result does not exist), so what do you put in the cache? I decided to put a `True` value there to avoid duplicate execution of the function. `cache_backend.add()` is atomic, which is important. It keeps the semaphore set to `True` until the function finishes, and then replaces `True` with the function's result.
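The semaphore idea described above can be sketched as a decorator. This is not the project's actual implementation; it is a minimal illustration, with a thread-safe dict standing in for the Django cache backend (real code would use `django.core.cache.cache`), and the key format and poll interval are assumptions.

```python
import functools
import threading
import time

# Stand-in for a Django cache backend (real code: django.core.cache.cache).
class StubCache:
    def __init__(self):
        self._data, self._lock = {}, threading.Lock()
    def get(self, key, default=None):
        return self._data.get(key, default)
    def set(self, key, value):
        self._data[key] = value
    def add(self, key, value):
        # Atomic "store only if absent", like cache.add().
        with self._lock:
            if key in self._data:
                return False
            self._data[key] = value
            return True

cache = StubCache()

def cached(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Illustrative key format, not the library's actual one.
        key = f"{func.__name__}:{args}:{kwargs}"
        value = cache.get(key)
        if value is None:
            # add() succeeds for exactly one caller; that caller
            # becomes the one that executes the function.
            if cache.add(key, True):
                result = func(*args, **kwargs)
                cache.set(key, result)  # replace the True semaphore
                return result
            value = cache.get(key)
        # Every other caller polls until True becomes the real result.
        while value is True:
            time.sleep(0.1)
            value = cache.get(key)
        return value
    return wrapper

calls = []

@cached
def heavy(x):
    calls.append(x)
    time.sleep(0.2)          # pretend this is expensive
    return x * x

results = []
threads = [threading.Thread(target=lambda: results.append(heavy(6)))
           for _ in range(3)]
for t in threads: t.start()
for t in threads: t.join()

print(results, len(calls))   # all three callers got 36; heavy() ran once
```

Two caveats with this scheme: a function that legitimately returns `None` or `True` would confuse the semaphore check, and in real code the `True` marker should carry a timeout so a crashed worker cannot block other callers forever.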
This decorator is usually applied to heavy (long-running) functions, so a 0.1 s delay is minor. I made these changes in my project when I wrote the first message here, and everything has been working well since.
Hello, with the help of a friend I found an issue in how the value is read from the cache. When two processes run one after the other and the function takes some time to return its result, both check the cache; if the value is absent, both `cache.get()` calls return `None`, and both run the function, duplicating the work.

To avoid this behaviour, use a lock key together with `cache.add()`, which returns `True` if the value was stored successfully, `False` if the key is already in the cache, and `0` if the cache backend is not running. `cache.add()` is an atomic function.
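The lock-key idea can be sketched as follows: `cache.add()` on a separate lock key elects exactly one caller to do the work, while the others wait for the result key to appear. A plain dict with a mutex stands in for the Django cache backend (real code would use `django.core.cache.cache`), and the key names are illustrative.

```python
import threading
import time

data, mutex = {}, threading.Lock()

def cache_get(key, default=None):
    return data.get(key, default)

def cache_set(key, value):
    data[key] = value

def cache_add(key, value):
    # Atomic "store only if absent": True if we stored the value,
    # False if the key was already present.
    with mutex:
        if key in data:
            return False
        data[key] = value
        return True

runs = []

def get_report():
    value = cache_get("report")
    while value is None:
        if cache_add("report:lock", True):  # we won the lock
            runs.append(1)
            time.sleep(0.1)                 # the expensive part
            cache_set("report", "ready")
        else:
            time.sleep(0.05)                # someone else is working
        value = cache_get("report")
    return value

threads = [threading.Thread(target=get_report) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()

print(cache_get("report"), len(runs))  # 'ready', and the work ran once
```

In real code the lock key should be stored with a short timeout so that a worker that dies mid-computation does not leave the lock held forever.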
Here is my implementation of the cached decorator: