Closed: weeco closed this issue 2 years ago
I understand the issue. Let me check if this is possible within a single mutex scope.
Sorry for the delay. As far as I can tell, it would require a lot of code duplication to make this possible, which is why I don't like the option much. If you use https://pkg.go.dev/github.com/ReneKroon/ttlcache/v2#Cache.GetByLoader, won't that give you almost-atomic updates? At least only a single fetch will be performed each time a fetch is requested.
Since this is a cache, I'm not really inclined to provide specific atomic support: the fact that it is a cache means there will always be possibilities for stale data and eventual consistency.
Can you check whether GetByLoader works for you? If not, I'd like to understand your use case better.
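For reference, the behavior a loader-based get aims for can be sketched in plain Go: a check-and-fill held under one lock, so the loader runs at most once per missing key even with many concurrent callers. This is an illustrative stdlib sketch of the pattern, not ttlcache's actual implementation; the type and method names here are made up.

```go
package main

import (
	"fmt"
	"sync"
)

// loaderCache deduplicates loads per key: only one fetch runs even
// when many goroutines request the same missing key at once.
// (Hypothetical sketch; not the library's internals.)
type loaderCache struct {
	mu    sync.Mutex
	items map[string]string
}

func newLoaderCache() *loaderCache {
	return &loaderCache{items: make(map[string]string)}
}

// getByLoader returns the cached value, or runs loader exactly once
// to fill the gap. The mutex is held across the check-and-fill, so
// the load-then-store is atomic with respect to other callers.
func (c *loaderCache) getByLoader(key string, loader func(string) string) string {
	c.mu.Lock()
	defer c.mu.Unlock()
	if v, ok := c.items[key]; ok {
		return v
	}
	v := loader(key)
	c.items[key] = v
	return v
}

func main() {
	c := newLoaderCache()
	fetches := 0 // safe: the loader always runs while c.mu is held
	var wg sync.WaitGroup
	for i := 0; i < 10; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			c.getByLoader("answer", func(key string) string {
				fetches++
				return "42"
			})
		}()
	}
	wg.Wait()
	fmt.Println(fetches) // 1: the fetch happened exactly once
}
```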
I have a similar use case where I'd like something along the lines of sync.Map's LoadOrStore(). The goal is to execute something only once, but Get() and Set() being two separate operations means inconsistencies and race conditions. I could put a mutex around ttlcache, but for now I'll just use sync.Map, since I don't care much about its size. I was hoping to use ttlcache so it would handle the TTL eviction, though.
I did a small proof of concept, but it essentially comes down to rebuilding the cache around sync.Map. I'm not planning to do that, but it could be an idea for a fork.
I'd like to update an existing cache item in an atomic fashion. Since there is currently no such method, I tried to work around it by first calling GetWithTTL() and then updating the returned item using SetWithTTL(). Unfortunately, doing this I hit a race condition because another goroutine would access/modify the item between the Get and the Set call. I don't see a workaround because I can't access the mutex.