alastairtree / LazyCache

An easy-to-use, thread-safe in-memory caching service with a simple, developer-friendly API for C#
https://nuget.org/packages/LazyCache
MIT License
1.72k stars 159 forks

Contention on multiple readers & writters #53

Closed juancarrey closed 6 years ago

juancarrey commented 6 years ago

There are two levels of contention:

1) Readers being locked by writers
2) Application-level locking

Readers being locked by writers

Reader -> a client that hits when reading from the cache.
Writer -> a client that misses when reading from the cache and will then set the value.

With multiple reader and writer threads there is a lot of contention here: https://github.com/alastairtree/LazyCache/blob/feat/netcore2/LazyCache/CachingService.cs#L95

Every reader and writer takes the lock even if the item is already in the cache, and we do not need to lock just to read.

Adding an unlocked read attempt at the top would remove a lot of contention when many threads access the same items (which is the point of caching).
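To illustrate the idea, here is a minimal sketch of that "unlocked read first" pattern (double-checked locking). This is not LazyCache's actual code; `DoubleCheckedCache`, its backing dictionary, and the single `Gate` lock are all hypothetical stand-ins for the cache-aside path in `CachingService`:

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical sketch: try a lock-free read first, and only take the
// lock on a cache miss. Readers of already-cached items never block.
public static class DoubleCheckedCache
{
    private static readonly ConcurrentDictionary<string, object> Cache =
        new ConcurrentDictionary<string, object>();
    private static readonly object Gate = new object();

    public static T GetOrAdd<T>(string key, Func<T> factory)
    {
        // Fast path: no lock taken when the item is already cached.
        if (Cache.TryGetValue(key, out var cached))
            return (T)cached;

        lock (Gate)
        {
            // Second check: another writer may have populated the key
            // while we were waiting on the lock.
            if (Cache.TryGetValue(key, out cached))
                return (T)cached;

            var value = factory();
            Cache[key] = value;
            return value;
        }
    }
}
```

With this shape, only threads that actually miss pay for the lock; the common hot-key case is a single lock-free dictionary lookup.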

Application-level locking

Because locking happens at the IAppCache level, we block all readers and writers of the whole app at the same time.

Meaning that if one writer takes 5 seconds to read something from backing storage A, there could be 10 readers waiting to read something else from the cache (which was stored some time ago from storage B).

Changing the locking mechanism from "Per Application" to:

A) Per type -> table locking
B) Per type + key -> row locking

For B) there would need to be a mechanism to dispose of unused locking semaphores.

A) can be achieved by creating one cache service per type being cached, but as it stands, the locking happens at the highest level possible.
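Option B) could be sketched roughly as follows. This is a hypothetical `KeyedLock` helper, not part of LazyCache: each key gets its own `SemaphoreSlim`, reference-counted so the last releaser disposes it, which addresses the cleanup concern above. The bookkeeping lock is only held for a brief dictionary update, while the slow factory work is serialized only against contenders for the *same* key:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Hypothetical per-key ("row") locking sketch with reference counting
// so semaphores for idle keys are disposed rather than leaked.
public sealed class KeyedLock
{
    private sealed class Entry
    {
        public readonly SemaphoreSlim Semaphore = new SemaphoreSlim(1, 1);
        public int RefCount;
    }

    private readonly Dictionary<string, Entry> _locks =
        new Dictionary<string, Entry>();

    public IDisposable Acquire(string key)
    {
        Entry entry;
        lock (_locks) // held only for the dictionary update, never for I/O
        {
            if (!_locks.TryGetValue(key, out entry))
                _locks[key] = entry = new Entry();
            entry.RefCount++;
        }
        entry.Semaphore.Wait(); // blocks only contenders for the SAME key
        return new Releaser(this, key, entry);
    }

    private void Release(string key, Entry entry)
    {
        entry.Semaphore.Release();
        lock (_locks)
        {
            // Last user out removes and disposes the semaphore, so the
            // dictionary does not grow without bound.
            if (--entry.RefCount == 0)
            {
                _locks.Remove(key);
                entry.Semaphore.Dispose();
            }
        }
    }

    private sealed class Releaser : IDisposable
    {
        private readonly KeyedLock _owner;
        private readonly string _key;
        private readonly Entry _entry;

        public Releaser(KeyedLock owner, string key, Entry entry)
        {
            _owner = owner;
            _key = key;
            _entry = entry;
        }

        public void Dispose() => _owner.Release(_key, _entry);
    }
}
```

Usage would look like `using (keyedLock.Acquire(cacheKey)) { /* check cache, fetch, set */ }`, so a 5-second fetch for one key never blocks readers of other keys.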