loong / go-concurrency-exercises

Hands on exercises with real-life examples to study and practice Go concurrency patterns. Test-cases are provided to verify your answers.

2-race-in-cache Test Cache/Page size fail #14

Closed: anargu closed this issue 1 year ago

anargu commented 3 years ago

Hi, I'm trying this solution:

// Get gets the key from cache, loads it from the source if needed
func (k *KeyStoreCache) Get(key string) string {
    //mutex.Lock()
    //defer mutex.Unlock()

    k.rw.RLock()
    e, ok := k.cache[key]
    k.rw.RUnlock()

    if !ok {
        k.rw.Lock()

        // Miss - load from database and save it in cache
        p := page{key, k.load(key)}
        // if cache is full remove the least used item
        if len(k.cache) >= CacheSize {
            end := k.pages.Back()
            // remove from map
            delete(k.cache, end.Value.(page).Key)
            // remove from list
            k.pages.Remove(end)
        }
        k.pages.PushFront(p)
        k.cache[key] = k.pages.Front()

        e = k.pages.Front()
        k.rw.Unlock()
    } else {
        k.rw.Lock()
        k.pages.MoveToFront(e)
        k.rw.Unlock()
    }

    return e.Value.(page).Value
}

But after running the tests, it fails with:

--- FAIL: TestMain (3.75s)
    check_test.go:20: Incorrect cache size 99
    check_test.go:23: Incorrect pages size 181
FAIL
exit status 1
FAIL    github.com/loong/go-concurrency-exercises/2-race-in-cache    5.811s

After looking at that check, I see it verifies that cache and pages always stay at their maximum length (100). But if Get is executed concurrently, adding and removing items, isn't it possible that the pages and cache sizes can't always be exactly 100?

NeteaseWright commented 2 years ago

No, the point of the test is that these two values should end up the same and equal to 100. The reason they differ is that you are adding duplicate pages.
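
For context on the duplicate pages: with the snippet above, two goroutines can both miss on the same key in the window between RUnlock and Lock; each then pushes its own page onto k.pages, while the second map write simply overwrites the first, so pages outgrows cache. One common fix is to re-check the map once the write lock is held. A minimal sketch, assuming the same KeyStoreCache fields as the snippet above (including the added rw sync.RWMutex):

// Get gets the key from cache, loads it from the source if needed.
func (k *KeyStoreCache) Get(key string) string {
	// Fast path: read-only lookup under the read lock.
	k.rw.RLock()
	e, ok := k.cache[key]
	k.rw.RUnlock()

	if ok {
		// Hit: mark as most recently used. MoveToFront is a no-op if the
		// element was evicted between RUnlock and Lock.
		k.rw.Lock()
		k.pages.MoveToFront(e)
		v := e.Value.(page).Value
		k.rw.Unlock()
		return v
	}

	k.rw.Lock()
	defer k.rw.Unlock()

	// Re-check: another goroutine may have inserted the same key while no
	// lock was held. Without this check both goroutines push a page, and
	// pages outgrows cache.
	if e, ok := k.cache[key]; ok {
		k.pages.MoveToFront(e)
		return e.Value.(page).Value
	}

	// Miss - load from the source and save it in cache.
	p := page{key, k.load(key)}
	// If the cache is full, evict the least recently used item.
	if len(k.cache) >= CacheSize {
		end := k.pages.Back()
		delete(k.cache, end.Value.(page).Key)
		k.pages.Remove(end)
	}
	k.pages.PushFront(p)
	k.cache[key] = k.pages.Front()
	return p.Value
}

The re-check makes the check-then-insert sequence effectively atomic with respect to other writers, so each key is inserted exactly once.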

loong commented 1 year ago

Closing as resolved by @NeteaseWright 🙏

Here is a solution without creating duplicate pages. 🤫
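
For anyone reading along, here is a minimal sketch of one way to avoid the duplicates, not necessarily the solution referenced above: guard the whole lookup-or-load path with a single lock. This assumes a mu sync.Mutex field added to the exercise's KeyStoreCache in place of the rw field used earlier.

// Get gets the key from cache, loads it from the source if needed.
func (k *KeyStoreCache) Get(key string) string {
	// One lock around the entire check-then-insert sequence means two
	// goroutines can never both miss on the same key.
	k.mu.Lock()
	defer k.mu.Unlock()

	if e, ok := k.cache[key]; ok {
		// Hit: mark as most recently used.
		k.pages.MoveToFront(e)
		return e.Value.(page).Value
	}

	// Miss - load from the source and save it in cache.
	p := page{key, k.load(key)}
	// If the cache is full, evict the least recently used item.
	if len(k.cache) >= CacheSize {
		end := k.pages.Back()
		delete(k.cache, end.Value.(page).Key)
		k.pages.Remove(end)
	}
	k.pages.PushFront(p)
	k.cache[key] = k.pages.Front()
	return p.Value
}

Because insertions and evictions are serialized, cache and pages always stay the same length and settle at CacheSize (100), which is exactly what the test checks.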