karlseguin / ccache

A golang LRU Cache for high concurrency
MIT License

Fix memory leak on cleanup #79

Closed bubblesupreme closed 1 year ago

bubblesupreme commented 1 year ago

We found a memory leak when the cache cleans up in our service. I also tried recreating the cache instead of calling 'Clear()', but memory leaked in that case too. I found that even if the cache is deleted, cache items still remain in memory. With each 'Clear()' call, the memory overhead grew. I also tracked the number of items using 'runtime.SetFinalizer' and it grew as well. This fix solved the issue for me.

I also think there is a problem here: https://github.com/karlseguin/ccache/blob/master/cache.go#L238 . We must clear the buckets and the list atomically. Otherwise, items can be added in parallel: they make it into the buckets but are immediately removed from the list, and we cannot clear them until the next clearing.

bubblesupreme commented 1 year ago

This is how I count the number of items in memory.

var Counter int64

func newItem[T any](key string, value T, expires int64, track bool) *Item[T] {
    size := int64(1)

    // https://github.com/golang/go/issues/49206
    if sized, ok := (interface{})(value).(Sized); ok {
        size = sized.Size()
    }
    item := &Item[T]{
        key:        key,
        value:      value,
        promotions: 0,
        size:       size,
        expires:    expires,
    }
    if track {
        item.refCount = 1
    }
    // Counter requires "runtime" and "sync/atomic" in the file's imports.
    atomic.AddInt64(&Counter, 1)
    // Decrement when the GC actually frees the item.
    runtime.SetFinalizer(item, func(_ any) {
        atomic.AddInt64(&Counter, -1)
    })
    return item
}

Script to reproduce the issue:

package main

import (
    "fmt"
    "runtime"
    "sync/atomic"
    "time"

    "github.com/karlseguin/ccache/v3"
)

type someStruct struct {
    i int
    s string
}

func main() {
    cache := ccache.New(
        ccache.Configure[someStruct]().MaxSize(300000).ItemsToPrune(500),
    )

    for i := 0; i < 10; i++ {
        fillAndClear(cache)
    }

    time.Sleep(5 * time.Second)

    runtime.GC()
    var statMem runtime.MemStats
    runtime.ReadMemStats(&statMem)
    mem := float64(statMem.HeapAlloc) / (1 << 30)
    fmt.Printf("heap: %f\n", mem)
    fmt.Printf("items: %d\n", atomic.LoadInt64(&ccache.Counter))
    fmt.Printf("cache items: %d\n", cache.ItemCount())
    fmt.Println("=======================================")
}

func fillAndClear(cache *ccache.Cache[someStruct]) {
    for i := 0; i < 1670500; i++ {
        cache.Set(fmt.Sprintf("%d", i), someStruct{
            i: i,
            s: fmt.Sprintf("%d", i),
        }, time.Hour)
    }

    var statMem runtime.MemStats

    runtime.GC()
    runtime.ReadMemStats(&statMem)
    mem := float64(statMem.HeapAlloc) / (1 << 30)
    fmt.Printf("heap: %f\n", mem)
    fmt.Printf("items: %d\n", atomic.LoadInt64(&ccache.Counter))
    fmt.Printf("cache items: %d\n", cache.ItemCount())

    fmt.Println("clear")
    cache.Clear()

    runtime.GC()
    runtime.ReadMemStats(&statMem)
    mem = float64(statMem.HeapAlloc) / (1 << 30)
    fmt.Printf("heap: %f\n", mem)
    fmt.Printf("items: %d\n", atomic.LoadInt64(&ccache.Counter))
    fmt.Printf("cache items: %d\n", cache.ItemCount())
    fmt.Println("=======================================")
}

These are the results without the fix:

heap: 0.151846
items: 300000
cache items: 300000
clear
heap: 0.047103
items: 300000
cache items: 0
=======================================
heap: 0.219982
items: 600000
cache items: 300000
clear
heap: 0.094048
items: 600000
cache items: 0
=======================================
heap: 0.308472
items: 900000
cache items: 300000
clear
heap: 0.140995
items: 900000
cache items: 0

And with the fix:

heap: 0.143366
items: 300000
cache items: 300000
clear
heap: 0.040392
items: 0
cache items: 0
=======================================
heap: 0.135354
items: 300000
cache items: 300000
clear
heap: 0.040402
items: 0
cache items: 0
=======================================
heap: 0.136457
items: 300000
cache items: 300000
clear
heap: 0.040404
items: 0
cache items: 0
=======================================
heap: 0.132072
items: 300000
cache items: 300000
clear
heap: 0.040410
items: 0
cache items: 0
=======================================
heap: 0.135262
items: 300000
cache items: 300000
clear
heap: 0.040411
items: 0
cache items: 0
=======================================
heap: 0.133577
items: 300000
cache items: 300000
clear
heap: 0.040413
items: 0
cache items: 0
bubblesupreme commented 1 year ago

I confirmed the memory leak when clearing and setting run in parallel. An RW mutex is one possible solution. https://github.com/karlseguin/ccache/blob/master/cache.go#L238

karlseguin commented 1 year ago

I think what you're observing here is a limitation of finalizers. From the docs:

If a cyclic structure includes a block with a finalizer, that cycle is not guaranteed to be garbage collected and the finalizer is not guaranteed to run

If I run your code, I see what you're seeing. However, if I remove the added call to runtime.SetFinalizer (or simply revert all changes made to ccache), and run your tests, the memory appears to be fine.

heap: 0.061564
clear
heap: 0.000136
=======================================
heap: 0.061540
clear
heap: 0.000141
=======================================
heap: 0.061512
clear
heap: 0.000141
=======================================
heap: 0.061537
clear
heap: 0.000144
=======================================
heap: 0.061520
clear
heap: 0.000145
=======================================
heap: 0.000143
=======================================
bubblesupreme commented 1 year ago

Yes, runtime.SetFinalizer is behaving strangely, so I will close this PR. I created an issue for the other problem with cleanup.