natefinch / lumberjack

lumberjack is a log rolling package for Go
MIT License

Memory Leak? #63

Open Michael-F-Ellis opened 6 years ago

Michael-F-Ellis commented 6 years ago

I'm quite new to Go (decades of programming in C, Python, and other languages), so please excuse me if this is a dumb question.

I coded the following trivial test and let it run for several days. It writes a timestamp and message count to a log file once every 5 seconds. When first launched, top showed 672K of memory usage. The reported memory usage has increased slowly every day since; as of now, after 5 days of running, it's at 4880K.

Otherwise, the log rotation and compression are happening as expected. The code was built with go1.10.2 darwin/amd64 and is running on OS X El Capitan 10.11.6 (Darwin kernel 15.6.0).

// This is a test of go logging using the lumberjack module
// to handle file rotation.
package main

import (
    "log"
    "time"

    "gopkg.in/natefinch/lumberjack.v2"
)

func main() {
    log.SetOutput(&lumberjack.Logger{
        Filename:   "/Users/mellis/gologging/logging.log",
        MaxSize:    1,    // megabytes
        MaxBackups: 3,
        MaxAge:     7,    // days
        Compress:   true, // disabled by default
    })

    log.Println("Entering main")
    for i := 0; i < 300000; i++ {
        time.Sleep(5 * time.Second)
        log.Println("message ", i)
    }
    log.Println("Exiting main")
}

What am I doing wrong? Thanks!

xiegeo commented 6 years ago

I don't see anything wrong with your code. However, there is a lot going on, including garbage collection and compression, so 4M more memory usage is not out of the ordinary.

If you want to investigate the details, use https://golang.org/pkg/net/http/pprof/. Try with compression off and no sleep to see if memory stabilizes quickly. Then turn compression on; that could be where your memory is going, but it should also stabilize after a few log rotations.
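For reference, here is roughly how pprof could be wired into a test like this; this is a minimal sketch following the net/http/pprof docs, with localhost:6060 being the conventional address choice, not anything this issue's code uses:

// Sketch: the original test with pprof endpoints exposed for profiling.
package main

import (
    "log"
    "net/http"
    _ "net/http/pprof" // registers /debug/pprof handlers on the default mux
    "time"

    "gopkg.in/natefinch/lumberjack.v2"
)

func main() {
    // Serve the pprof endpoints in the background.
    go func() {
        log.Println(http.ListenAndServe("localhost:6060", nil))
    }()

    log.SetOutput(&lumberjack.Logger{
        Filename: "/Users/mellis/gologging/logging.log",
        MaxSize:  1, // megabytes
        Compress: true,
    })

    for i := 0; i < 300000; i++ {
        time.Sleep(5 * time.Second)
        log.Println("message ", i)
    }
}

While it runs, a heap profile can be fetched with go tool pprof http://localhost:6060/debug/pprof/heap to see which allocations are actually live.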

Michael-F-Ellis commented 6 years ago

@xiegeo, thanks, that was helpful. I reran my test with and without compression, and with a 1 ms sleep. The memory consumption did seem to stabilize in both cases after a few hours: somewhere around 5.5M without compression and 6.5M with it.

Thanks also for the reference to pprof. That's going to be a useful tool for me.

natefinch commented 6 years ago

As xiegeo mentioned, Go is a garbage-collected language, which means the application's memory usage can't be directly correlated to specific lines of code: the runtime caches memory and only frees it back to the OS occasionally, etc. But if it keeps going up and up forever, then something is wrong.
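One way to see that distinction directly is to compare the live heap against what the runtime has obtained from the OS. A minimal sketch using the standard runtime package (the 5-second ticker is an arbitrary choice, not from the original test):

// Sketch: periodically log live heap vs. memory obtained from the OS.
package main

import (
    "fmt"
    "runtime"
    "time"
)

func main() {
    for range time.Tick(5 * time.Second) {
        var m runtime.MemStats
        runtime.ReadMemStats(&m)
        // HeapAlloc is bytes of live heap objects; Sys is total
        // memory the runtime has obtained from the OS.
        fmt.Printf("heap=%dK sys=%dK\n", m.HeapAlloc/1024, m.Sys/1024)
    }
}

If HeapAlloc stabilizes while top keeps reporting a larger number, that's the runtime holding on to already-freed memory, not a leak.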

geeksbaek commented 6 years ago

I am having similar issues now. I am using this package on a production http server that logs about 1 GB per day, with log rotation every 100 MB. The total size of the log files is 4.2 GB, and the http server occupies almost 4.2 GB of memory. This is a big issue for me now, and a fundamental solution is needed.

This is my actual code.

log = &logrus.Logger{
    Out: io.MultiWriter(os.Stdout, &lumberjack.Logger{
        Filename:  "log/log.txt",
        MaxSize:   100, // MB
        LocalTime: true,
        MaxAge:    365, // Days
    }),
    Formatter: &prefixed.TextFormatter{
        TimestampFormat: "2006-01-02 15:04:05.000",
        FullTimestamp:   true,
        ForceFormatting: true,
    },
    Hooks: make(logrus.LevelHooks),
    Level: func() logrus.Level {
        if debug {
            return logrus.DebugLevel
        }
        return logrus.InfoLevel
    }(),
}

xiegeo commented 6 years ago

@geeksbaek You need to reproduce the bug and isolate the library first. Just add a main that calls log and run it to reproduce the same problem as on your server. Then you need to simplify your code and remove references to logrus and prefixed before you can say that this might be a problem in lumberjack. Or just use pprof to see where the memory is going.