Geekmeleon closed this issue 6 months ago
The json-file logging driver in Docker has a lot of overhead (it produces very large files for the amount of logging it stores). If you don't need the extra parsing functionality it gives you, I recommend switching to the local logging driver, or, if you have some kind of central logging manager on your system, to a driver that takes better advantage of it, such as syslog or journald.

You could also add log rotation in Docker (for example, max-size 15MB with max-file 3) to solve this. Pretty sure the image itself isn't that verbose, but if you let it run long enough with debug logs, I guess it can grow quite big.
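As a rough sketch of both suggestions, the daemon-wide defaults could be set in /etc/docker/daemon.json. The values below are illustrative (15MB times 3, with the local driver), and they only apply to containers created after the Docker daemon is restarted:

```json
{
  "log-driver": "local",
  "log-opts": {
    "max-size": "15m",
    "max-file": "3"
  }
}
```

Note that existing containers keep the driver and options they were created with, which is why a recreate is needed for the change to take effect.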
I'm closing this issue as resolved; feel free to reach out again if you need help.
If you like this project, please consider giving this repo and the Docker-Hub-Repo a Star.
Have you read the Important information text above?
Current behavior
I noticed that my Ubuntu server was out of space. It had 40+ GB free before Palworld. After doing some research, I found that the JSON log file was 40 GB in size. I have never had to limit a container's log before. This was the log file within the container, not the data location.
Desired behavior
There should be a limit to how big the file gets.
Links to screenshots
To Reproduce
Steps to reproduce the behavior:
1. Log in to the Ubuntu Server console.
2. Run "sudo du -BG /var/lib/docker/containers | sort -nr | head -n 20".
3. Find the largest directory and list its contents with sizes: "sudo du -aBM /var/lib/docker/containers/randomhashdirectory".

Inside, I found the JSON log file that was over 40 GB in size.
Software setup
Hardware setup
Additional context
I had to recreate the container using the log max-file and max-size options (see screenshot).
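For reference, a per-container version of those options can be sketched as a logging section in a compose file. The service and image names below are placeholders, not the project's actual compose configuration:

```yaml
services:
  palworld:
    image: example/palworld-server   # placeholder image name
    logging:
      driver: json-file
      options:
        max-size: 15m     # rotate once the current log file reaches 15 MB
        max-file: "3"     # keep at most 3 rotated files
```

Since log options cannot be changed on a running container, applying this requires recreating the container, as described above.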