theokrammer opened this issue 1 year ago
I'm seeing this issue as well
I'm also seeing a mismatch between the Resources/Advanced/Virtual disk limit setting (I've tried changing it to 8 and 16 GB and restarting) and Docker Desktop's reported VM disk usage, which claims the "avail. of" size is around 33 GB. This is on macOS Ventura 13.4.1, a fresh install with no images/containers/volumes set up, Docker Desktop 4.21.1, engine 24.0.2.
Shouldn't the configured disk limit and the reported availability numbers (circled in red in the screenshot) match, or am I missing something?
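One way to cross-check the number in the status bar is to look at the VM's backing disk file on the host. A minimal sketch, assuming the default Docker Desktop data location on macOS (the path can differ between installs):
~ % # Apparent (maximum) size of the sparse VM disk file:
~ % ls -lh ~/Library/Containers/com.docker.docker/Data/vms/0/data/Docker.raw
~ % # Blocks actually allocated on the host for it:
~ % du -sh ~/Library/Containers/com.docker.docker/Data/vms/0/data/Docker.raw
Since Docker.raw is a sparse file, ls reports the configured maximum while du reports what is really in use, so the two can legitimately differ.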
~ % docker system df
TYPE            TOTAL     ACTIVE    SIZE      RECLAIMABLE
Images          0         0         0B        0B
Containers      0         0         0B        0B
Local Volumes   0         0         0B        0B
Build Cache     0         0         0B        0B
~ % docker version
Client:
 Cloud integration: v1.0.35
 Version:           24.0.2
 API version:       1.43
 Go version:        go1.20.4
 Git commit:        cb74dfc
 Built:             Thu May 25 21:51:16 2023
 OS/Arch:           darwin/arm64
 Context:           desktop-linux

Server: Docker Desktop 4.21.1 (114176)
 Engine:
  Version:          24.0.2
  API version:      1.43 (minimum version 1.12)
  Go version:       go1.20.4
  Git commit:       659604f
  Built:            Thu May 25 21:50:59 2023
  OS/Arch:          linux/arm64
  Experimental:     false
 containerd:
  Version:          1.6.21
  GitCommit:        3dce8eb055cbb6872793272b4f20ed16117344f8
 runc:
  Version:          1.1.7
  GitCommit:        v1.1.7-0-g860f061
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0
~ % docker info
Client:
 Version:    24.0.2
 Context:    desktop-linux
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.11.0
    Path:     /Users/xxx/.docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.19.1
    Path:     /Users/xxx/.docker/cli-plugins/docker-compose
  dev: Docker Dev Environments (Docker Inc.)
    Version:  v0.1.0
    Path:     /Users/xxx/.docker/cli-plugins/docker-dev
  extension: Manages Docker extensions (Docker Inc.)
    Version:  v0.2.20
    Path:     /Users/xxx/.docker/cli-plugins/docker-extension
  init: Creates Docker-related starter files for your project (Docker Inc.)
    Version:  v0.1.0-beta.6
    Path:     /Users/xxx/.docker/cli-plugins/docker-init
  sbom: View the packaged-based Software Bill Of Materials (SBOM) for an image (Anchore Inc.)
    Version:  0.6.0
    Path:     /Users/xxx/.docker/cli-plugins/docker-sbom
  scan: Docker Scan (Docker Inc.)
    Version:  v0.26.0
    Path:     /Users/xxx/.docker/cli-plugins/docker-scan
  scout: Command line tool for Docker Scout (Docker Inc.)
    Version:  0.16.1
    Path:     /Users/xxx/.docker/cli-plugins/docker-scout

Server:
 Containers: 0
  Running: 0
  Paused: 0
  Stopped: 0
 Images: 0
 Server Version: 24.0.2
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: cgroupfs
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: 3dce8eb055cbb6872793272b4f20ed16117344f8
 runc version: v1.1.7-0-g860f061
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 5.15.49-linuxkit-pr
 Operating System: Docker Desktop
 OSType: linux
 Architecture: aarch64
 CPUs: 4
 Total Memory: 7.668GiB
 Name: docker-desktop
 ID: 310621ca-fd17-4cd9-93f3-44c85983812f
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 HTTP Proxy: http.docker.internal:3128
 HTTPS Proxy: http.docker.internal:3128
 No Proxy: hubproxy.docker.internal
 Experimental: false
 Insecure Registries:
  hubproxy.docker.internal:5555
  127.0.0.0/8
 Live Restore Enabled: false
Same issue.
One workaround is to decrease the Disk Allocated (Virtual disk limit) value in Docker Settings and then change it back. Note: this will delete all container data, images, volumes, cache, etc.
That worked for me. Before I applied the workaround, I wasn't able to update Docker Desktop using the built-in self-update feature; the update was stuck in the downloading phase. After applying the workaround, it became unstuck.
Same issue here.
Same issue.
One workaround is to decrease the Disk Allocated (Virtual disk limit) value in Docker Settings and then change it back. Note: this will delete all container data, images, volumes, cache, etc.
Can you please explain whether this workaround actually makes the reported VM disk usage follow the configured virtual disk limit? I set it to 8 GB and the bottom bar still reported the same 30/33 GB; I changed it to 16 GB and saw no change. Only when I increased it to 40 GB was the change reflected in the bottom bar. Changing it back to 8 GB afterwards did not change the reported disk usage either. This suggests either a bug, or that the minimum disk size is fixed somewhere at around 33 GB.
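As a sanity check after changing the limit, the size the Linux VM actually ended up with can be read from inside a container, since a container's root filesystem lives on the VM disk. A quick sketch, assuming any small image such as alpine can be pulled:
~ % # Size/Used/Available here reflect the VM filesystem backing /var/lib/docker:
~ % docker run --rm alpine df -h /
If the Size column doesn't move when the virtual disk limit is changed, the resize really didn't take effect, as opposed to the status bar merely lagging behind.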
Same issue: failing to create new images or launch containers due to no disk space.
I could recover most of the space by deleting the build cache. These are the steps I took:
Docker Desktop shows 0 available space on a 62 GB virtual disk. After removing all containers and images, it reads "Disk 5.72 GB avail. of 62.67 GB",
whereas the command line shows:
> docker system df
TYPE            TOTAL     ACTIVE    SIZE      RECLAIMABLE
Images          0         0         0B        0B
Containers      0         0         0B        0B
Local Volumes   0         0         0B        0B
Build Cache     866       0         17.69GB   17.69GB
- Restarting Docker didn't help.
- Updating to the latest version of Docker Desktop (4.26.0) didn't help.
- Performing a docker system prune to clear everything, including the cache, completed successfully with "Total reclaimed space: 17.69GB". Docker Desktop now shows "Disk 55.76 GB avail. of 62.67 GB" (exact commands below).
- Presumably, docker buildx prune would also have worked to clear just the cache, but I didn't test it since I had already removed all images and containers.
That leaves about 7 GB unaccounted for, and I have to recreate all images and containers. Not ideal, but at least I can work with Docker again.
Edit to add: I'm using Docker during CDK deployments to AWS, in case that has something to do with these mysteriously ballooning caches. Most of my images are related to CDK assets.
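For reference, a sketch of the prune commands discussed above; the flags are standard Docker CLI, but double-check with --help on your version, since aggressive pruning is destructive:
> # Remove stopped containers, unused networks, dangling images, and dangling build cache:
> docker system prune
> # More aggressive: also remove all unused images and anonymous volumes:
> docker system prune -a --volumes
> # Clear only the BuildKit build cache (--all clears it fully, not just unused entries):
> docker buildx prune --all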
Same issue here, any ideas how to fix it? Changing the virtual disk limit doesn't help.
Same issue.
Resizing the disk limit to 8 GB and then back to 64 GB resulted in this, which is already better!
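For anyone scripting this resize workaround: the limit is persisted in Docker Desktop's settings file. A hedged sketch, assuming the usual macOS location of that file (the path and key names are internal and version-dependent):
~ % # Look for the disk-size key in Docker Desktop's settings:
~ % grep -i disk ~/Library/Group\ Containers/group.com.docker/settings.json
Changing the value there and restarting Docker Desktop should be equivalent to moving the slider, but treat this as unofficial.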
Description
The VM disk usage reported in Docker Desktop (180 GB) is much bigger than what docker system df reports (~30 GB). This causes errors in containers complaining about insufficient disk space, e.g. nginx:
2023/06/01 14:19:09 [alert] 29#29: *44 write() to "/etc/nginx/stdout" failed (28: No space left on device) while logging request, client: 172.20.0.9, server: localhost, request: "GET /feed HTTP/1.1", upstream: "fastcgi://unix:/run/php/php8.1-fpm.sock", host: "front.bnn.local" Unrecognized input header: 99
After a restart, the engine is no longer able to start, regardless of whether experimental features are enabled.
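Worth ruling out while chasing the "No space left on device" error: ENOSPC can also come from inode exhaustion rather than from running out of bytes. A minimal check, assuming an image whose df supports -i (alpine's busybox build typically does):
~ % # IFree is the number of inodes left on the VM's root filesystem:
~ % docker run --rm alpine df -i /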
Reproduce
No guarantee that this reproduces it; I hope the diagnostics are of help. After a reinstall, I simply have to work with it for a few weeks until I have to increase the VM size again.
Expected behavior
Docker Desktop should be able to start the engine when enough space is available, and it should not use over 150 GB of overhead when the images, build cache, etc. need less than 30 GB. Additionally, the space the VM requires should not grow as quickly as it does.
docker version
docker info
Diagnostics ID
76B67562-1517-4F48-930A-3529B3B62A83/20230602074017
Additional Info
I have had this issue a few times already, but was not able to capture anything and could only solve it temporarily by increasing the amount of disk space Docker is able to use.
I am appending the diagnostics output from when the app had ~30 GB left of the 180 GB, even though docker system df did not report anything different, as I have not really downloaded or built new images in that time. This was about 1 week ago, so the size this VM uses increases very quickly. docker logs 25.05.2023- 36.22 free of 183.zip