chrmarti opened 1 month ago
@maro-otto Could you share the output of `docker info --format '{{json .}}'` when you have a GPU installed? I think we might additionally have to check what the default runtime is.
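Something like the following should show both pieces of information (a minimal sketch, assuming the Docker CLI is on the PATH; both `Runtimes` and `DefaultRuntime` are fields of the `docker info` output):

```sh
# List the registered runtimes (should include "nvidia" when the
# NVIDIA Container Toolkit is set up) and the default runtime.
docker info --format '{{json .Runtimes}}'
docker info --format '{{.DefaultRuntime}}'
```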
@chrmarti
`docker info --format '{{json .}}'` gives me (no GPU attached):
`{nvidia-container-runtime [] }`
@maro-otto This looks like the output from `docker info -f '{{.Runtimes.nvidia}}'`; could you also run `docker info --format '{{json .}}'` with the GPU present?
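For reference, the first command prints only the Go struct for the `nvidia` runtime entry, while `{{json .}}` dumps the full `docker info` data as JSON; running both side by side (a sketch, using the outputs already quoted in this thread as examples) makes the difference visible:

```sh
# Prints just the nvidia runtime entry as a Go struct,
# e.g. {nvidia-container-runtime [] <nil>}
docker info -f '{{.Runtimes.nvidia}}'

# Prints everything docker info reports as JSON,
# including the Runtimes map and DefaultRuntime.
docker info --format '{{json .}}'
```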
@chrmarti Sorry for the late reply. With the GPU attached I get a similar result:
`docker info -f {{.Runtimes.nvidia}}`
`{nvidia-container-runtime [] <nil>}`
Additionally, `nvidia-smi` gives me:

```
Wed Oct  9 06:39:57 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.90.07              Driver Version: 550.90.07      CUDA Version: 12.4     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  Tesla T4                       On  |   00000000:00:04.0 Off |                    0 |
| N/A   38C    P8              9W /  70W  |      1MiB /  15360MiB  |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+

+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI        PID   Type   Process name                              GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|  No running processes found                                                             |
+-----------------------------------------------------------------------------------------+
```
Originally posted by @maro-otto in #9385