Closed jmtatsch closed 2 months ago
The nvidia/cuda packages are installed via pip, so the CUDA base image isn't required, but exposing specific GPUs to the container is. For example:
$ docker run --gpus=all --rm ghcr.io/matatonic/openedai-speech python -c "import torch; print(torch.cuda.is_available())"
True
$ docker run --rm ghcr.io/matatonic/openedai-speech python -c "import torch; print(torch.cuda.is_available())"
False
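For anyone running this via Compose instead of `docker run`, the equivalent of `--gpus=all` can be sketched as a `docker-compose.yml` fragment; the service name is illustrative, and the image is taken from the commands above:

```yaml
# Hypothetical compose fragment; the GPU reservation follows the standard
# Compose device-request syntax for the NVIDIA runtime.
services:
  openedai-speech:
    image: ghcr.io/matatonic/openedai-speech
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all        # or a specific count / device_ids list
              capabilities: [gpu]
```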
FROM python:3.11-slim in the non-minimal Dockerfile is not enough to get CUDA support. Consider pytorch/pytorch:2.3.0-cuda12.1-cudnn8-runtime instead, since TTS requires torch>2.1. Yes, that will make the image huge (+3.5 GB), but as far as I know there are no slim CUDA images.
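A minimal sketch of what swapping the base image might look like, assuming the project installs from a requirements.txt and that speech.py is the entry point (both illustrative, not confirmed from the repo):

```dockerfile
# Sketch only: base image from the suggestion above; file names are assumptions.
FROM pytorch/pytorch:2.3.0-cuda12.1-cudnn8-runtime

WORKDIR /app
COPY requirements.txt .
# torch is already provided by the base image; pip skips reinstalling it
# when the pinned version matches.
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

CMD ["python", "speech.py"]
```

The container would still need to be started with `--gpus=...` (or the Compose equivalent) for `torch.cuda.is_available()` to return True.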