rejuce opened this issue 6 months ago
Interesting, I have not looked at the docker/cuda/windows combo, apologies for that. I'll try to spend some time this week poking at it.
The default way of doing this seems to be to derive the image from nvidia/cuda:base. My WSL / Docker environment supports CUDA, as I have other containers running that can use it (e.g. JupyterLab), but the base image epub2tts derives from does not include the CUDA stack.
In addition to updating the Dockerfile to use an image with native GPU support, anyone wanting to run the container with CUDA will need to install the NVIDIA Container Toolkit; otherwise the container starts with:
WARNING: The NVIDIA Driver was not detected. GPU functionality will not be available.
Use the NVIDIA Container Toolkit to start this container with GPU support; see
https://docs.nvidia.com/datacenter/cloud-native/ .
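For reference, on Debian/Ubuntu hosts (including WSL2 Ubuntu) the toolkit install is roughly the following. The repository setup step varies by distro and is covered in the NVIDIA docs linked above, so treat this as a sketch rather than a definitive recipe:

```shell
# Install the NVIDIA Container Toolkit (assumes the NVIDIA apt repo
# has already been added per the docs linked above)
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit

# Wire the toolkit into the Docker runtime and restart the daemon
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```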
Need to determine whether the image with CUDA support will run fine in an environment without CUDA, or whether we need to publish two images: one with GPU/CUDA support and one without.
There is now a cuda12 container "docker pull ghcr.io/aedocw/epub2tts:release-cuda12", though I tested with the following and it does not detect the GPU:
alias cudaepub='docker run --gpus=all -e COQUI_TOS_AGREED=1 -v "$PWD:$PWD" -v ~/.local/share/tts:/root/.local/share/tts -w "$PWD" ghcr.io/aedocw/epub2tts:release-cuda12'
cudaepub testing.txt --engine xtts --speaker "Damien Black"
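One way to narrow this down (assuming the NVIDIA Container Toolkit is installed on the host): run `nvidia-smi` through the same image and flags. If the usual driver/GPU table prints, `--gpus=all` and the driver passthrough are working and the problem is inside the image rather than in Docker/WSL:

```shell
# Sanity check: can this image see the GPU at all?
# If nvidia-smi prints its table, the host-side setup is fine.
docker run --rm --gpus=all ghcr.io/aedocw/epub2tts:release-cuda12 nvidia-smi
```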
I will poke around and see what else I need to install, though at this point I think that container should work properly and support CUDA GPU.
I have CUDA and my WSL environment recognises it, but I suppose the Docker container from this project does not yet support CUDA?
(There are no CUDA toolkit commands available in the container, and epub2tts does not run with CUDA.)
I have now entered the container via bash and am running /opt/epub2tts.py directly.
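A quick check from that bash session, assuming the image ships python3 and that Coqui TTS has pulled in PyTorch (an assumption on my part), is to ask torch directly whether it can see a device:

```shell
# Guarded so it also reports cleanly if torch is missing from the image.
python3 - <<'PY'
try:
    import torch  # assumption: installed as a Coqui TTS dependency
    print("CUDA available:", torch.cuda.is_available())
except ImportError:
    print("torch not installed")
PY
```

If this prints `CUDA available: False` while `nvidia-smi` works, the torch build in the image is likely CPU-only.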