Hi!
I am trying to use PEPPER for variant calling with Docker (with GPU). I already have Docker and CUDA 12 installed. I installed PyTorch inside a conda environment: when I run import torch inside this environment it works, and torch.cuda.is_available() returns True. However, when I launch pepper-deepvariant from the same environment I get the following error: ERROR: TORCH IS NOT BUILT WITH CUDA.
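For reference, this is roughly the check I run inside the conda environment (nothing here is specific to my setup, it is just the standard PyTorch CUDA check):

```python
# Quick check inside the conda environment: PyTorch itself sees the GPU here.
import torch

print(torch.__version__)          # the PyTorch build installed in the conda env
print(torch.version.cuda)         # CUDA version this PyTorch build was compiled against
print(torch.cuda.is_available())  # prints True in this environment
```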
Is this because Docker is not able to read from the conda libraries? I also tried installing the CUDA Toolkit and restarting Docker, but I still get the same error.
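In case it helps, this is the kind of sanity check I can run to see whether the container itself sees the GPU (the image name below is just a placeholder for the PEPPER GPU image I pull, and it assumes the image ships nvidia-smi):

```sh
# Check whether the GPU is visible from inside the Docker container.
# <pepper_image> is a placeholder for the actual PEPPER GPU image tag.
docker run --rm --gpus all <pepper_image> nvidia-smi
```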
Thank you, I hope you can help with this.
Marta