M2ATrail opened this issue 2 months ago
I did solve it by copying all the DLL files from the archive (cuBLAS.and.cuDNN_CUDA11_win_v2.7z) into the C:\Windows\System32 folder.
@YaserHabib Thanks for the reply, but the files are already in my PATH, so they should be accessible. Before I copied them it would throw an error about not finding the files, so it can certainly see them. Torch commands also say CUDA is available.
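One way to check what the dynamic loader can actually resolve (as opposed to what merely sits in a folder on PATH) is `ctypes.util.find_library`, which consults the loader's own search mechanism. A minimal sketch; the library names below are illustrative guesses, so adjust them to the versions on your system:

```python
import ctypes.util

def probe(libname):
    """Return the soname/path the dynamic loader resolves for libname, or None.

    On Linux this consults ld.so's cache; on Windows it searches PATH.
    A None result means a dlopen/LoadLibrary of that name would also fail.
    """
    return ctypes.util.find_library(libname)

# Hypothetical names -- cuDNN/cuBLAS components faster-whisper may dlopen.
for name in ("cudnn_ops_infer", "cudnn", "cublas"):
    print(name, "->", probe(name))
```

If a name prints `None` here, the process will not find it either, regardless of where the file was copied.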
First simple example:
from faster_whisper import WhisperModel
model_size = "large-v3"
model = WhisperModel(model_size, device="cuda", compute_type="float16")
Gives me an error:
Could not load library libcudnn_ops_infer.so.8. Error: libcudnn_ops_infer.so.8: cannot open shared object file: No such file or directory zsh: IOT instruction (core dumped) python test1.py
I have cuDNN 9.2.1.18-1 on Manjaro Linux. As I understand it, it wants version 8. When I try to install version 8 with the Manjaro package manager, it warns me that it will uninstall CUDA 12 and replace it with CUDA 11, which I certainly do not want to do.
If there is something I'm doing wrong, please advise. Otherwise it looks like faster-whisper is not for my server.
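A workaround sketch that avoids downgrading the system packages: install the cuDNN 8 build from PyPI (e.g. `pip install "nvidia-cudnn-cu12<9"`, which bundles `libcudnn_ops_infer.so.8` inside the wheel) and preload its libraries by absolute path before constructing the model. Loading by absolute path registers the soname, so a later dlopen of `libcudnn_ops_infer.so.8` resolves in-process without touching the system cuDNN 9. This is an assumption-laden sketch, not an officially documented fix:

```python
import ctypes
import glob
import importlib.util
import os

def preload_cudnn8():
    """Preload cuDNN 8 libs shipped in the nvidia-cudnn-cu12 wheel, if present.

    Returns the list of library paths that were loaded (empty if the
    wheel is not installed or contains no .so.8 files).
    """
    loaded = []
    try:
        spec = importlib.util.find_spec("nvidia.cudnn")
    except ModuleNotFoundError:
        return loaded  # wheel not installed
    if spec is None or not spec.submodule_search_locations:
        return loaded
    libdir = os.path.join(list(spec.submodule_search_locations)[0], "lib")
    for so in sorted(glob.glob(os.path.join(libdir, "libcudnn*.so.8"))):
        # RTLD_GLOBAL makes the symbols visible to libraries loaded later.
        ctypes.CDLL(so, mode=ctypes.RTLD_GLOBAL)
        loaded.append(so)
    return loaded

# Call this before WhisperModel(..., device="cuda") is constructed.
preload_cudnn8()
```

Alternatively, pointing LD_LIBRARY_PATH at that wheel's `lib` directory before starting Python achieves the same thing from the shell.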
Same issue using the torch 2.4.0 image, which uses cuDNN 9. How can I make faster-whisper compatible with cuDNN 9? I tried installing from source and it is still looking for cuDNN 8.
So I've been trying to use whisperX but can't get it to work, so I decided to test with faster-whisper, since whisperX is built on it. I still can't get it working; it only uses the CPU. I've also downloaded the CUDA 11 and 12 files from https://github.com/Purfview/whisper-standalone-win/releases/tag/libs and pasted them into the bin folder.
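Before concluding it's a faster-whisper problem, it may be worth confirming the process can see a GPU at all; when the driver isn't visible, a silent fall-back to CPU is exactly what you'd observe. A small stdlib-only check (`nvidia-smi -L` lists the detected devices):

```python
import shutil
import subprocess

def gpu_visible():
    """Return True if nvidia-smi is on PATH and lists at least one GPU."""
    exe = shutil.which("nvidia-smi")
    if exe is None:
        # Driver tools not on PATH: CUDA init will fail and CPU is used.
        return False
    result = subprocess.run([exe, "-L"], capture_output=True, text=True)
    return result.returncode == 0 and "GPU" in result.stdout

print("GPU visible:", gpu_visible())
```

If this prints `False`, the missing piece is the driver/PATH setup, not the cuBLAS/cuDNN DLLs pasted into bin.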