Closed keith555 closed 1 month ago
Compare transcription speed with --device cpu.
I don't understand your previous comment. It runs much faster on the GPU. When I transcribed the file with Whisper Desktop which uses the GPU, it finished in under 10 minutes. When I ran it using this version, which runs on the CPU, it took almost an hour.
I meant: compare the transcription speed of Standalone Faster-Whisper with --device cpu vs --device cuda.
I'm not interested in Whisper Desktop, whatever that is.
- cuda: 110 seconds
- cpu: killed it after 30 minutes, and it had transcribed three lines
- Whisper Desktop: 48 seconds
Obviously it runs on the GPU.
I installed it automatically via Subtitle Edit. I noticed that transcriptions were running on the CPU, not the GPU. I ran it from the command line with the same results, and I've tried several different models. The output says that it is "running on CUDA", but GPU usage remains minimal. I tried explicitly setting CUDA with --device cuda.
I also tried --device cuda:1 in case there was some problem with the integrated graphics card, but that failed with "invalid device ordinal". If I transcribe the same file using Whisper Desktop, it runs on the GPU. The GPU is an RTX 4080 in a Gigabyte Aorus 17H. Using Whisper Desktop: