Open · pprobst opened this issue 2 months ago
Hey @pprobst, thank you for creating an issue on this and referencing it. It works. I have one question: do you know how I could build for Node so that it uses the GPU (CUDA) instead of the CPU?
Other problems I found:

In the example `index.js` file, the source of `whisper-node` is wrong. I changed `../../build/Release/whisper-addon` to `../../build/bin/Release/whisper-addon.node`. I also had to download the model, but that is fine.

When I ran the example, I got the following error:

```
node:internal/util:375
    ReflectApply(original, this, args);
    ^
TypeError: A boolean was expected
```

I looked into `addon.cpp`, and I think `no_timestamps` is also needed in the params. I added it and the example worked.
> Hey @pprobst, thank you for creating an issue on this and referencing it. It works. I have one question: do you know how I could build for Node so that it uses the GPU (CUDA) instead of the CPU?
Hmm, to be fair, in my use case I have only been using the node addon with the CPU, and I'm not very experienced with Node myself. Do you mean that it's not compiling for GPU?
> Other problems I found: In the example `index.js` file, the source of `whisper-node` is wrong. I changed `../../build/Release/whisper-addon` to `../../build/bin/Release/whisper-addon.node`. I also had to download the model, but that is fine. When I ran the example, I got the following error:
>
> ```
> node:internal/util:375 ReflectApply(original, this, args); ^ TypeError: A boolean was expected
> ```
>
> I looked into `addon.cpp`, and I think `no_timestamps` is also needed in the params. I added it and the example worked.
Correct, `no_timestamps` should be passed as a parameter. In fact, I added the `no_timestamps` option recently in a PR 😅, so the existing example is outdated. I think it's important enough to be a parameter, since there's quite a big difference in WER between using timestamps and not using them (see https://github.com/ggerganov/whisper.cpp/issues/1724#issuecomment-1880142000).
Hey @pprobst, I meant using the GPU and CUDA; I am not sure how that works. What should I do to speed up the process? My goal is to produce text as fast as possible for recordings a few seconds long. I am trying to record air traffic control communication and turn it into text. I have tried the `smart-whisper` npm library with the `base.en` model, but it is slow and messages fall behind. It is good in the sense that it loads the model only once, so I don't waste time on that, but the quality is also pretty inaccurate. I run Windows with an RTX 4090 and an AMD Ryzen 9 7950X processor.
You can set `use_gpu` to `true`, but it's supposed to be `true` by default, so you're already using the GPU if you have CUDA. Check your outputs and GPU usage to see if it's actually using your GPU.
EDIT: made a PR #2115.
> Other problems I found: In the example `index.js` file, the source of `whisper-node` is wrong. I changed `../../build/Release/whisper-addon` to `../../build/bin/Release/whisper-addon.node`. I also had to download the model, but that is fine. When I ran the example, I got the following error:
>
> ```
> node:internal/util:375 ReflectApply(original, this, args); ^ TypeError: A boolean was expected
> ```
>
> I looked into `addon.cpp`, and I think `no_timestamps` is also needed in the params. I added it and the example worked.
The path is still incorrect in the most recent commit and still needs to be fixed. When I changed it to `../../build/bin/Release/addon.node` as suggested, it worked.
I cannot replicate this error here. For reference, what I do is:

1. In `examples/addon.node`, I run `npm install`.
2. `npx cmake-js compile -T addon.node -B Release`.
3. `cd examples/addon.node` again and run `node index.js`.
Hello there.

When I compile `addon.node` in examples, running the following line after `npm install`:

Yields:
However, this can be fixed by simply changing https://github.com/ggerganov/whisper.cpp/blob/858452d58dba3acdc3431c9bced2bb8cfd9bf418/examples/CMakeLists.txt#L81 to

```cmake
set_target_properties(whisper-addon PROPERTIES FOLDER "examples")
```
However, I suppose a cleaner solution would be to replace the occurrences of `whisper-addon` with `addon.node`, to maintain the intended naming scheme. @ggerganov Let me know your preference and I can shoot a PR if needed.
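If the rename route is preferred, one hedged sketch of what it could look like in `examples/CMakeLists.txt` is below; `OUTPUT_NAME` and `SUFFIX` are standard CMake target properties, but whether cmake-js would also need its `-T` target renamed is an open question:

```cmake
# Keep the target name CMake-friendly, but emit the built file as
# addon.node, matching the name the example's index.js expects.
set_target_properties(whisper-addon PROPERTIES
    FOLDER "examples"
    OUTPUT_NAME "addon"
    SUFFIX ".node")
```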