**Open** — HUAZEC opened 9 months ago
As indicated in the "Use external whisper shared library" section of the README, you can use your own whisper build. I don't know whether the different acceleration methods work, as I haven't tested them, and I don't think I'll have time to do so in the near future.
Now that the project uses the whisper.cpp CMake build, it should be easy to add built-in support for some of them, but as I said, I have neither the need nor the time, so it's not something I will work on right now. Also, the aim of the library was to publish a library on Maven that runs whisper out of the box on the supported platforms; it was never intended to include GPU acceleration support.
Tested with whisper.cpp 1.6.2 under Linux Debian 12, CUDA version 12.2.
Clone the whisper.cpp repository, then compile it with the following command:

```shell
WHISPER_CUDA=1 make libwhisper.so
```
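The clone-and-build steps above can be sketched as a full shell session. The repository URL is the upstream whisper.cpp project, and the `v1.6.2` tag matches the version reported working above; adjust both for your setup, and note that this assumes the CUDA toolkit is already installed:

```shell
# Build whisper.cpp as a shared library with CUDA support.
# Assumes the CUDA toolkit (nvcc) is installed and on PATH.
git clone https://github.com/ggerganov/whisper.cpp.git
cd whisper.cpp
git checkout v1.6.2   # version reported working above
WHISPER_CUDA=1 make libwhisper.so
```

The resulting `libwhisper.so` in the repository root is the file you point the library at in the next step.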
Then pass the path to the compiled library during initialization:

```kotlin
val loadOptions = LoadOptions()
loadOptions.whisperLib = Paths.get("/path/to/your/libwhisper.so")
WhisperJNI.loadLibrary(loadOptions)
```
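A wrong path typically only surfaces later as a cryptic native-load error, so it can help to validate the file up front. The `resolveWhisperLib` helper below is a hypothetical sketch, not part of whisper-jni:

```kotlin
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.Paths

// Hypothetical helper (not part of whisper-jni): fail early with a
// clear message when the external libwhisper.so path is wrong,
// instead of a harder-to-diagnose native link error later.
fun resolveWhisperLib(path: String): Path {
    val lib = Paths.get(path)
    require(Files.isRegularFile(lib)) { "libwhisper.so not found at $lib" }
    return lib
}
```

It would slot into the snippet above as `loadOptions.whisperLib = resolveWhisperLib("/path/to/your/libwhisper.so")`.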
Can I use the GPU now?