Open momobobe opened 1 year ago
Thanks @momobobe for your suggestion.
I also thought of packaging the app, but the problem is that it is built on other models, so you would need to bundle all the dependencies, like PyTorch, which results in a huge installer file and may trigger antivirus software.
Turning it into an Electron or Tauri app unfortunately takes more than a little effort! PRs are welcome if you want to try!
but the problem is that it is built on other models, so you would need to bundle all the dependencies, like PyTorch
Yeah, but that's exactly the problem for most ordinary end users: installing PyTorch is more than challenging if you've never done it before.
Turning it into an Electron or Tauri app unfortunately takes more than a little effort!
Well, I get it. Maybe we need help from some more experienced contributors.
Yes, each approach has its own disadvantages. I will leave the issue open in case someone has a better idea.
Thanks @momobobe again.
One more post I just found: https://www.reddit.com/r/electronjs/comments/12rsdn5/shipping_large_ml_models_with_electron_a/
Thanks @momobobe for the resources. Yes, ONNX seems to be the way to go if you want to ship ML models to consumer devices. However, the pre-trained models need to be converted to the ONNX format, which is not straightforward. Some people have tried to convert the original Whisper, but what about the other variants, such as whisper.cpp or faster-whisper?
Electron also seems like an option; I will take a look at the post.
Containerize it. This is practically begging for it.
This is a perfect use case for containerization.
Docker was made for situations like this.
Yes, I agree @ninjamonkey198206, I really wanted to dockerize it, I just haven't found the time for it yet. I hope I will do it soon. Thanks!
I've been having issues pulling subtitles for things so when I found this I was super excited.
Looking forward to it getting fleshed out and giving it a shot.
Here you go @ninjamonkey198206, you can use Docker to run the web UI. Keep in mind, though, that the resulting image is relatively large (around 8 GB) because of PyTorch and the other dependencies.
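For reference, a Dockerfile for this kind of PyTorch web UI typically looks something like the sketch below. The file names (`requirements.txt`, `app.py`) and the port are assumptions about the project layout, not the actual setup:

```dockerfile
# Hypothetical sketch of a Dockerfile for a PyTorch-based web UI.
# File names and port are illustrative assumptions.
FROM python:3.10-slim

WORKDIR /app

# Installing the CPU-only PyTorch wheel keeps the image smaller than
# the default CUDA build (though the result is still several GB).
RUN pip install --no-cache-dir torch --index-url https://download.pytorch.org/whl/cpu

# Install the remaining dependencies in a separate layer so they are
# cached independently of the application code.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 7860
CMD ["python", "app.py"]
```

Pinning the CPU-only wheel is the main lever for image size here; the CUDA-enabled default wheel alone adds several gigabytes.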
Woohoo!
I have a 1TB nvme mirror for OS vhds and containers.
The space doesn't bother me too much, but containerizing it makes it so much easier to test and play with.
You are awesome!
For ordinary users, pip is still not friendly enough as an installation method. Maybe it's better to package it with PyInstaller for direct distribution, or put in a little more effort and turn this into an Electron or Tauri app, so the project can reach a larger user group. https://github.com/chidiwilliams/buzz has done it, but it's oriented more toward transcription than captioning, so such a job may still be worthwhile here.
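For context, a PyInstaller attempt for an app like this would look roughly like the commands below. The entry point `app.py` is a hypothetical placeholder, and this is a sketch of the general approach rather than a tested recipe for this project:

```shell
# Hypothetical PyInstaller invocation; "app.py" is a placeholder for
# the project's real entry point.
pip install pyinstaller

# --onedir tends to be more reliable than --onefile for apps bundling
# large native libraries like PyTorch; --collect-all pulls in a
# package's data files and submodules that static analysis can miss.
pyinstaller --onedir --collect-all torch app.py
```

As noted above, the catch is that bundling PyTorch this way yields a multi-gigabyte distribution, and single-file builds of that size are exactly what tends to trip antivirus heuristics.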