diffusionstudio / vits-web

Web API for using VITS-based models in the browser!
https://huggingface.co/spaces/diffusionstudio/vits-web

GPU inference #3

Open · puppetm4st3r opened this issue 3 months ago

puppetm4st3r commented 3 months ago

Is there a way to use WebGPU or multithreaded CPU inference, or does that depend on the browser/OS host? Regards

k9p5 commented 3 months ago

WebGPU browser compatibility is still rather limited; however, it should most definitely be possible. I'll be working on it ASAP.
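For anyone who wants to experiment in the meantime, onnxruntime-web exposes a `webgpu` execution provider, so a first pass could look roughly like the sketch below. This is not what vits-web ships today, just an assumption about how it could be wired up; depending on your onnxruntime-web version you may need the `onnxruntime-web/webgpu` bundle instead of the default entry point.

```ts
// Rough sketch (not the vits-web implementation): ask onnxruntime-web for the
// WebGPU execution provider and fall back to WASM where navigator.gpu is missing.
import * as ort from 'onnxruntime-web/webgpu';

async function createSession(modelUrl: string) {
  const hasWebGPU = typeof navigator !== 'undefined' && 'gpu' in navigator;
  return ort.InferenceSession.create(modelUrl, {
    // Providers are tried in order; 'wasm' keeps inference working on
    // browsers without WebGPU support.
    executionProviders: hasWebGPU ? ['webgpu', 'wasm'] : ['wasm'],
  });
}
```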

puppetm4st3r commented 3 months ago

Thanks. And for CPU inference, does it use multithreaded inference? When I tried raw ONNX on WASM I couldn't get multithreaded inference to work: no errors, it just didn't do anything.

k9p5 commented 3 months ago

Yes, it can be multithreaded, but you need to enable cross-origin isolation.
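To spell that out a bit (a minimal sketch, assuming vits-web runs on onnxruntime-web's WASM backend): the page has to be served with COOP/COEP headers so that `crossOriginIsolated` is true and `SharedArrayBuffer` is available; only then does raising the thread count have any effect. The Express server below is just an illustration, any server or bundler dev server that sets the same two headers works.

```ts
// Illustrative Express setup: without these two headers the browser refuses
// SharedArrayBuffer and the WASM backend stays single-threaded.
import express from 'express';

const app = express();
app.use((_req, res, next) => {
  res.setHeader('Cross-Origin-Opener-Policy', 'same-origin');
  res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp');
  next();
});
app.use(express.static('dist'));
app.listen(8080);
```

On the client side, the thread count is a plain onnxruntime-web setting:

```ts
import * as ort from 'onnxruntime-web';

if (self.crossOriginIsolated) {
  // Only takes effect when the page is cross-origin isolated.
  ort.env.wasm.numThreads = navigator.hardwareConcurrency ?? 4;
} else {
  console.warn('Not cross-origin isolated; running WASM inference single-threaded.');
}
```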

C-Loftus commented 6 days ago

This is a great idea and would be very useful.

I would like to look into this or help out. Have there been any updates on this, or pointers for where to start? I wasn't sure how much of this needs to be done in WebGPU via JS versus inside the WebAssembly. I also wasn't sure whether this would touch the build process in the piper wasm repo.