erew123 / alltalk_tts

AllTalk is based on the Coqui TTS engine, similar to the Coqui_tts extension for Text generation webUI, but it supports a variety of advanced features, such as a settings page, low VRAM support, DeepSpeed, a narrator, model finetuning, custom models, and wav file maintenance. It can also be used with 3rd-party software via JSON calls.
GNU Affero General Public License v3.0

Intel arc gpu support #240

Closed: TrinityTF closed this issue 1 month ago

TrinityTF commented 1 month ago

It would be nice to have this work with Intel Arc GPUs (A750, A770) using IPEX or Vulkan instead of CUDA.

Links to related resources:

- https://github.com/intel/intel-extension-for-deepspeed
- https://github.com/intel/intel-extension-for-pytorch
- https://www.intel.com/content/www/us/en/developer/tools/oneapi/overview.html
- https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html
- https://www.lunarg.com/vulkan-sdk/
- https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html
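For context, a minimal sketch of what the IPEX route looks like in plain PyTorch: intel-extension-for-pytorch registers an `"xpu"` device that Arc GPUs can use in place of `"cuda"`. This is not AllTalk code, just an illustration assuming the XPU build of IPEX is installed.

```python
# Minimal sketch (not AllTalk code): detect an Intel XPU (Arc) device via
# intel-extension-for-pytorch and fall back to CUDA or CPU.
import torch

try:
    import intel_extension_for_pytorch as ipex  # registers the "xpu" device with PyTorch
    HAS_IPEX = True
except ImportError:
    HAS_IPEX = False


def pick_device() -> str:
    """Prefer CUDA, then Intel XPU (Arc), then CPU."""
    if torch.cuda.is_available():
        return "cuda"
    if HAS_IPEX and torch.xpu.is_available():
        return "xpu"
    return "cpu"


device = pick_device()
model = torch.nn.Linear(16, 16).to(device)  # stand-in for a real TTS model
if device == "xpu":
    # Optional IPEX kernel/graph optimizations for inference on Arc
    model = ipex.optimize(model, dtype=torch.float16)
print(f"Running on: {device}")
```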

erew123 commented 1 month ago

Hi @TrinityTF

I am about to release v2 of AllTalk, at least in BETA: https://github.com/erew123/alltalk_tts/discussions/237 and https://github.com/erew123/alltalk_tts/discussions/211

V2 will support multiple TTS engines, all with varying capabilities and GPU support.

Because of how I have broken out the individual TTS engine loaders, it should be far easier for people to set up/code additional GPU support (where possible). Because I don't have an Intel GPU, AMD GPU, or Mac M-series computer to test with, I am going to put out a request for anyone with a bit of coding experience (and the right kind of system) to take a shot at adding support (where possible).
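To make the idea concrete, here is a purely illustrative sketch of a per-engine loader hook where device support could be declared and extended. The class and method names are hypothetical and not taken from the AllTalk v2 codebase.

```python
# Hypothetical per-engine loader interface, purely illustrative.
from abc import ABC, abstractmethod
import torch


class TTSEngineLoader(ABC):
    """Each engine declares which devices it supports and how to load itself."""

    supported_devices = ("cuda", "cpu")  # a contributor could add "xpu" here

    @abstractmethod
    def load(self, device: str):
        """Load the engine's model onto the chosen device."""


class ExampleXttsLoader(TTSEngineLoader):
    # An Intel Arc contributor would only need to touch this engine's loader.
    supported_devices = ("cuda", "xpu", "cpu")

    def load(self, device: str):
        if device not in self.supported_devices:
            raise ValueError(f"{device} is not supported by this engine")
        # Device-specific setup (e.g. IPEX optimizations for "xpu") would live here.
        return torch.nn.Identity().to(device)
```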

I've added this ticket to the Feature requests list.

Thanks

erew123 commented 1 month ago

Corrected some spelling mistakes in my earlier message....