A GLaDOS TTS, using Forward Tacotron and HiFiGAN. Inference is fast and stable, even on the CPU. A low-quality vocoder model is included for mobile use. A rudimentary TTS script is included. Works perfectly on Linux; support elsewhere is only partial. Maybe someone smarter than me can make a GUI.
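For anyone curious what the pipeline does, here is a minimal inference sketch, not the bundled script. It assumes the two networks ship as TorchScript archives under models/, that the Tacotron model exposes a generate_jit method returning a mel spectrogram under a "mel_post" key, and that the vocoder maps that spectrogram straight to a waveform; treat all of those names as assumptions and defer to the included TTS script.

```python
# Minimal Forward Tacotron + HiFiGAN inference sketch (not the bundled script).
# File names, the generate_jit method, and the "mel_post" output key are
# assumptions about how the TorchScript models were exported.
import torch
from scipy.io import wavfile

DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Both models are TorchScript archives, so no Python model definitions are needed.
tacotron = torch.jit.load("models/glados.pt", map_location=DEVICE)
vocoder = torch.jit.load("models/vocoder-cpu-hq.pt", map_location=DEVICE)

def synthesize(tokens: torch.Tensor, out_path: str = "output.wav") -> None:
    """tokens: 1 x T LongTensor of phoneme/character IDs (encoding is model-specific)."""
    with torch.no_grad():
        mel = tacotron.generate_jit(tokens.to(DEVICE))["mel_post"]  # mel spectrogram
        audio = vocoder(mel).squeeze()                              # waveform in [-1, 1]
    pcm = (audio * 32768.0).clamp(-32768, 32767).cpu().numpy().astype("int16")
    wavfile.write(out_path, 22050, pcm)  # 22.05 kHz is a common TTS sample rate
```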
Hey there, I just wanted to discuss a potential improvement for downloading the models.

For my Home Assistant integration of your TTS, wyoming-glados, I created a script that downloads the models for this repo. Instead of Google Drive it uses GitHub releases, which allow unlimited downloads. It would also make getting the project running a bit easier for users, since they would only have to run

git clone https://github.com/R2D2FISH/glados-tts/
cd glados-tts
python3 download.py

instead of manually downloading the files from Google Drive and moving them into place.

I would be happy to submit a PR including the script and the matching README adjustments if these changes are welcome. You would need to create a GitHub release for this repo and attach the model files like here, though.
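A sketch of what such a download script might look like is below. It pulls each checkpoint from a GitHub release asset URL; the release tag in RELEASE_URL, the asset names, and the target paths under models/ are placeholders and would have to match whatever release is actually published for this repo.

```python
# download.py sketch: fetch the model checkpoints from a GitHub release
# instead of Google Drive. The release tag and asset names are placeholders
# and must match the release that actually gets created for this repo.
import os
import urllib.request

RELEASE_URL = "https://github.com/R2D2FISH/glados-tts/releases/download/models"
ASSETS = [
    "glados.pt",          # Forward Tacotron acoustic model
    "vocoder-gpu.pt",     # HiFiGAN vocoder (GPU)
    "vocoder-cpu-hq.pt",  # HiFiGAN vocoder (CPU, high quality)
    "vocoder-cpu-lq.pt",  # HiFiGAN vocoder (CPU, low quality / mobile)
]

def download_models(target_dir: str = "models") -> None:
    os.makedirs(target_dir, exist_ok=True)
    for name in ASSETS:
        path = os.path.join(target_dir, name)
        if os.path.exists(path):
            print(f"{path} already exists, skipping")
            continue
        url = f"{RELEASE_URL}/{name}"
        print(f"Downloading {url} -> {path}")
        urllib.request.urlretrieve(url, path)  # public release assets need no auth

if __name__ == "__main__":
    download_models()
```

Release assets on a public repo are served without authentication or download quotas, which is what makes this simpler than the Google Drive links.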
Awesome! Go ahead. I never expected to get so much support from random people on the internet on this project, but it's nice, since I'm pretty busy with classes and stuff.
Perfect! I will submit a PR next time I have some free time. If you can create a release with the model files like here, I will edit the hardcoded URL in download.py to point to your repo instead of mine.