aedocw / epub2tts

Turn an epub or text file into an audiobook
Apache License 2.0

xtts_v2 not working in Docker, fails at license confirmation #139

Closed rejuce closed 5 months ago

rejuce commented 6 months ago

When going the Docker route, the script fails at the point where one has to confirm the license:

    Total characters: 51254
    Engine is TTS, model is tts_models/multilingual/multi-dataset/xtts_v2

    tts_models/multilingual/multi-dataset/xtts_v2 has been updated, clearing model cache...
     > You must confirm the following:
     | > "I have purchased a commercial license from Coqui: licensing@coqui.ai"
     | > "Otherwise, I agree to the terms of the non-commercial CPML: https://coqui.ai/cpml"
     | [y/n]
    Traceback (most recent call last):
      File "/opt/epub2tts/epub2tts.py", line 654, in <module>
        main()
      File "/opt/epub2tts/epub2tts.py", line 643, in main
        mybook.read_book(
      File "/opt/epub2tts/epub2tts.py", line 349, in read_book
        self.tts = TTS(model_name).to(self.device)
      File "/usr/local/lib/python3.10/dist-packages/TTS/api.py", line 74, in __init__
        self.load_tts_model_by_name(model_name, gpu)
      File "/usr/local/lib/python3.10/dist-packages/TTS/api.py", line 171, in load_tts_model_by_name
        model_path, config_path, vocoder_path, vocoder_config_path, model_dir = self.download_model_by_name(
      File "/usr/local/lib/python3.10/dist-packages/TTS/api.py", line 129, in download_model_by_name
        model_path, config_path, model_item = self.manager.download_model(model_name)
      File "/usr/local/lib/python3.10/dist-packages/TTS/utils/manage.py", line 400, in download_model
        self.create_dir_and_download_model(model_name, model_item, output_path)
      File "/usr/local/lib/python3.10/dist-packages/TTS/utils/manage.py", line 337, in create_dir_and_download_model
        if not self.ask_tos(output_path):
      File "/usr/local/lib/python3.10/dist-packages/TTS/utils/manage.py", line 316, in ask_tos
        answer = input(" | | > ")
    EOFError: EOF when reading a line

aedocw commented 6 months ago

Thank you for logging this bug. I had not run across it because I had already agreed to the license previously.

I'll document the steps necessary to launch the container interactively the first time in order to agree to the license.

rejuce commented 6 months ago

Here is what to do:

  1. Run Docker interactively, override the entry point, and start bash:

         docker run -it --entrypoint bash \
           -v ${PWD}/.local/share/tts:/root/.local/share/tts \
           -v ${PWD}:/root -w /root \
           ghcr.io/aedocw/epub2tts:release

  2. Run some speech synthesis with the xtts_v2 model inside the container and confirm the license:

         tts --text "hello world. this is a test of speech synthesis in english. do you like it?" \
           --model_name "tts_models/multilingual/multi-dataset/xtts_v2" \
           --out_path /root/test.wav
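If typing at the prompt is inconvenient, one way to script step 2 is to pipe the confirmation in on stdin. This is only a sketch, assuming the license prompt is a plain `input()` call reading from stdin (which the `EOFError` traceback above suggests):

```shell
# Inside the container from step 1: answer the Coqui license prompt
# non-interactively by piping "y" to stdin.
# Assumption: the prompt reads from stdin via input(), per the traceback.
echo "y" | tts \
  --text "hello world" \
  --model_name "tts_models/multilingual/multi-dataset/xtts_v2" \
  --out_path /root/test.wav
```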

rejuce commented 6 months ago

That was the idea I had as well. Inside the container it seems to remember it, but unfortunately, if I run it again with the command from your docs, I get the same error again.

Is it possible that the license-agreed flag is set differently depending on whether TTS is called from the Python API or the CLI?

Edit: I guess I am just not fluent enough with Docker. That `docker run epub2tts ...` creates a different container than the one where I changed the entry point. I don't know right now how to work around that.

aedocw commented 6 months ago

I'll try to see what's happening. Ultimately, once you agree to the license terms, a file is created in the XTTS model directory, .local/share/tts/tts_models--multilingual--multi-dataset--xtts_v2/tos_agreed.txt, containing "I have read, understood and agreed to the Terms and Conditions." Since that directory is always mapped to a local spot (so the model does not have to be re-downloaded every time), once you agree to the TOS it should be sticky.
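A quick way to check on the host side whether that marker survived the container run (path as described above; run from the directory used for the volume mount):

```shell
# Check for the XTTS TOS marker in the locally mapped model cache.
# If it is missing, the license prompt will appear again on the next run.
TOS_FILE=".local/share/tts/tts_models--multilingual--multi-dataset--xtts_v2/tos_agreed.txt"
if [ -f "$TOS_FILE" ]; then
  echo "TOS marker found:"
  cat "$TOS_FILE"
else
  echo "TOS marker missing"
fi
```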

For what it's worth, at a glance anyway, it looks to me like the way you called the container should have worked.

aedocw commented 6 months ago

Looking into this more closely, but it's possible launching with `--gpus=all -e COQUI_TOS_AGREED=1` might work (that's what ghcr.io/coqui-ai/xtts-streaming-server:latest-cuda121 is launched with). Mostly adding this now as a note to myself :)
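As a sketch of that idea (mount paths copied from the interactive example earlier in the thread; whether this image honors the environment variable is exactly the open question, so treat this as untested):

```shell
# Hypothetical launch: pre-agree to the Coqui TOS via the COQUI_TOS_AGREED
# environment variable instead of the interactive prompt.
# Mounts mirror the earlier interactive example.
docker run --gpus=all -e COQUI_TOS_AGREED=1 \
  -v ${PWD}/.local/share/tts:/root/.local/share/tts \
  -v ${PWD}:/root -w /root \
  ghcr.io/aedocw/epub2tts:release
```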