coqui-ai / TTS

πŸΈπŸ’¬ - a deep learning toolkit for Text-to-Speech, battle-tested in research and production
http://coqui.ai
Mozilla Public License 2.0
35.7k stars 4.37k forks

[Bug] XTTS V2 Handle error #3184

Closed Lenos500 closed 12 months ago

Lenos500 commented 1 year ago

Describe the bug

When trying to duplicate the huggingface.co demo for XTTS v2 in Google Colab, the following error appears on use:

```
2023-11-09 18:02:55.167040: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-11-09 18:02:55.167099: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-11-09 18:02:55.167140: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
RuntimeError: module compiled against API version 0x10 but this version of numpy is 0xf
ImportError: numpy.core._multiarray_umath failed to import
ImportError: numpy.core.umath failed to import
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1184, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 27, in <module>
    from ..integrations.deepspeed import is_deepspeed_zero3_enabled
  File "/usr/local/lib/python3.10/dist-packages/transformers/integrations/__init__.py", line 21, in <module>
    from .deepspeed import (
  File "/usr/local/lib/python3.10/dist-packages/transformers/integrations/deepspeed.py", line 29, in <module>
    from ..optimization import get_scheduler
  File "/usr/local/lib/python3.10/dist-packages/transformers/optimization.py", line 27, in <module>
    from .trainer_utils import SchedulerType
  File "/usr/local/lib/python3.10/dist-packages/transformers/trainer_utils.py", line 49, in <module>
    import tensorflow as tf
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/__init__.py", line 38, in <module>
    from tensorflow.python.tools import module_util as _module_util
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/__init__.py", line 42, in <module>
    from tensorflow.python.saved_model import saved_model
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/saved_model/saved_model.py", line 20, in <module>
    from tensorflow.python.saved_model import builder
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/saved_model/builder.py", line 23, in <module>
    from tensorflow.python.saved_model.builder_impl import _SavedModelBuilder
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/saved_model/builder_impl.py", line 26, in <module>
    from tensorflow.python.framework import dtypes
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/dtypes.py", line 37, in <module>
    _np_bfloat16 = pywrap_ml_dtypes.bfloat16()
TypeError: Unable to convert function return value to a Python type! The signature was () -> handle
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1184, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/gpt2/modeling_gpt2.py", line 38, in <module>
    from ...modeling_utils import PreTrainedModel, SequenceSummary
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 39, in <module>
    from .generation import GenerationConfig, GenerationMixin
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1174, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1186, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
Unable to convert function return value to a Python type! The signature was () -> handle
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "app.py", line 28, in <module>
    from TTS.tts.configs.xtts_config import XttsConfig
  File "/usr/local/lib/python3.10/dist-packages/TTS/tts/configs/xtts_config.py", line 5, in <module>
    from TTS.tts.models.xtts import XttsArgs, XttsAudioConfig
  File "/usr/local/lib/python3.10/dist-packages/TTS/tts/models/xtts.py", line 12, in <module>
    from TTS.tts.layers.xtts.gpt import GPT
  File "/usr/local/lib/python3.10/dist-packages/TTS/tts/layers/xtts/gpt.py", line 12, in <module>
    from TTS.tts.layers.xtts.gpt_inference import GPT2InferenceModel
  File "/usr/local/lib/python3.10/dist-packages/TTS/tts/layers/xtts/gpt_inference.py", line 5, in <module>
    from transformers import GPT2PreTrainedModel
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1175, in __getattr__
    value = getattr(module, name)
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1174, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1186, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.gpt2.modeling_gpt2 because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
Unable to convert function return value to a Python type! The signature was () -> handle
```
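The root cause sits at the very top of the first log: a compiled extension expects NumPy C-API version `0x10`, but the installed NumPy only provides `0xf`, and everything downstream (the `pywrap_ml_dtypes.bfloat16()` failure, the cascading transformers import errors) follows from that. As a rough illustration only (hypothetical code, not NumPy's actual source), the import-time comparison amounts to:

```python
# Illustrative sketch of the C-API version check behind the error line:
# "module compiled against API version 0x10 but this version of numpy is 0xf"
BUILT_AGAINST_API = 0x10  # API version the compiled extension was built for
RUNTIME_API = 0xF         # API version the installed numpy actually provides


def check_api_compat(built: int, runtime: int) -> None:
    """Raise if the running numpy is older than what an extension was built against."""
    if runtime < built:
        raise RuntimeError(
            f"module compiled against API version {built:#x} "
            f"but this version of numpy is {runtime:#x}"
        )


try:
    check_api_compat(BUILT_AGAINST_API, RUNTIME_API)
except RuntimeError as exc:
    print(exc)  # reproduces the wording seen in the log above
```

Upgrading NumPy raises the runtime API version, which is why the pin change discussed below resolves the whole cascade.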

To Reproduce

Try to duplicate the space or run it locally.

Expected behavior

No response

Logs

No response

Environment

Google Colab with all requirements installed (using a T4 GPU).

Additional context

No response

neurowelt commented 1 year ago

Seems like a numpy version issue. Upgrading to 1.24.0 fixed it for me:

```
pip install numpy==1.24.0
```
neurowelt commented 1 year ago

It seems the tagged version that's installable from PyPI has this line in its requirements: https://github.com/coqui-ai/TTS/blob/46d9c27212939aa54b22f9df842c753de67b1f34/requirements.txt#L2

And since Colab currently uses Python 3.10 by default, that line installs numpy==1.22.0, which causes this issue. Since 1.24.0 works fine, perhaps someone from the team could change that line in requirements.txt 😊
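Until the pin changes, a Colab cell can confirm the installed NumPy is new enough before importing TTS. A minimal sketch, assuming 1.24.0 as the known-good floor (the helper `numpy_is_compatible` is hypothetical, not part of TTS):

```python
def numpy_is_compatible(version: str, minimum: str = "1.24.0") -> bool:
    """Compare dotted release strings numerically (hypothetical helper)."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(version) >= as_tuple(minimum)


print(numpy_is_compatible("1.22.0"))  # False: the version the PyPI pin selects on Python 3.10
print(numpy_is_compatible("1.24.0"))  # True: the version reported to work in this thread
```

In a real notebook you would pass `numpy.__version__` and, if the check fails, run `pip install numpy==1.24.0` and restart the runtime so the new version is actually loaded.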

Just so you know, installing from the dev branch won't fix this either, as the same requirement is there too.

Lenos500 commented 1 year ago

> Seems like a numpy version issue. Upgrading to 1.24.0 fixed it for me:
>
> ```
> pip install numpy==1.24.0
> ```

Thanks, this worked for me πŸ‘