suno-ai / bark

`from transformers import AutoProcessor, BarkModel` failed #590

Open · littlebowlnju opened 3 months ago

littlebowlnju commented 3 months ago

I'm using the transformers library to run Bark, but my program keeps failing with the following error:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1535, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 97, in <module>
    from accelerate.hooks import AlignDevicesHook, add_hook_to_module
  File "/usr/local/lib/python3.10/dist-packages/accelerate/__init__.py", line 3, in <module>
    from .accelerator import Accelerator
  File "/usr/local/lib/python3.10/dist-packages/accelerate/accelerator.py", line 35, in <module>
    from .checkpointing import load_accelerator_state, load_custom_state, save_accelerator_state, save_custom_state
  File "/usr/local/lib/python3.10/dist-packages/accelerate/checkpointing.py", line 24, in <module>
    from .utils import (
  File "/usr/local/lib/python3.10/dist-packages/accelerate/utils/__init__.py", line 150, in <module>
    from .launch import (
  File "/usr/local/lib/python3.10/dist-packages/accelerate/utils/launch.py", line 32, in <module>
    from ..utils.other import is_port_in_use, merge_dicts
  File "/usr/local/lib/python3.10/dist-packages/accelerate/utils/other.py", line 36, in <module>
    from .transformer_engine import convert_model
  File "/usr/local/lib/python3.10/dist-packages/accelerate/utils/transformer_engine.py", line 21, in <module>
    import transformer_engine.pytorch as te
  File "/usr/local/lib/python3.10/dist-packages/transformer_engine/pytorch/__init__.py", line 11, in <module>
    from .attention import DotProductAttention
  File "/usr/local/lib/python3.10/dist-packages/transformer_engine/pytorch/attention.py", line 57, in <module>
    _flash_attn_version = packaging.version.Version(version("flash-attn"))
  File "/usr/lib/python3.10/importlib/metadata/__init__.py", line 996, in version
    return distribution(distribution_name).version
  File "/usr/lib/python3.10/importlib/metadata/__init__.py", line 969, in distribution
    return Distribution.from_name(distribution_name)
  File "/usr/lib/python3.10/importlib/metadata/__init__.py", line 548, in from_name
    raise PackageNotFoundError(name)
importlib.metadata.PackageNotFoundError: No package metadata was found for flash-attn

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1535, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/bark/modeling_bark.py", line 31, in <module>
    from ...modeling_utils import PreTrainedModel, get_parameter_device
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 45, in <module>
    from .generation import GenerationConfig, GenerationMixin
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1525, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1537, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
No package metadata was found for flash-attn

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "***.py, line 18, in <module>
    from .tts.bark_local_service import BarkTTSService
  File "***.py", line 5, in <module>
    from transformers import AutoProcessor, BarkModel
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1526, in __getattr__
    value = getattr(module, name)
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1525, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1537, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.bark.modeling_bark because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
No package metadata was found for flash-attn
```
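
For reference, the failure does not require running any Bark code; the import line itself raises. A minimal sketch that reproduces it (the `suno/bark` checkpoint name is taken from the standard transformers Bark example, not from my actual code):

```python
# Minimal reproduction: the RuntimeError above is raised by the import
# statement itself, via transformers' lazy module loading, before any
# model weights are ever downloaded.
from transformers import AutoProcessor, BarkModel

processor = AutoProcessor.from_pretrained("suno/bark")
model = BarkModel.from_pretrained("suno/bark")
```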

I'm using a V100, which does not support flash-attention, so I didn't install flash-attention on my machine. However, I believe Bark should be importable and usable without flash-attention. I have no idea what is going on.
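
From the traceback, the import chain is transformers -> accelerate -> transformer_engine -> flash-attn: `accelerate/utils/transformer_engine.py` runs `import transformer_engine.pytorch`, whose `attention.py` unconditionally reads the `flash-attn` package version. A quick standard-library sketch to confirm which packages in that chain are actually installed (the distribution name `transformer-engine` is an assumption; the NGC container build may register it differently):

```python
# List installed versions of each package in the failing import chain.
# If transformer_engine is present but flash-attn is not, that matches
# the PackageNotFoundError in the traceback above.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("transformers", "accelerate", "transformer-engine", "flash-attn"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```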

My environment:

```
torch==2.3.0a0+ebedce2
transformers==4.41.2
CUDA 12.3
```