EvelynFan / FaceFormer

[CVPR 2022] FaceFormer: Speech-Driven 3D Facial Animation with Transformers
MIT License

transformers problem #29

Open lucasjinreal opened 2 years ago

lucasjinreal commented 2 years ago
transformers/models/wav2vec2/modeling_wav2vec2.py", line 387, in forward
    hidden_states = hidden_states.transpose(1, 2)
AttributeError: 'tuple' object has no attribute 'transpose'

Is it possible to make the code compatible with the latest transformers release?
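The traceback above is the symptom of an API change: in newer transformers versions, `Wav2Vec2FeatureProjection.forward` returns a `(hidden_states, extract_features)` tuple, while the version FaceFormer targets returns the tensor alone. A minimal sketch of the failure mode, using stand-ins rather than the real transformers classes:

```python
# Stand-in for torch.Tensor -- only the method the call site needs.
class FakeTensor:
    def transpose(self, a, b):
        return self

# Stand-in for the newer feature_projection API, which returns a tuple.
def new_feature_projection(x):
    return x, "extract_features"

hidden_states = FakeTensor()
out = new_feature_projection(hidden_states)
try:
    out.transpose(1, 2)  # what the old call site does
except AttributeError as e:
    print(e)  # 'tuple' object has no attribute 'transpose'

out[0].transpose(1, 2)  # indexing [0] first restores the old behavior
```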

lucasjinreal commented 2 years ago
Requirement already satisfied: six in /usr/lib/python3/dist-packages (from sacremoses->transformers==4.6.1) (1.16.0)
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [50 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.linux-x86_64-3.10
      creating build/lib.linux-x86_64-3.10/tokenizers
      copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers
      creating build/lib.linux-x86_64-3.10/tokenizers/models
      copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/models
      creating build/lib.linux-x86_64-3.10/tokenizers/decoders
      copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/decoders
      creating build/lib.linux-x86_64-3.10/tokenizers/normalizers
      copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/normalizers
      creating build/lib.linux-x86_64-3.10/tokenizers/pre_tokenizers
      copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/pre_tokenizers
      creating build/lib.linux-x86_64-3.10/tokenizers/processors
      copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/processors
      creating build/lib.linux-x86_64-3.10/tokenizers/trainers
      copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/trainers
      creating build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-3.10/tokenizers/implementations
      creating build/lib.linux-x86_64-3.10/tokenizers/tools
      copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-x86_64-3.10/tokenizers/tools
      copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-x86_64-3.10/tokenizers/tools
      copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers
      copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/models
      copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/decoders
      copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/normalizers
      copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/pre_tokenizers
      copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/processors
      copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-3.10/tokenizers/trainers
      copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-x86_64-3.10/tokenizers/tools
      running build_ext
      error: can't find Rust compiler

      If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.

      To update pip, run:

          pip install --upgrade pip

      and then retry package installation.

      If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
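As the pip output above suggests, tokenizers is being built from source here (the log shows Python 3.10, which the pinned tokenizers release likely predates, so no prebuilt wheel matches), and the source build needs a Rust compiler. A hedged sketch of the two ways out:

```shell
# Option 1: update pip and retry -- a newer pip may be able to pick up
# a compatible prebuilt wheel, avoiding the Rust build entirely.
pip install --upgrade pip
pip install transformers==4.6.1

# Option 2: install a Rust toolchain via rustup, then retry the build.
# (Command from https://rustup.rs; review any script before piping to sh.)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
export PATH="$HOME/.cargo/bin:$PATH"
pip install transformers==4.6.1
```

Using a Python version contemporary with the pinned transformers release (e.g. 3.8 or 3.9) may also sidestep the source build altogether.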
JSHZT commented 1 year ago

I also encountered this problem; is there a solution? Line 114 of wav2vec.py changes `hidden_states` from a tensor to a tuple, and a tuple has no `transpose` method. Could that be the problem here?

Baroquestc commented 1 year ago

Have you solved this problem?

eventhorizon02 commented 1 year ago

I have the same problem, any solution?

JSHZT commented 1 year ago

> I have the same problem, any solution?

Sorry, I haven't solved this yet.

JSHZT commented 1 year ago

I think you can try keeping your transformers version consistent with the one the author specifies.

Alpe6825 commented 1 year ago

I had the same problem when I installed a wrong version of the transformers package. With the version recommended by the author, it works fine.

FinallyKiKi commented 1 year ago

Change `hidden_states = self.feature_projection(hidden_states)` to `hidden_states = self.feature_projection(hidden_states)[0]`.

It works on transformers==4.26.1.
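For completeness, the same idea can be written so that it works on both the old and new transformers APIs. A sketch (the `project` helper and the stand-in callables are mine, not part of FaceFormer):

```python
def project(feature_projection, hidden_states):
    """Call feature_projection and cope with both transformers APIs:
    newer versions return a (hidden_states, extract_features) tuple,
    older versions return the tensor alone."""
    out = feature_projection(hidden_states)
    if isinstance(out, tuple):
        out = out[0]
    return out

# Stand-ins for the two API generations:
new_api = lambda x: (x, "extract_features")
old_api = lambda x: x

assert project(new_api, "h") == "h"
assert project(old_api, "h") == "h"
```

This avoids pinning the fix to one transformers release, at the cost of one `isinstance` check per forward pass.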