GinaMhd opened this issue 6 months ago
First question: do you have a Rust compiler installed on your remote server? If not, install one by running this:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
Rust Installation - Official Docs
This should fix your error, because the tokenizers wheel is built from Rust source.
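If you installed Rust in the same shell session, also make sure cargo's environment file has been sourced so the compiler is actually on PATH before retrying (a minimal sketch, assuming the default rustup install location):

source "$HOME/.cargo/env"
rustc --version   # should print the installed compiler version
pip3 install -r requirements.txt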
Otherwise, it might be an issue with your Python version: you could try using a venv with Python 3.9. There is a known issue with newer Python/transformers combinations where no prebuilt tokenizers wheel is available, so pip falls back to building it from source, and that build fails without a Rust toolchain.
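For the venv route, a minimal sketch (assuming python3.9 is already installed on the server and the project's requirements are compatible with it):

python3.9 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip   # a newer pip may also find a prebuilt tokenizers wheel instead of building from source
pip install -r requirements.txt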
I'm trying to install this on my remote server, and when I run this command: pip3 install -r requirements.txt
I get this error:
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [51 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.linux-x86_64-cpython-311
      creating build/lib.linux-x86_64-cpython-311/tokenizers
      copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers
      creating build/lib.linux-x86_64-cpython-311/tokenizers/models
      copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/models
      creating build/lib.linux-x86_64-cpython-311/tokenizers/decoders
      copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/decoders
      creating build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
      copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
      creating build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
      copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
      creating build/lib.linux-x86_64-cpython-311/tokenizers/processors
      copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/processors
      creating build/lib.linux-x86_64-cpython-311/tokenizers/trainers
      copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/trainers
      creating build/lib.linux-x86_64-cpython-311/tokenizers/implementations
      copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
      copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
      copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
      copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
      copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
      creating build/lib.linux-x86_64-cpython-311/tokenizers/tools
      copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/tools
      copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-x86_64-cpython-311/tokenizers/tools
      copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers
      copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/models
      copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/decoders
      copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
      copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
      copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/processors
      copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/trainers
      copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-x86_64-cpython-311/tokenizers/tools
      running build_ext
      running build_rust
      error: can't find Rust compiler

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
Any recommendations are really appreciated.