RansSelected opened this issue 3 years ago
Do you have Rust installed? setuptools-rust is a bridge between Rust and Python, so if the build tries to do anything that requires the Rust compiler itself and does not find one, it will error out.
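A quick way to check is the sketch below, assuming a Unix shell; the `has_rust` helper is just an illustrative name, and the rustup command is the standard installer from rustup.rs:

```shell
# Hypothetical helper: report whether the Rust compiler pip's build needs is on PATH.
has_rust() {
  command -v rustc >/dev/null 2>&1
}

if has_rust; then
  echo "rustc found: $(rustc --version)"
else
  echo "rustc missing -- install it first, e.g. via rustup:"
  echo "  curl https://sh.rustup.rs -sSf | sh"
  echo "then restart the shell so ~/.cargo/bin is on PATH, and retry: pip install docly"
fi
```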
On Wed, 3 Mar 2021, 9:28 pm RandomNameSelect <notifications@github.com> wrote:
Hi! While trying to pip install docly, I get this error:
Building wheel for tokenizers (PEP 517) ... error
ERROR: Command errored out with exit status 1:
 command: /Users/krisku/opt/miniconda3/envs/auto_doc/bin/python /Users/krisku/opt/miniconda3/envs/auto_doc/lib/python3.9/site-packages/pip/_vendor/pep517/_in_process.py build_wheel /var/folders/mr/5nhwtm3n4dvfdkh3sf2xs41h0000gn/T/tmpb3b95crr
 cwd: /private/var/folders/mr/5nhwtm3n4dvfdkh3sf2xs41h0000gn/T/pip-install-syhlo38v/tokenizers_eb68e965c7654f6bbe50cefd6fff55af
 Complete output (36 lines):
 running bdist_wheel
 running build
 running build_py
 creating build
 creating build/lib
 creating build/lib/tokenizers
 copying tokenizers/__init__.py -> build/lib/tokenizers
 creating build/lib/tokenizers/models
 copying tokenizers/models/__init__.py -> build/lib/tokenizers/models
 creating build/lib/tokenizers/decoders
 copying tokenizers/decoders/__init__.py -> build/lib/tokenizers/decoders
 creating build/lib/tokenizers/normalizers
 copying tokenizers/normalizers/__init__.py -> build/lib/tokenizers/normalizers
 creating build/lib/tokenizers/pre_tokenizers
 copying tokenizers/pre_tokenizers/__init__.py -> build/lib/tokenizers/pre_tokenizers
 creating build/lib/tokenizers/processors
 copying tokenizers/processors/__init__.py -> build/lib/tokenizers/processors
 creating build/lib/tokenizers/trainers
 copying tokenizers/trainers/__init__.py -> build/lib/tokenizers/trainers
 creating build/lib/tokenizers/implementations
 copying tokenizers/implementations/byte_level_bpe.py -> build/lib/tokenizers/implementations
 copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib/tokenizers/implementations
 copying tokenizers/implementations/base_tokenizer.py -> build/lib/tokenizers/implementations
 copying tokenizers/implementations/__init__.py -> build/lib/tokenizers/implementations
 copying tokenizers/implementations/char_level_bpe.py -> build/lib/tokenizers/implementations
 copying tokenizers/implementations/bert_wordpiece.py -> build/lib/tokenizers/implementations
 copying tokenizers/__init__.pyi -> build/lib/tokenizers
 copying tokenizers/models/__init__.pyi -> build/lib/tokenizers/models
 copying tokenizers/decoders/__init__.pyi -> build/lib/tokenizers/decoders
 copying tokenizers/normalizers/__init__.pyi -> build/lib/tokenizers/normalizers
 copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib/tokenizers/pre_tokenizers
 copying tokenizers/processors/__init__.pyi -> build/lib/tokenizers/processors
 copying tokenizers/trainers/__init__.pyi -> build/lib/tokenizers/trainers
 running build_ext
 running build_rust
 error: Can not find Rust compiler
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly
even though setuptools-rust is installed.
macOS Catalina 10.15.7, Python 3.9.1, conda 4.9.2 + pip 21.0.1
I'd appreciate your help.
I have the same issue.
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (PEP 517) ... error
ERROR: Command errored out with exit status 1:
command: /Users/aabur/.asdf/installs/python/3.9.2/bin/python /Users/aabur/.asdf/installs/python/3.9.2/lib/python3.9/site-packages/pip/_vendor/pep517/_in_process.py build_wheel /var/folders/9h/cmzj5d8j247_lyg2h4jsm7_80000gn/T/tmpll84tz3q
cwd: /private/var/folders/9h/cmzj5d8j247_lyg2h4jsm7_80000gn/T/pip-install-9r0vuary/tokenizers_31b0f24eb85e40d58ab1580a2ccf79bd
Complete output (226 lines):
...
...
...
----------------------------------------
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly
rust and setuptools-rust are installed.
macOS Big Sur 11.2.3, Python 3.9.2, pip 21.0.1
Did you manage to resolve this?
I followed the instructions for the same issue in another repo, and it worked for me:
https://github.com/huggingface/transformers/issues/2831#issuecomment-600141935
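For anyone who doesn't want to click through: the workaround in that thread amounts to installing a Rust toolchain via rustup and making sure cargo's bin directory is on PATH before retrying. A sketch of the steps (the installer and the `~/.cargo/bin` location are rustup's standard ones; check the linked comment for the exact instructions):

```shell
# Install a Rust toolchain non-interactively via the standard rustup installer.
curl https://sh.rustup.rs -sSf | sh -s -- -y

# rustup puts rustc/cargo under ~/.cargo/bin; make sure pip's build can see them.
export PATH="$HOME/.cargo/bin:$PATH"

# Retry the failing install; the tokenizers build should now find the Rust compiler.
pip install docly
```

Opening a fresh terminal after the rustup install also works, since the installer adds `~/.cargo/bin` to the shell profile.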