Kungbib / swedish-bert-models


error: Can not find Rust compiler #3

Closed: mflodin closed this issue 4 years ago

mflodin commented 4 years ago

Rust needs to be installed for the installation scripts to work, at least on OSX. That should probably be mentioned in the usage requirements / installation instructions.
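
For anyone hitting the same error, the usual fix is to install a Rust toolchain before running pip, so the tokenizers package can be built from source. A minimal sketch, assuming the standard rustup installer (the exact commands are not from this thread):

```sh
# Install the Rust toolchain via rustup (defaults to the stable channel)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Make rustc and cargo available in the current shell session
source "$HOME/.cargo/env"

# Retry the installation that failed
pip install 'transformers>=2.4.1'
```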

marma commented 4 years ago

For the Transformers library? I do not get that on a fairly clean install of OSX 10.15.x. Could you describe your setup in more detail?

mflodin commented 4 years ago

Not sure which requirement caused it, but probably transformers. And now I have Rust installed, so can't reproduce it. =)

Hmm, maybe it only affects older versions of OSX then. I'm still on 10.14.6.
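
A quick way to tell whether a Rust compiler is actually on the PATH (a generic check, not something from this thread):

```sh
# Prints the compiler version if Rust is installed, errors otherwise
rustc --version

# cargo is also needed to build Rust-backed Python packages
cargo --version
```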

cjrosen commented 4 years ago

Hi, thanks for releasing these models!

I had the same issue when installing. I'm on macOS 10.13.1. I had to install the Rust compiler manually, which solved the problem. Here is the output from before installing Rust, in case it helps:

```
Collecting transformers>=2.4.1
  Downloading transformers-2.5.1-py3-none-any.whl (499 kB)
     |████████████████████████████████| 499 kB 4.9 MB/s 
Collecting torch>=1.3.1
  Downloading torch-1.4.0-cp37-none-macosx_10_9_x86_64.whl (81.1 MB)
     |████████████████████████████████| 81.1 MB 11.6 MB/s 
Collecting filelock
  Using cached filelock-3.0.12-py3-none-any.whl (7.6 kB)
Collecting numpy
  Downloading numpy-1.18.2-cp37-cp37m-macosx_10_9_x86_64.whl (15.1 MB)
     |████████████████████████████████| 15.1 MB 11.0 MB/s 
Collecting sentencepiece
  Downloading sentencepiece-0.1.85-cp37-cp37m-macosx_10_6_x86_64.whl (1.1 MB)
     |████████████████████████████████| 1.1 MB 10.2 MB/s 
Collecting tokenizers==0.5.2
  Downloading tokenizers-0.5.2.tar.gz (64 kB)
     |████████████████████████████████| 64 kB 4.2 MB/s 
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
    Preparing wheel metadata ... done
Collecting sacremoses
  Downloading sacremoses-0.0.38.tar.gz (860 kB)
     |████████████████████████████████| 860 kB 8.4 MB/s 
Collecting tqdm>=4.27
  Downloading tqdm-4.43.0-py2.py3-none-any.whl (59 kB)
     |████████████████████████████████| 59 kB 8.1 MB/s 
Collecting boto3
  Downloading boto3-1.12.27-py2.py3-none-any.whl (128 kB)
     |████████████████████████████████| 128 kB 6.8 MB/s 
Collecting requests
  Using cached requests-2.23.0-py2.py3-none-any.whl (58 kB)
Collecting regex!=2019.12.17
  Downloading regex-2020.2.20.tar.gz (681 kB)
     |████████████████████████████████| 681 kB 6.2 MB/s 
Collecting six
  Using cached six-1.14.0-py2.py3-none-any.whl (10 kB)
Collecting click
  Using cached click-7.1.1-py2.py3-none-any.whl (82 kB)
Collecting joblib
  Downloading joblib-0.14.1-py2.py3-none-any.whl (294 kB)
     |████████████████████████████████| 294 kB 11.5 MB/s 
Collecting s3transfer<0.4.0,>=0.3.0
  Downloading s3transfer-0.3.3-py2.py3-none-any.whl (69 kB)
     |████████████████████████████████| 69 kB 6.9 MB/s 
Collecting jmespath<1.0.0,>=0.7.1
  Downloading jmespath-0.9.5-py2.py3-none-any.whl (24 kB)
Collecting botocore<1.16.0,>=1.15.27
  Downloading botocore-1.15.27-py2.py3-none-any.whl (6.0 MB)
     |████████████████████████████████| 6.0 MB 5.3 MB/s 
Collecting idna<3,>=2.5
  Using cached idna-2.9-py2.py3-none-any.whl (58 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2019.11.28-py2.py3-none-any.whl (156 kB)
Collecting chardet<4,>=3.0.2
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached urllib3-1.25.8-py2.py3-none-any.whl (125 kB)
Collecting python-dateutil<3.0.0,>=2.1
  Using cached python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting docutils<0.16,>=0.10
  Using cached docutils-0.15.2-py3-none-any.whl (547 kB)
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (PEP 517) ... error
  ERROR: Command errored out with exit status 1:
   command: /Users/cj/CODE/swedish-bert-models/venv/bin/python3 /Users/cj/CODE/swedish-bert-models/venv/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py build_wheel /var/folders/g4/4d1kd6vx0pjffzvk60z9tjk00000gn/T/tmpnq3jswoa
       cwd: /private/var/folders/g4/4d1kd6vx0pjffzvk60z9tjk00000gn/T/pip-install-yufqwhyw/tokenizers
  Complete output (36 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/tokenizers
  copying tokenizers/__init__.py -> build/lib/tokenizers
  creating build/lib/tokenizers/models
  copying tokenizers/models/__init__.py -> build/lib/tokenizers/models
  creating build/lib/tokenizers/decoders
  copying tokenizers/decoders/__init__.py -> build/lib/tokenizers/decoders
  creating build/lib/tokenizers/normalizers
  copying tokenizers/normalizers/__init__.py -> build/lib/tokenizers/normalizers
  creating build/lib/tokenizers/pre_tokenizers
  copying tokenizers/pre_tokenizers/__init__.py -> build/lib/tokenizers/pre_tokenizers
  creating build/lib/tokenizers/processors
  copying tokenizers/processors/__init__.py -> build/lib/tokenizers/processors
  creating build/lib/tokenizers/trainers
  copying tokenizers/trainers/__init__.py -> build/lib/tokenizers/trainers
  creating build/lib/tokenizers/implementations
  copying tokenizers/implementations/byte_level_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/base_tokenizer.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/__init__.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/char_level_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/bert_wordpiece.py -> build/lib/tokenizers/implementations
  copying tokenizers/__init__.pyi -> build/lib/tokenizers
  copying tokenizers/models/__init__.pyi -> build/lib/tokenizers/models
  copying tokenizers/decoders/__init__.pyi -> build/lib/tokenizers/decoders
  copying tokenizers/normalizers/__init__.pyi -> build/lib/tokenizers/normalizers
  copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib/tokenizers/pre_tokenizers
  copying tokenizers/processors/__init__.pyi -> build/lib/tokenizers/processors
  copying tokenizers/trainers/__init__.pyi -> build/lib/tokenizers/trainers
  running build_ext
  running build_rust
  error: Can not find Rust compiler
  ----------------------------------------
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly
```
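
Note that pip fetched tokenizers-0.5.2.tar.gz, a source distribution, rather than a prebuilt wheel, which is why the build step needs a Rust compiler. To confirm that no binary wheel exists for a given platform, one can ask pip to refuse source builds (a diagnostic sketch, not from this thread):

```sh
# Fails fast if no prebuilt wheel matches this platform,
# instead of falling back to a source build that needs Rust
pip install --only-binary :all: tokenizers==0.5.2
```
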
marma commented 4 years ago

Thanks for reporting. I will update the documentation.
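
For reference, the added note might look something like this (a sketch of possible wording, not the actual documentation change):

```md
## Requirements

On platforms where pip cannot find a prebuilt `tokenizers` wheel
(e.g. some macOS versions), a Rust compiler is required to build the
package from source. Install one via https://rustup.rs before running pip.
```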