Closed. guangyuli-uoe closed this issue 2 years ago.
Hi @guangyuli-uoe, as this seems to be an issue with sentence-transformers, perhaps this thread on their repository might help resolve it! By the way, if this is related to running the BUCC task, this library should not be required.
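A quick way to check whether sentence-transformers ended up installed correctly is a one-liner from the shell. This is just a sanity check, not part of the BUCC task, and the model name below is only an example:

```bash
# Confirm sentence-transformers imports and can encode a sentence.
# 'all-MiniLM-L6-v2' is just an example model; any small SBERT model works.
python -c "from sentence_transformers import SentenceTransformer; \
m = SentenceTransformer('all-MiniLM-L6-v2'); \
print(m.encode(['hello world']).shape)"
```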
Hi @heffernankevin, thanks very much for the useful link!

When I tried to run the command ./bucc.sh, it seems that I first need to run the 'embed' task manually, because the script stops with:

Processing BUCC data in .
FileNotFoundError: [Errno 2] No such file or directory: './embed/bucc2018.fr-en.train.enc.fr'
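For reference, bucc.sh looks for precomputed LASER embeddings under ./embed/. A minimal sketch of the missing step, assuming the stock embed script that ships with LASER and the LASER environment variable the repository asks you to export; the input filename and argument order here are assumptions, so check tasks/embed/embed.sh in your checkout for the exact interface:

```bash
# Hypothetical: generate the embedding file bucc.sh complains about.
# The input text file name is an assumption based on the bucc.sh naming scheme.
mkdir -p ./embed
bash ${LASER}/tasks/embed/embed.sh \
    bucc2018.fr-en.train.txt.fr \
    ./embed/bucc2018.fr-en.train.enc.fr
```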
Closing, as the user managed to install sentence-transformers successfully (https://github.com/facebookresearch/LASER/issues/211#issuecomment-1189607247), and the comment regarding an issue running the BUCC task has also been referenced in another issue (see https://github.com/facebookresearch/LASER/issues/209), so will follow up there.
The original build error reported when installing sentence-transformers (pip failing to build its tokenizers dependency):

ERROR: Command errored out with exit status 1:
 command: /Users/liguangyu/opt/anaconda3/envs/laserr/bin/python /Users/liguangyu/opt/anaconda3/envs/laserr/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /var/folders/f3/8jrjgbyj6rb39q8f6hm9fmzr0000gn/T/tmpdkv8cntm
 cwd: /private/var/folders/f3/8jrjgbyj6rb39q8f6hm9fmzr0000gn/T/pip-install-5xda94xt/tokenizers_754c49cea1724635a231beb70759f8bd
 Complete output (51 lines):
 running bdist_wheel
 running build
 running build_py
 creating build
 creating build/lib.macosx-10.7-x86_64-3.6
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers
 copying py_src/tokenizers/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers/models
 copying py_src/tokenizers/models/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/models
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers/decoders
 copying py_src/tokenizers/decoders/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/decoders
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers/normalizers
 copying py_src/tokenizers/normalizers/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/normalizers
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers/pre_tokenizers
 copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/pre_tokenizers
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers/processors
 copying py_src/tokenizers/processors/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/processors
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers/trainers
 copying py_src/tokenizers/trainers/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/trainers
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers/implementations
 copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/implementations
 copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/implementations
 copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/implementations
 copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/implementations
 copying py_src/tokenizers/implementations/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/implementations
 copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/implementations
 copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/implementations
 creating build/lib.macosx-10.7-x86_64-3.6/tokenizers/tools
 copying py_src/tokenizers/tools/__init__.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/tools
 copying py_src/tokenizers/tools/visualizer.py -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/tools
 copying py_src/tokenizers/__init__.pyi -> build/lib.macosx-10.7-x86_64-3.6/tokenizers
 copying py_src/tokenizers/models/__init__.pyi -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/models
 copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/decoders
 copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/normalizers
 copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/pre_tokenizers
 copying py_src/tokenizers/processors/__init__.pyi -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/processors
 copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/trainers
 copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.macosx-10.7-x86_64-3.6/tokenizers/tools
 running build_ext
 running build_rust
 error: can't find Rust compiler
If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
To update pip, run:

    pip install --upgrade pip

and then retry package installation.
If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
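Following that suggestion on macOS looks roughly like the sketch below. The curl command is rustup's official installer; the final line assumes the package being retried is sentence-transformers, as in this thread:

```bash
# Install the Rust toolchain via the official rustup installer.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# Put cargo and rustc on the PATH of the current shell.
source "$HOME/.cargo/env"
# Retry the install that needed to compile tokenizers.
pip install sentence-transformers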
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
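An alternative that sidesteps the Rust build entirely: the paths above show Python 3.6, for which prebuilt tokenizers wheels are scarce on macOS, so pip falls back to compiling from source. Recreating the environment on a newer Python usually lets pip pick up a binary wheel instead. A sketch, assuming conda as in the paths above (the environment name is just an example):

```bash
# Fresh environment on a newer Python so pip can use a prebuilt wheel.
conda create -n laser-py38 python=3.8 -y
conda activate laser-py38
pip install --upgrade pip
pip install sentence-transformers
```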