fhamborg / NewsMTSC

Target-dependent sentiment classification in news articles reporting on political events. Includes a high-quality data set of over 11k sentences and a state-of-the-art classification model.

Can't install library on Python 3.9.0 #26

Closed: antongolubev5 closed this issue 11 months ago

antongolubev5 commented 1 year ago

```
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [11 lines of output]
      C:\Users\anton\AppData\Local\Programs\Python\Python39\lib\site-packages\setuptools\dist.py:771: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
        warnings.warn(
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-cpython-39
      copying sentencepiece.py -> build\lib.win-amd64-cpython-39
      running build_ext
      building '_sentencepiece' extension
      error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for sentencepiece
  Running setup.py clean for sentencepiece
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [47 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-cpython-39
      creating build\lib.win-amd64-cpython-39\tokenizers
      copying py_src\tokenizers\__init__.py -> build\lib.win-amd64-cpython-39\tokenizers
      creating build\lib.win-amd64-cpython-39\tokenizers\models
      copying py_src\tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-39\tokenizers\models
      creating build\lib.win-amd64-cpython-39\tokenizers\decoders
      copying py_src\tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-39\tokenizers\decoders
      creating build\lib.win-amd64-cpython-39\tokenizers\normalizers
      copying py_src\tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-39\tokenizers\normalizers
      creating build\lib.win-amd64-cpython-39\tokenizers\pre_tokenizers
      copying py_src\tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-39\tokenizers\pre_tokenizers
      creating build\lib.win-amd64-cpython-39\tokenizers\processors
      copying py_src\tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-39\tokenizers\processors
      creating build\lib.win-amd64-cpython-39\tokenizers\trainers
      copying py_src\tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-39\tokenizers\trainers
      creating build\lib.win-amd64-cpython-39\tokenizers\implementations
      copying py_src\tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-39\tokenizers\implementations
      copying py_src\tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-39\tokenizers\implementations
      copying py_src\tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-39\tokenizers\implementations
      copying py_src\tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-39\tokenizers\implementations
      copying py_src\tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-39\tokenizers\implementations
      copying py_src\tokenizers\implementations\sentencepiece_unigram.py -> build\lib.win-amd64-cpython-39\tokenizers\implementations
      copying py_src\tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-39\tokenizers\implementations
      copying py_src\tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-39\tokenizers
      copying py_src\tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-39\tokenizers\models
      copying py_src\tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-39\tokenizers\decoders
      copying py_src\tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-39\tokenizers\normalizers
      copying py_src\tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-39\tokenizers\pre_tokenizers
      copying py_src\tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-39\tokenizers\processors
      [...]
      Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build sentencepiece tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
```
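
Not part of the original report, but for context: both failures come from pip compiling the dependencies sentencepiece and tokenizers from source on Windows, which needs the MSVC Build Tools and a Rust toolchain respectively. A minimal sketch of two ways out, assuming prebuilt wheels exist on PyPI for your Python version and platform:

```sh
# Option 1 (assumption, not from the thread): ask pip for prebuilt wheels only,
# so nothing is compiled locally. Works only if cp39 / win_amd64 wheels exist.
pip install --only-binary=:all: sentencepiece tokenizers

# Option 2: install the toolchains named in the errors, then retry the install.
# - MSVC Build Tools: https://visualstudio.microsoft.com/visual-cpp-build-tools/
# - Rust toolchain:   https://rustup.rs
```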

antongolubev5 commented 1 year ago

@thak123 ?

thak123 commented 1 year ago

I had a similar install issue a few hours back, but I fixed it by editing the setup.cfg file.
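
(thak123 does not say which edit they made. One plausible change, suggested by the setuptools UserWarning in the log above, is switching the deprecated dash-separated key in the package's setup.cfg to its underscore form; the file and the README.md value below are only illustrative.)

```ini
# Hypothetical setup.cfg edit, inferred from the UserWarning in the log above;
# not confirmed as the change thak123 actually made.
[metadata]
# before (deprecated dash-separated key warned about by newer setuptools):
# description-file = README.md
# after (underscore form):
description_file = README.md
```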

antongolubev5 commented 1 year ago

@thak123 can you explain what you changed in the setup.cfg file?

CodeAKrome commented 1 year ago

This is very sensitive to the Python version. I used conda to get around the install issues, and I know 3.9 doesn't work; I tried it. :)

```sh
conda create -n py37 python=3.7
conda activate py37
```
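
For anyone hitting this later, here is a sketch of the full sequence built on CodeAKrome's suggestion. The PyPI package name NewsSentiment is an assumption based on the repository's README, and the -y flag only skips the confirmation prompt:

```sh
# Sketch of the workaround: create a clean Python 3.7 environment and install there.
# Assumption: the library is published on PyPI as "NewsSentiment" (per the repo README);
# use a source install (e.g. pip install .) instead if you work from a clone.
conda create -n py37 python=3.7 -y
conda activate py37
pip install NewsSentiment
```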