flairNLP / flair

A very simple framework for state-of-the-art Natural Language Processing (NLP)
https://flairnlp.github.io/flair/

[Bug]: Sentencepiece wheel issue preventing flair install #3129

Closed boydxh closed 1 year ago

boydxh commented 1 year ago

Describe the bug

I cannot install flair because of a legacy install failure with the package sentencepiece.

To Reproduce

cd flairtest
pipenv shell
pip3 install flair

Expected behavior

Expected a successful install of flair. If I bypass the sentencepiece wheel error by running "pip3 install flair --only-binary=sentencepiece", flair installs, but hardly any of the features described in the tutorials work.
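
For context on the failure below: the build log shows "pkg-config: command not found", so the sentencepiece source build cannot finish. A minimal workaround sketch, assuming Homebrew is available on this Mac (the package names are the usual ones, not something confirmed in this thread):

brew install pkg-config cmake   # tools the bundled sentencepiece build expects
pip3 install flair              # retry once they are on PATH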

Logs and Stack traces

pip3 install flair                                                                                                   [22:46:28]
Collecting flair
  Using cached flair-0.11.3-py3-none-any.whl (401 kB)
Collecting tabulate
  Using cached tabulate-0.9.0-py3-none-any.whl (35 kB)
Collecting langdetect
  Using cached langdetect-1.0.9-py3-none-any.whl
Collecting gdown==4.4.0
  Using cached gdown-4.4.0-py3-none-any.whl
Collecting huggingface-hub
  Downloading huggingface_hub-0.12.1-py3-none-any.whl (190 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 190.3/190.3 kB 2.1 MB/s eta 0:00:00
Collecting matplotlib>=2.2.3
  Downloading matplotlib-3.7.0-cp310-cp310-macosx_11_0_arm64.whl (7.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.3/7.3 MB 35.0 MB/s eta 0:00:00
Collecting hyperopt>=0.2.7
  Using cached hyperopt-0.2.7-py2.py3-none-any.whl (1.6 MB)
Collecting ftfy
  Using cached ftfy-6.1.1-py3-none-any.whl (53 kB)
Collecting konoha<5.0.0,>=4.0.0
  Using cached konoha-4.6.5-py3-none-any.whl (20 kB)
Collecting scikit-learn>=0.21.3
  Downloading scikit_learn-1.2.1-cp310-cp310-macosx_12_0_arm64.whl (8.4 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.4/8.4 MB 61.9 MB/s eta 0:00:00
Collecting mpld3==0.3
  Using cached mpld3-0.3-py3-none-any.whl
Collecting segtok>=1.5.7
  Using cached segtok-1.5.11-py3-none-any.whl (24 kB)
Collecting gensim>=3.4.0
  Using cached gensim-4.3.0-cp310-cp310-macosx_10_9_universal2.whl (24.5 MB)
Collecting more-itertools
  Downloading more_itertools-9.1.0-py3-none-any.whl (54 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.2/54.2 kB 2.1 MB/s eta 0:00:00
Collecting wikipedia-api
  Using cached Wikipedia_API-0.5.8-py3-none-any.whl (13 kB)
Collecting deprecated>=1.2.4
  Using cached Deprecated-1.2.13-py2.py3-none-any.whl (9.6 kB)
Collecting torch!=1.8,>=1.5.0
  Using cached torch-1.13.1-cp310-none-macosx_11_0_arm64.whl (53.2 MB)
Collecting pptree
  Using cached pptree-3.1-py3-none-any.whl
Collecting conllu>=4.0
  Using cached conllu-4.5.2-py2.py3-none-any.whl (16 kB)
Collecting janome
  Using cached Janome-0.4.2-py2.py3-none-any.whl (19.7 MB)
Collecting lxml
  Using cached lxml-4.9.2-cp310-cp310-macosx_10_9_universal2.whl
Collecting regex
  Using cached regex-2022.10.31-cp310-cp310-macosx_11_0_arm64.whl (287 kB)
Collecting python-dateutil>=2.6.1
  Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Requirement already satisfied: sqlitedict>=1.6.0 in /Users/Helena.Boyd/.local/share/virtualenvs/flairtest-ltTnab-O/lib/python3.10/site-packages (from flair) (2.1.0)
Collecting sentencepiece==0.1.95
  Using cached sentencepiece-0.1.95.tar.gz (508 kB)
  Preparing metadata (setup.py) ... done
Collecting transformers>=4.0.0
  Downloading transformers-4.26.1-py3-none-any.whl (6.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.3/6.3 MB 61.2 MB/s eta 0:00:00
Collecting tqdm>=4.26.0
  Downloading tqdm-4.65.0-py3-none-any.whl (77 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 77.1/77.1 kB 3.1 MB/s eta 0:00:00
Collecting bpemb>=0.3.2
  Using cached bpemb-0.3.4-py3-none-any.whl (19 kB)
Collecting six
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting filelock
  Using cached filelock-3.9.0-py3-none-any.whl (9.7 kB)
Collecting beautifulsoup4
  Downloading beautifulsoup4-4.11.2-py3-none-any.whl (129 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 129.4/129.4 kB 5.5 MB/s eta 0:00:00
Collecting requests[socks]
  Using cached requests-2.28.2-py3-none-any.whl (62 kB)
Collecting numpy
  Downloading numpy-1.24.2-cp310-cp310-macosx_11_0_arm64.whl (13.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 13.9/13.9 MB 55.0 MB/s eta 0:00:00
Collecting wrapt<2,>=1.10
  Downloading wrapt-1.15.0-cp310-cp310-macosx_11_0_arm64.whl (36 kB)
Collecting FuzzyTM>=0.4.0
  Using cached FuzzyTM-2.0.5-py3-none-any.whl (29 kB)
Collecting smart-open>=1.8.1
  Using cached smart_open-6.3.0-py3-none-any.whl (56 kB)
Collecting scipy>=1.7.0
  Downloading scipy-1.10.1-cp310-cp310-macosx_12_0_arm64.whl (28.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 28.8/28.8 MB 37.9 MB/s eta 0:00:00
Collecting networkx>=2.2
  Using cached networkx-3.0-py3-none-any.whl (2.0 MB)
Collecting cloudpickle
  Downloading cloudpickle-2.2.1-py3-none-any.whl (25 kB)
Collecting future
  Using cached future-0.18.3-py3-none-any.whl
Collecting py4j
  Using cached py4j-0.10.9.7-py2.py3-none-any.whl (200 kB)
Collecting overrides<4.0.0,>=3.0.0
  Using cached overrides-3.1.0-py3-none-any.whl
Collecting importlib-metadata<4.0.0,>=3.7.0
  Using cached importlib_metadata-3.10.1-py3-none-any.whl (14 kB)
Collecting cycler>=0.10
  Using cached cycler-0.11.0-py3-none-any.whl (6.4 kB)
Collecting packaging>=20.0
  Using cached packaging-23.0-py3-none-any.whl (42 kB)
Collecting contourpy>=1.0.1
  Using cached contourpy-1.0.7-cp310-cp310-macosx_11_0_arm64.whl (229 kB)
Collecting fonttools>=4.22.0
  Using cached fonttools-4.38.0-py3-none-any.whl (965 kB)
Collecting kiwisolver>=1.0.1
  Using cached kiwisolver-1.4.4-cp310-cp310-macosx_11_0_arm64.whl (63 kB)
Collecting pillow>=6.2.0
  Using cached Pillow-9.4.0-cp310-cp310-macosx_11_0_arm64.whl (3.0 MB)
Collecting pyparsing>=2.3.1
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting joblib>=1.1.1
  Using cached joblib-1.2.0-py3-none-any.whl (297 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting typing-extensions
  Downloading typing_extensions-4.5.0-py3-none-any.whl (27 kB)
Requirement already satisfied: tokenizers!=0.11.3,<0.14,>=0.11.1 in /Users/Helena.Boyd/.local/share/virtualenvs/flairtest-ltTnab-O/lib/python3.10/site-packages (from transformers>=4.0.0->flair) (0.13.2)
Collecting pyyaml>=5.1
  Using cached PyYAML-6.0-cp310-cp310-macosx_11_0_arm64.whl (173 kB)
Requirement already satisfied: wcwidth>=0.2.5 in /Users/Helena.Boyd/.local/share/virtualenvs/flairtest-ltTnab-O/lib/python3.10/site-packages (from ftfy->flair) (0.2.6)
Collecting pyfume
  Using cached pyFUME-0.2.25-py3-none-any.whl (67 kB)
Collecting pandas
  Downloading pandas-1.5.3-cp310-cp310-macosx_11_0_arm64.whl (10.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.9/10.9 MB 31.1 MB/s eta 0:00:00
Collecting zipp>=0.5
  Downloading zipp-3.15.0-py3-none-any.whl (6.8 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp310-cp310-macosx_11_0_arm64.whl (122 kB)
Collecting soupsieve>1.2
  Downloading soupsieve-2.4-py3-none-any.whl (37 kB)
Collecting PySocks!=1.5.7,>=1.5.6
  Using cached PySocks-1.7.1-py3-none-any.whl (16 kB)
Collecting pytz>=2020.1
  Using cached pytz-2022.7.1-py2.py3-none-any.whl (499 kB)
Collecting simpful
  Downloading simpful-2.10.0-py3-none-any.whl (31 kB)
Collecting fst-pso
  Using cached fst_pso-1.8.1-py3-none-any.whl
Collecting miniful
  Using cached miniful-0.0.6-py3-none-any.whl
Building wheels for collected packages: sentencepiece
  Building wheel for sentencepiece (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [152 lines of output]
      /Users/Helena.Boyd/.local/share/virtualenvs/flairtest-ltTnab-O/lib/python3.10/site-packages/setuptools/dist.py:770: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
        warnings.warn(
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.macosx-10.9-universal2-cpython-310
      creating build/lib.macosx-10.9-universal2-cpython-310/sentencepiece
      copying src/sentencepiece/__init__.py -> build/lib.macosx-10.9-universal2-cpython-310/sentencepiece
      copying src/sentencepiece/sentencepiece_model_pb2.py -> build/lib.macosx-10.9-universal2-cpython-310/sentencepiece
      copying src/sentencepiece/sentencepiece_pb2.py -> build/lib.macosx-10.9-universal2-cpython-310/sentencepiece
      warning: build_py: byte-compiling is disabled, skipping.

      running build_ext
      /bin/sh: pkg-config: command not found
      Cloning into 'sentencepiece'...
      Note: switching to '0e6dfbf86e2fa6d86a3d9a8a08a628da71c073e0'.

      You are in 'detached HEAD' state. You can look around, make experimental
      changes and commit them, and you can discard any commits you make in this
      state without impacting any branches by switching back to a branch.

      If you want to create a new branch to retain commits you create, you may
      do so (now or later) by using -c with the switch command. Example:

        git switch -c <new-branch-name>

      Or undo this operation with:

        git switch -

      Turn off this advice by setting config variable advice.detachedHead to false

      -- VERSION: 0.1.95
      -- The C compiler identification is AppleClang 13.0.0.13000027
      -- The CXX compiler identification is AppleClang 13.0.0.13000027
      -- Detecting C compiler ABI info
      -- Detecting C compiler ABI info - done
      -- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc - skipped
      -- Detecting C compile features
      -- Detecting C compile features - done
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++ - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
      -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
      -- Found Threads: TRUE
      -- Not Found TCMalloc: TCMALLOC_LIB-NOTFOUND
      -- Configuring done
      -- Generating done
      -- Build files have been written to: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/sentencepiece/build
      ./build_bundled.sh: line 16: nproc: command not found
      [  1%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/builder.cc.o
      [  3%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/arena.cc.o
      [  4%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/unicode_script.cc.o
      [  6%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/arenastring.cc.o
      [  7%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/bytestream.cc.o
      [  9%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/strutil.cc.o
      [ 11%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/coded_stream.cc.o
      [ 12%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/unigram_model_trainer.cc.o
      [ 14%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/common.cc.o
      [ 15%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/extension_set.cc.o
      [ 17%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/trainer_factory.cc.o
      [ 19%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/char_model_trainer.cc.o
      [ 20%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/word_model_trainer.cc.o
      [ 22%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/generated_message_util.cc.o
      [ 23%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/generated_enum_util.cc.o
      [ 25%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/generated_message_table_driven_lite.cc.o
      [ 26%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/error.cc.o
      [ 28%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/sentencepiece_trainer.cc.o
      [ 30%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/int128.cc.o
      [ 31%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/pretokenizer_for_training.cc.o
      [ 33%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/parse_context.cc.o
      [ 34%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/message_lite.cc.o
      [ 36%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/bpe_model_trainer.cc.o
      [ 38%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/repeated_field.cc.o
      [ 39%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/zero_copy_stream_impl.cc.o
      [ 41%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/statusor.cc.o
      [ 42%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/time.cc.o
      [ 44%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/io_win32.cc.o
      [ 46%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/stringprintf.cc.o
      [ 47%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/stringpiece.cc.o
      [ 49%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/status.cc.o
      [ 50%] Building CXX object src/CMakeFiles/sentencepiece_train-static.dir/trainer_interface.cc.o
      [ 52%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/structurally_valid.cc.o
      [ 53%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/char_model.cc.o
      [ 55%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/zero_copy_stream_impl_lite.cc.o
      [ 57%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/zero_copy_stream.cc.o
      [ 58%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/word_model.cc.o
      [ 60%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/implicit_weak_message.cc.o
      [ 61%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/protobuf-lite/wire_format_lite.cc.o
      [ 63%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/model_interface.cc.o
      [ 65%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/bpe_model.cc.o
      [ 66%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/builtin_pb/sentencepiece.pb.cc.o
      [ 68%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/builtin_pb/sentencepiece_model.pb.cc.o
      [ 69%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/sentencepiece_processor.cc.o
      [ 71%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/unigram_model.cc.o
      [ 73%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/normalizer.cc.o
      [ 74%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/util.cc.o
      [ 76%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/model_factory.cc.o
      [ 77%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/filesystem.cc.o
      [ 79%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/absl/strings/string_view.cc.o
      [ 80%] Building CXX object src/CMakeFiles/sentencepiece-static.dir/__/third_party/absl/flags/flag.cc.o
      /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/sentencepiece/src/builder.cc:47:15: warning: unused variable 'kMaxUnicode' [-Wunused-const-variable]
      constexpr int kMaxUnicode = 0x10FFFF;
                    ^
      /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/sentencepiece/src/builder.cc:49:23: warning: unused variable 'kDefaultNormalizerName' [-Wunused-const-variable]
      static constexpr char kDefaultNormalizerName[] = "nfkc";
                            ^
      2 warnings generated.
      [ 82%] Linking CXX static library libsentencepiece_train.a
      [ 84%] Linking CXX static library libsentencepiece.a
      [ 84%] Built target sentencepiece_train-static
      [ 84%] Built target sentencepiece-static
      [ 85%] Building CXX object src/CMakeFiles/spm_encode.dir/spm_encode_main.cc.o
      [ 87%] Building CXX object src/CMakeFiles/spm_normalize.dir/spm_normalize_main.cc.o
      [ 88%] Building CXX object src/CMakeFiles/spm_decode.dir/spm_decode_main.cc.o
      [ 90%] Building CXX object src/CMakeFiles/spm_export_vocab.dir/spm_export_vocab_main.cc.o
      [ 92%] Building CXX object src/CMakeFiles/spm_train.dir/spm_train_main.cc.o
      [ 93%] Linking CXX executable spm_export_vocab
      [ 93%] Built target spm_export_vocab
      [ 95%] Linking CXX executable spm_normalize
      [ 96%] Linking CXX executable spm_train
      [ 96%] Built target spm_normalize
      [ 96%] Built target spm_train
      [ 98%] Linking CXX executable spm_decode
      [ 98%] Built target spm_decode
      [100%] Linking CXX executable spm_encode
      [100%] Built target spm_encode
      [ 66%] Built target sentencepiece-static
      [ 84%] Built target sentencepiece_train-static
      [ 87%] Built target spm_encode
      [ 90%] Built target spm_decode
      [ 93%] Built target spm_normalize
      [ 96%] Built target spm_train
      [100%] Built target spm_export_vocab
      Install the project...
      -- Install configuration: ""
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/lib/pkgconfig/sentencepiece.pc
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/lib/libsentencepiece.a
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/lib/libsentencepiece_train.a
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_encode
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_decode
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_normalize
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_train
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_export_vocab
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/include/sentencepiece_trainer.h
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/include/sentencepiece_processor.h
      env: pkg-config: No such file or directory
      Failed to find sentencepiece pkg-config
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for sentencepiece
  Running setup.py clean for sentencepiece
Failed to build sentencepiece
Installing collected packages: sentencepiece, pytz, py4j, pptree, overrides, mpld3, janome, charset-normalizer, zipp, wrapt, urllib3, typing-extensions, tqdm, threadpoolctl, tabulate, soupsieve, smart-open, six, regex, pyyaml, PySocks, pyparsing, pillow, packaging, numpy, networkx, more-itertools, lxml, kiwisolver, joblib, idna, future, ftfy, fonttools, filelock, cycler, conllu, cloudpickle, certifi, torch, segtok, scipy, requests, python-dateutil, langdetect, importlib-metadata, deprecated, contourpy, beautifulsoup4, wikipedia-api, simpful, scikit-learn, pandas, miniful, matplotlib, konoha, hyperopt, huggingface-hub, transformers, gdown, fst-pso, pyfume, FuzzyTM, gensim, bpemb, flair
  Running setup.py install for sentencepiece ... error
  error: subprocess-exited-with-error

  × Running setup.py install for sentencepiece did not run successfully.
  │ exit code: 1
  ╰─> [55 lines of output]
      /Users/Helena.Boyd/.local/share/virtualenvs/flairtest-ltTnab-O/lib/python3.10/site-packages/setuptools/dist.py:770: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
        warnings.warn(
      running install
      /Users/Helena.Boyd/.local/share/virtualenvs/flairtest-ltTnab-O/lib/python3.10/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
        warnings.warn(
      running build
      running build_py
      creating build
      creating build/lib.macosx-10.9-universal2-cpython-310
      creating build/lib.macosx-10.9-universal2-cpython-310/sentencepiece
      copying src/sentencepiece/__init__.py -> build/lib.macosx-10.9-universal2-cpython-310/sentencepiece
      copying src/sentencepiece/sentencepiece_model_pb2.py -> build/lib.macosx-10.9-universal2-cpython-310/sentencepiece
      copying src/sentencepiece/sentencepiece_pb2.py -> build/lib.macosx-10.9-universal2-cpython-310/sentencepiece
      warning: build_py: byte-compiling is disabled, skipping.

      running build_ext
      /bin/sh: pkg-config: command not found
      mkdir: bundled: File exists
      fatal: destination path 'sentencepiece' already exists and is not an empty directory.
      fatal: destination path 'sentencepiece' already exists and is not an empty directory.
      mkdir: build: File exists
      -- VERSION: 0.1.95
      -- Not Found TCMalloc: TCMALLOC_LIB-NOTFOUND
      -- Configuring done
      -- Generating done
      -- Build files have been written to: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/sentencepiece/build
      ./build_bundled.sh: line 16: nproc: command not found
      [ 17%] Built target sentencepiece_train-static
      [ 84%] Built target sentencepiece-static
      [ 87%] Built target spm_encode
      [ 90%] Built target spm_decode
      [ 93%] Built target spm_normalize
      [ 96%] Built target spm_train
      [100%] Built target spm_export_vocab
      [ 66%] Built target sentencepiece-static
      [ 84%] Built target sentencepiece_train-static
      [ 87%] Built target spm_encode
      [ 90%] Built target spm_decode
      [ 93%] Built target spm_normalize
      [ 96%] Built target spm_train
      [100%] Built target spm_export_vocab
      Install the project...
      -- Install configuration: ""
      -- Up-to-date: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/lib/pkgconfig/sentencepiece.pc
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/lib/libsentencepiece.a
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/lib/libsentencepiece_train.a
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_encode
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_decode
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_normalize
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_train
      -- Installing: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/bin/spm_export_vocab
      -- Up-to-date: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/include/sentencepiece_trainer.h
      -- Up-to-date: /private/var/folders/c1/ggddkdqn54v1cpm8jl682ygr0000gq/T/pip-install-x1lgtehf/sentencepiece_55e71fbe13ac44c6bc59b9af03cc006d/bundled/include/sentencepiece_processor.h
      env: pkg-config: No such file or directory
      Failed to find sentencepiece pkg-config
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: legacy-install-failure

× Encountered error while trying to install package.
╰─> sentencepiece

note: This is an issue with the package mentioned above, not pip.
hint: See above for output from the failure.
(flairtest) FAIL

Screenshots

No response

Additional Context

No response

Environment

I cannot run collect_env.py because I cannot install flair.

helpmefindaname commented 1 year ago

Hi @boydxh , this is a known issue, see https://github.com/flairNLP/flair/issues/2969 https://github.com/flairNLP/flair/issues/2853 https://github.com/flairNLP/flair/issues/2833 https://github.com/flairNLP/flair/issues/2105

Currently you have the following options:

alanakbik commented 1 year ago

Hello @boydxh, Flair 0.12 is now released. Could you try another install to see if it works now?

Sajjad-Mahmoudi commented 1 year ago

Hi, I just tried pip install flair on Win11 (Anaconda Prompt). It didn't work:

ERROR: Failed building wheel for tokenizers
Failed to build gensim tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects

Btw, installing from git didn't work either.

alanakbik commented 1 year ago

@Sajjad-Mahmoudi Flair 0.12 now requires at least Python 3.7. Could you share more details of your setup?

Sajjad-Mahmoudi commented 1 year ago

I could install it with Python 3.10. However, I have new issues (using PyCharm):

1) from flair.models import MultiTagger fails with: Cannot find reference 'MultiTagger' in '__init__.py'
2) When I run the command below:

sentence = Sentence("Behavioral abnormalities in the Fmr1 KO2 Mouse Model of Fragile X Syndrome", use_tokenizer=SciSpacyTokenizer())

I got the error below:

OSError: [E053] Could not read config file from C:\Users\mahmo\anaconda3\envs\hunFlair\lib\site-packages\en_core_sci_sm\en_core_sci_sm-0.2.5\config.cfg

alanakbik commented 1 year ago

@Sajjad-Mahmoudi the MultiTagger was absorbed into the new MultitaskModel. The bio tagger is now loaded like this:

from flair.data import Sentence
from flair.nn import Classifier

# make a sentence
sentence = Sentence('Behavioral abnormalities in the Fmr1 KO2 Mouse Model of Fragile X Syndrome.')

# load the NER tagger
tagger = Classifier.load('bioner')

# run NER over sentence
tagger.predict(sentence)

# print the sentence with all annotations
print(sentence)

See: https://github.com/flairNLP/flair/blob/master/resources/docs/TUTORIAL_TAGGING_NER.md#biomedical-data
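
The second error (E053) is not covered by the answer above: it typically appears when a spaCy 2.x model package (here en_core_sci_sm 0.2.5) is loaded by a spaCy 3.x runtime, which expects a config.cfg. A quick diagnostic sketch, using only the package names already shown in the error:

# compare the spaCy runtime version against the installed model release
python -c "import spacy; print(spacy.__version__)"
pip show en-core-sci-sm

If the major versions disagree, installing a scispacy model release built for the installed spaCy version should resolve the OSError.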

boydxh commented 1 year ago

Hello @boydxh, Flair 0.12 is now released. Could you try another install to see if it works now?

Hi! Just tried again - it successfully installed without the error message, but when I tried the first tutorial I received an error message. So I tried running collect_env.py and got the exact same error message. Output below.

$ python3 collect_env.py                                                [0:24:07]
Traceback (most recent call last):
  File "/Users/Helena.Boyd/nlp/collect_env.py", line 4, in <module>
    import flair
  File "/Users/Helena.Boyd/.local/share/virtualenvs/nlp-j9jQos4W/lib/python3.10/site-packages/flair/__init__.py", line 19, in <module>
    from . import models
  File "/Users/Helena.Boyd/.local/share/virtualenvs/nlp-j9jQos4W/lib/python3.10/site-packages/flair/models/__init__.py", line 1, in <module>
    from .clustering import ClusteringModel
  File "/Users/Helena.Boyd/.local/share/virtualenvs/nlp-j9jQos4W/lib/python3.10/site-packages/flair/models/clustering.py", line 13, in <module>
    from flair.datasets import DataLoader
  File "/Users/Helena.Boyd/.local/share/virtualenvs/nlp-j9jQos4W/lib/python3.10/site-packages/flair/datasets/__init__.py", line 125, in <module>
    from .biomedical import ANAT_EM
  File "/Users/Helena.Boyd/.local/share/virtualenvs/nlp-j9jQos4W/lib/python3.10/site-packages/flair/datasets/biomedical.py", line 25, in <module>
    from flair.tokenization import (
ImportError: cannot import name 'SentenceSplitter' from 'flair.tokenization' (/Users/Helena.Boyd/.local/share/virtualenvs/nlp-j9jQos4W/lib/python3.10/site-packages/flair/tokenization.py)
(nlp) FAIL

alanakbik commented 1 year ago

@boydxh thanks for the update! The output is strange since it looks like it is trying to load some classes from old locations. Could you paste the code that throws this error?
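
One plausible explanation, not confirmed in this thread, is that files from the earlier flair 0.11.3 install were left behind in the virtualenv. A minimal clean-reinstall sketch:

pip3 uninstall -y flair             # remove any stale flair files
pip3 install --no-cache-dir flair   # reinstall without reusing cached wheels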

boydxh commented 1 year ago

@boydxh thanks for the update! The output is strange since it looks like it is trying to load some classes from old locations. Could you paste the code that throws this error?

Hi again, I tried running the repo's setup.py, which bumps the flair version to 0.12:

from setuptools import find_packages, setup

with open("requirements.txt") as f:
    required = f.read().splitlines()

setup(
    name="flair",
    version="0.12",
    description="A very simple framework for state-of-the-art NLP",
    long_description=open("README.md", encoding="utf-8").read(),
    long_description_content_type="text/markdown",
    author="Alan Akbik",
    author_email="alan.akbik@gmail.com",
    url="https://github.com/flairNLP/flair",
    packages=find_packages(exclude="tests"),  # same as name
    license="MIT",
    install_requires=required,
    include_package_data=True,
    python_requires=">=3.7",
)
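
With a setup.py like this, a from-source install is typically run from the repository root; this is a generic sketch rather than a step confirmed in the thread:

cd flair          # a local clone of flairNLP/flair
pip3 install .    # or: pip3 install -e . for an editable install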