Closed tyoc213 closed 3 years ago
Thanks @tyoc213 for reporting this. Are you running this on your local machine? Have you had a look at the notebooks mentioned here? They may help in the meanwhile, while I take a look at this issue.
Can you try doing this on your CLI:
pip install "en-core-web-sm @ https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.3.0/en_core_web_sm-2.3.0.tar.gz"
and then
pip install -U nlp_profiler
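As a quick sanity check after those two commands, you can verify that the model package is importable. This is a minimal sketch using only the standard library; it assumes the model installs under the package name `en_core_web_sm`, which is how spaCy 2.x model wheels are packaged:

```python
import importlib.util

def model_installed(package_name: str) -> bool:
    """Return True if the given package (e.g. a spaCy model) can be imported."""
    return importlib.util.find_spec(package_name) is not None

# Prints True once the model package has been installed successfully.
print(model_installed("en_core_web_sm"))
```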
My hunch is that you have spacy==2.3.2, while the nlp_profiler requirements.txt file is trying to install en_core_web_sm-2.3.0 - I will try to find another way to fix this.
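For what it's worth, spaCy 2.x models are versioned to track spaCy's major.minor release, so a 2.3.0 model is expected to work with any spaCy 2.3.x. The hunch above can be sketched as a small version check (an illustrative helper, not nlp_profiler's actual code; pip's real resolution logic is more involved):

```python
def compatible(spacy_version: str, model_version: str) -> bool:
    """spaCy 2.x models are compatible when their major.minor
    matches spaCy's, regardless of the patch number."""
    return spacy_version.split(".")[:2] == model_version.split(".")[:2]

print(compatible("2.3.2", "2.3.0"))  # same 2.3 line, so compatible
print(compatible("2.3.2", "2.2.5"))  # different minor version, not compatible
```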
@tyoc213 I found an option for conda/miniconda users:
conda config --set pip_interop_enabled True # tested this on a different package and it worked
pip install -U nlp_profiler
Please let me know how this works for you.
Interop doesn't work, same error:
$ pip install -U nlp_profiler
Collecting nlp_profiler
Using cached nlp_profiler-0.0.2-py2.py3-none-any.whl (39 kB)
Collecting swifter>=1.0.3
Using cached swifter-1.0.7.tar.gz (633 kB)
Collecting textblob>=0.15.3
Using cached textblob-0.15.3-py2.py3-none-any.whl (636 kB)
Requirement already satisfied, skipping upgrade: nltk>=3.5 in /home/tyoc213/miniconda3/envs/fastai/lib/python3.8/site-packages (from nlp_profiler) (3.5)
Requirement already satisfied, skipping upgrade: requests>=2.23.0 in /home/tyoc213/miniconda3/envs/fastai/lib/python3.8/site-packages (from nlp_profiler) (2.24.0)
Requirement already satisfied, skipping upgrade: tqdm>=4.46.0 in /home/tyoc213/miniconda3/envs/fastai/lib/python3.8/site-packages (from nlp_profiler) (4.48.2)
Collecting language-tool-python>=2.3.1
Using cached language_tool_python-2.4.7-py3-none-any.whl (30 kB)
Requirement already satisfied, skipping upgrade: joblib>=0.14.1 in /home/tyoc213/miniconda3/envs/fastai/lib/python3.8/site-packages (from nlp_profiler) (0.16.0)
ERROR: Could not find a version that satisfies the requirement en-core-web-sm (from nlp_profiler) (from versions: none)
ERROR: No matching distribution found for en-core-web-sm (from nlp_profiler)
But python -m spacy download en_core_web_sm finished successfully, so I could install it afterwards:
Successfully built emoji swifter locket gpustat nvidia-ml-py3
Installing collected packages: language-tool-python, emoji, textblob, fsspec, toolz, locket, partd, dask, pyarrow, colorama, opencensus-context, pyasn1, pyasn1-modules, rsa, cachetools, google-auth, googleapis-common-protos, google-api-core, opencensus, nvidia-ml-py3, blessings, gpustat, async-timeout, multidict, yarl, aiohttp, aiohttp-cors, py-spy, hiredis, aioredis, colorful, redis, soupsieve, beautifulsoup4, google, msgpack, grpcio, ray, modin, swifter, nlp-profiler
Attempting uninstall: pyarrow
Found existing installation: pyarrow 2.0.0
Uninstalling pyarrow-2.0.0:
Successfully uninstalled pyarrow-2.0.0
ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts.
We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.
modin 0.8.2 requires pandas==1.1.4, but you'll have pandas 1.1.1 which is incompatible.
Successfully installed aiohttp-3.7.3 aiohttp-cors-0.7.0 aioredis-1.3.1 async-timeout-3.0.1 beautifulsoup4-4.9.3 blessings-1.7 cachetools-4.2.0 colorama-0.4.4 colorful-0.5.4 dask-2020.12.0 emoji-0.6.0 fsspec-0.8.4 google-3.0.0 google-api-core-1.23.0 google-auth-1.24.0 googleapis-common-protos-1.52.0 gpustat-0.6.0 grpcio-1.34.0 hiredis-1.1.0 language-tool-python-2.4.7 locket-0.2.0 modin-0.8.2 msgpack-1.0.1 multidict-5.1.0 nlp-profiler-0.0.2 nvidia-ml-py3-7.352.0 opencensus-0.7.11 opencensus-context-0.1.2 partd-1.1.0 py-spy-0.3.3 pyarrow-1.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 ray-1.0.1.post1 redis-3.4.1 rsa-4.6 soupsieve-2.1 swifter-1.0.7 textblob-0.15.3 toolz-0.11.1 yarl-1.6.3
Thanks @tyoc213 I will update the README/Docs so conda users know what to do
You mean:
python -m spacy download en_core_web_sm
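If it helps to script that step, the same download can be driven from Python via subprocess. This is a sketch that just builds the equivalent command using the current interpreter; actually running it requires spaCy to be installed:

```python
import sys
import subprocess

def build_download_cmd(model: str = "en_core_web_sm") -> list:
    """Build the command equivalent to `python -m spacy download <model>`,
    pinned to the interpreter currently running (avoids PATH surprises)."""
    return [sys.executable, "-m", "spacy", "download", model]

cmd = build_download_cmd()
print(cmd)
# To actually run it (needs spaCy installed in this environment):
# subprocess.run(cmd, check=True)
```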
I can see now why installing is failing for you (and other conda users). Although this has been taken care of in general for conda users, I might have to cover this in the docs.
pip install nlp_profiler
shows this error.
To Reproduce: there is no dataframe to share because it can't be installed.