That's strange. Are you sure Colab is picking up the right dependencies and using Theano as the backend?
An update on my side: I then located the requirements file and installed from it. The install runs, but it still returns similar incompatibility errors.
import os
os.chdir('/content/twitter-emotion-recognition')
!python3 -m pip install -r requirements.txt
After running pip install --upgrade setuptools, the following error is shown:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
datascience 0.10.6 requires folium==0.2.1, but you have folium 0.8.3 which is incompatible.
Then, after collecting all the packages, the following errors are shown:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
xarray 0.18.0 requires numpy>=1.17, but you have numpy 1.16.0 which is incompatible.
xarray 0.18.0 requires pandas>=1.0, but you have pandas 0.24.1 which is incompatible.
textgenrnn 1.4.1 requires keras>=2.1.5, but you have keras 1.1.0 which is incompatible.
tensorflow 2.4.1 requires h5py~=2.10.0, but you have h5py 2.9.0 which is incompatible.
tensorflow 2.4.1 requires numpy~=1.19.2, but you have numpy 1.16.0 which is incompatible.
tensorflow 2.4.1 requires six~=1.15.0, but you have six 1.12.0 which is incompatible.
pyarrow 3.0.0 requires numpy>=1.16.6, but you have numpy 1.16.0 which is incompatible.
plotnine 0.6.0 requires pandas>=0.25.0, but you have pandas 0.24.1 which is incompatible.
mizani 0.6.0 requires pandas>=0.25.0, but you have pandas 0.24.1 which is incompatible.
kapre 0.1.3.1 requires keras>=2.0.0, but you have keras 1.1.0 which is incompatible.
google-colab 1.0.0 requires pandas~=1.1.0; python_version >= "3.0", but you have pandas 0.24.1 which is incompatible.
google-colab 1.0.0 requires six~=1.15.0, but you have six 1.12.0 which is incompatible.
google-api-python-client 1.12.8 requires six<2dev,>=1.13.0, but you have six 1.12.0 which is incompatible.
google-api-core 1.26.3 requires six>=1.13.0, but you have six 1.12.0 which is incompatible.
fbprophet 0.7.1 requires pandas>=1.0.4, but you have pandas 0.24.1 which is incompatible.
fancyimpute 0.4.3 requires keras>=2.0.0, but you have keras 1.1.0 which is incompatible.
datascience 0.10.6 requires folium==0.2.1, but you have folium 0.8.3 which is incompatible.
astropy 4.2.1 requires numpy>=1.17, but you have numpy 1.16.0 which is incompatible.
albumentations 0.1.12 requires imgaug<0.2.7,>=0.2.5, but you have imgaug 0.2.9 which is incompatible.
Maybe this helps: #22 (comment)?
Thank you for your prompt reply, Niki. (I deleted my previous reply and realized I also deleted my thanks, lol.)
Another small update on my end:
For the first error I mentioned above, I reinstalled folium and that eliminated the error:
!pip install folium==0.2.1
But for the other errors, I don't know what to do. Can I install the newer versions of the packages indicated in the error messages?
Hmmm. Some things I noticed:
datascience 0.10.6 requires folium==0.2.1, but you have folium 0.8.3 which is incompatible.
I'm not sure what the datascience 0.10.6 dependency is or who requires it, but I'm pretty sure it is not required by twitter-emotion-recognition.
Similarly,
tensorflow 2.4.1 requires h5py~=2.10.0, but you have h5py 2.9.0 which is incompatible.
tensorflow 2.4.1 requires numpy~=1.19.2, but you have numpy 1.16.0 which is incompatible.
tensorflow 2.4.1 requires six~=1.15.0, but you have six 1.12.0 which is incompatible.
tensorflow is not required by twitter-emotion-recognition.
Perhaps first try locally on your computer with a fresh, empty pip virtual environment, and then figure it out on Colab?
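Something along these lines might be a starting point (just a rough, untested sketch on my part; emotion-env is only a placeholder name, and on Windows the activation step would be emotion-env\Scripts\activate instead of source):
python3 -m venv emotion-env
source emotion-env/bin/activate
git clone https://github.com/nikicc/twitter-emotion-recognition.git
cd twitter-emotion-recognition
pip install --upgrade pip setuptools wheel
pip install -r requirements.txt
That way the pinned versions from requirements.txt live only inside the virtual environment and don't fight with whatever Colab or Anaconda pre-installs.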
Can I use the same installation commands for my local environment in Jupyter Lab?
Can I use the same installation commands for my local environment in Jupyter Lab?
I don't see why not.
Hi Niki, I think I might need more help with installing the packages in my local environment. I have been experimenting in Jupyter Lab on a Windows 10 system, but I keep receiving the following warnings and errors, and I don't know what I did wrong. Here are all the commands I am using:
!py -m pip install --user virtualenv
!mkdir emotion
!cd emotion
!py -m venv emotion
!.\emotion\Scripts\activate
!git clone https://github.com/nikicc/twitter-emotion-recognition.git
!cd twitter-emotion-recognition
!py -m pip install --upgrade pip
!py -m pip install wheel
!py -m pip install --upgrade setuptools
!py -m pip install h5py==2.9.0
!py -m pip install Keras==1.1.0
!py -m pip install numpy==1.16.0
!py -m pip install pandas==0.24.1
!py -m pip install python-dateutil==2.8.0
!py -m pip install pytz==2018.9
!py -m pip install PyYAML==5.1
!py -m pip install scipy==1.2.1
!py -m pip install six==1.12.0
!py -m pip install Theano==1.0.4
Here are the outputs I got after trying to install the required packages:
Collecting h5py==2.9.0
Using cached h5py-2.9.0.tar.gz (287 kB)
Requirement already satisfied: numpy>=1.7 in c:\users\bingy\anaconda3\lib\site-packages (from h5py==2.9.0) (1.20.1)
Requirement already satisfied: six in c:\users\bingy\anaconda3\lib\site-packages (from h5py==2.9.0) (1.15.0)
Building wheels for collected packages: h5py
Building wheel for h5py (setup.py): started
Building wheel for h5py (setup.py): finished with status 'error'
Running setup.py clean for h5py
Failed to build h5py
Installing collected packages: h5py
Attempting uninstall: h5py
Found existing installation: h5py 2.10.0
Uninstalling h5py-2.10.0:
Successfully uninstalled h5py-2.10.0
Running setup.py install for h5py: started
Running setup.py install for h5py: finished with status 'error'
Rolling back uninstall of h5py
Moving to c:\users\bingy\anaconda3\lib\site-packages\h5py-2.10.0.dist-info\
from C:\Users\bingy\anaconda3\Lib\site-packages\~5py-2.10.0.dist-info
Moving to c:\users\bingy\anaconda3\lib\site-packages\h5py\
from C:\Users\bingy\anaconda3\Lib\site-packages\~5py
ERROR: Command errored out with exit status 1:
command: 'C:\Users\bingy\anaconda3\python.exe' -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\bingy\\AppData\\Local\\Temp\\pip-install-cf8xkhwg\\h5py_5401da9c3833464186b7a7b09b1e95a4\\setup.py'"'"'; __file__='"'"'C:\\Users\\bingy\\AppData\\Local\\Temp\\pip-install-cf8xkhwg\\h5py_5401da9c3833464186b7a7b09b1e95a4\\setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'C:\Users\bingy\AppData\Local\Temp\pip-wheel-94ge6ie6'
cwd: C:\Users\bingy\AppData\Local\Temp\pip-install-cf8xkhwg\h5py_5401da9c3833464186b7a7b09b1e95a4\
Complete output (1316 lines):
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-3.8
creating build\lib.win-amd64-3.8\h5py
copying h5py\h5py_warnings.py -> build\lib.win-amd64-3.8\h5py
copying h5py\highlevel.py -> build\lib.win-amd64-3.8\h5py
copying h5py\ipy_completer.py -> build\lib.win-amd64-3.8\h5py
copying h5py\version.py -> build\lib.win-amd64-3.8\h5py
copying h5py\__init__.py -> build\lib.win-amd64-3.8\h5py
creating build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\attrs.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\base.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\compat.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\dataset.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\datatype.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\dims.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\files.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\filters.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\group.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\selections.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\selections2.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\vds.py -> build\lib.win-amd64-3.8\h5py\_hl
copying h5py\_hl\__init__.py -> build\lib.win-amd64-3.8\h5py\_hl
creating build\lib.win-amd64-3.8\h5py\tests
copying h5py\tests\common.py -> build\lib.win-amd64-3.8\h5py\tests
copying h5py\tests\__init__.py -> build\lib.win-amd64-3.8\h5py\tests
creating build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_attrs.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_attrs_data.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_base.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_dataset.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_datatype.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_dimension_scales.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_file.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_file_image.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_group.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_h5.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_h5d_direct_chunk_write.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_h5f.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_h5p.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_h5t.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_objects.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_selections.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\test_slicing.py -> build\lib.win-amd64-3.8\h5py\tests\old
copying h5py\tests\old\__init__.py -> build\lib.win-amd64-3.8\h5py\tests\old
creating build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_attribute_create.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_dataset_getitem.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_dataset_swmr.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_datatype.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_deprecation.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_dims_dimensionproxy.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_file.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_filters.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\test_threads.py -> build\lib.win-amd64-3.8\h5py\tests\hl
copying h5py\tests\hl\__init__.py -> build\lib.win-amd64-3.8\h5py\tests\hl
creating build\lib.win-amd64-3.8\h5py\tests\hl\test_vds
copying h5py\tests\hl\test_vds\test_highlevel_vds.py -> build\lib.win-amd64-3.8\h5py\tests\hl\test_vds
copying h5py\tests\hl\test_vds\test_lowlevel_vds.py -> build\lib.win-amd64-3.8\h5py\tests\hl\test_vds
copying h5py\tests\hl\test_vds\test_virtual_source.py -> build\lib.win-amd64-3.8\h5py\tests\hl\test_vds
copying h5py\tests\hl\test_vds\__init__.py -> build\lib.win-amd64-3.8\h5py\tests\hl\test_vds
running build_ext
Autodetection skipped [Could not find module 'libhdf5.so' (or one of its dependencies). Try using the full path with constructor syntax.]
C:\Users\bingy\anaconda3\lib\site-packages\Cython\Compiler\Main.py:369: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: C:\Users\bingy\AppData\Local\Temp\pip-install-cf8xkhwg\h5py_5401da9c3833464186b7a7b09b1e95a4\h5py\_conv.pxd
tree = Parsing.p_module(s, pxd, full_module_name)
warning: h5py\api_types_hdf5.pxd:408:2: 'H5D_layout_t' redeclared
warning: h5py\api_types_hdf5.pxd:415:2: 'H5D_alloc_time_t' redeclared
warning: h5py\api_types_hdf5.pxd:422:2: 'H5D_space_status_t' redeclared
warning: h5py\api_types_hdf5.pxd:428:2: 'H5D_fill_time_t' redeclared
warning: h5py\api_types_hdf5.pxd:434:2: 'H5D_fill_value_t' redeclared
warning: h5py\api_types_hdf5.pxd:446:7: 'H5F_close_degree_t' redeclared
....
Sorry for the massive logs. I wonder where I went wrong? I am not familiar with Anaconda; it usually drives me crazy lol. Thank you in advance!
I accidentally closed the issue. Sorry about that! Another update from me.
I then tried to install the packages without specifying the versions, and it seems to work for me now.
!pip install Keras
!pip install numpy
!pip install pandas
!pip install python-dateutil
!pip install pytz
!pip install PyYAML
!pip install scipy
!pip install six
!pip install Theano
But I got the following warnings:
Using Theano backend.
WARNING (theano.configdefaults): g++ not available, if using conda: `conda install m2w64-toolchain`
C:\Users\bingy\anaconda3\lib\site-packages\theano\configdefaults.py:560: UserWarning: DeprecationWarning: there is no c++ compiler.This is deprecated and with Theano 0.11 a c++ compiler will be mandatory
warnings.warn("DeprecationWarning: there is no c++ compiler."
WARNING (theano.configdefaults): g++ not detected ! Theano will be unable to execute optimized C-implementations (for both CPU and GPU) and will default to Python implementations. Performance will be severely degraded. To remove this warning, set Theano flags cxx to an empty string.
WARNING (theano.tensor.blas): Using NumPy C-API based implementation for BLAS functions.
Is that gonna be ok to use the model?
Installing collected packages: h5py
Attempting uninstall: h5py
Found existing installation: h5py 2.10.0
Uninstalling h5py-2.10.0:
Successfully uninstalled h5py-2.10.0
Running setup.py install for h5py: started
Running setup.py install for h5py: finished with status 'error'
Rolling back uninstall of h5py
Moving to c:\users\bingy\anaconda3\lib\site-packages\h5py-2.10.0.dist-info\
from C:\Users\bingy\anaconda3\Lib\site-packages\~5py-2.10.0.dist-info
Moving to c:\users\bingy\anaconda3\lib\site-packages\h5py\
from C:\Users\bingy\anaconda3\Lib\site-packages\~5py
ERROR: Command errored out with exit status 1:
Seems like there is some issue with installing h5py==2.9.0. Not sure what is wrong. Maybe try a different version of h5py?
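For example (just a guess on my part, untested): h5py 2.10.0, the version that was already in your Anaconda environment, I believe ships a prebuilt Windows wheel for Python 3.8, so it should skip the failing source build:
!py -m pip install h5py==2.10.0
Since 2.10.0 is also the version pip rolled back to, it may simply report "Requirement already satisfied", in which case you could just leave it as is.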
Then I tried to install packages without specifying the versions, it seems to work for me now.
Not sure that will work, as there might be issues with the latest version of Keras. You can try though, but I wouldn't keep my hopes up.
Is that gonna be ok to use the model?
I'm not sure it will. I suggest you try to keep at least the Keras version the same as specified in the dependencies file.
WARNING (theano.configdefaults): g++ not available, if using conda: `conda install m2w64-toolchain`
Did you try conda install m2w64-toolchain?
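If you want to go that route, something like this should work inside the same Anaconda environment (untested on my side):
conda install m2w64-toolchain
Alternatively, the warning itself says you can set the Theano cxx flag to an empty string (for example set THEANO_FLAGS=cxx= in a Windows command prompt before starting Jupyter), but then Theano falls back to the slow pure-Python implementations.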
Hi Niki,
Thank you so much for your replies.
Not sure that will work, as there might be issues with the latest version of Keras. You can try though, but I wouldn't keep my hopes up.
I think I somehow installed the required versions despite the bunch of warnings and errors I received. That is why it works when I install them again without specifying versions: everything shows up as already satisfied.
Did you try conda install m2w64-toolchain?
I installed m2w64-toolchain later, and the warning is gone (YEAH)! To speed up the computation, I installed theano.gpuarray in conda. It lets me process 200k within 6 hours.
Thank you again for developing such cool models! I look forward to your future work!
@BingyiWu1990 happy to hear it works.
Dear Niki,
I was running the prediction fine until this afternoon. After installing the required environment and packages, it started returning the following error:
!sudo apt-get install -y python3-venv
!sudo apt-get install -y python3-dev
!mkdir emotion
!cd emotion
!python3 -m venv .
!source bin/activate
!git clone https://github.com/nikicc/twitter-emotion-recognition.git
!cd twitter-emotion-recognition
!pip install --upgrade pip
!pip install wheel
!pip install --upgrade setuptools
!pip install h5py==2.9.0
!pip install Keras==1.1.0
!pip install numpy==1.16.0
!pip install pandas==0.24.1
!pip install python-dateutil==2.8.0
!pip install pytz==2018.9
!pip install PyYAML==5.4
!pip install scipy==1.2.1
!pip install six==1.12.0
!pip install Theano==1.0.4

import os
os.environ['KERAS_BACKEND'] = 'theano'
os.chdir('/content/twitter-emotion-recognition')
import pandas as pd
from emotion_predictor import EmotionPredictor

pd.options.display.max_colwidth = 150
pd.options.display.width = 200

model = EmotionPredictor(classification='plutchik', setting='mc', use_unison_model=True)
TypeError Traceback (most recent call last)