Can you try this:
conda uninstall setuptools
pip uninstall setuptools
conda install setuptools==19.4
19.4 is the version Travis is using.
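If the wrong setuptools still gets picked up after pinning, it can help to check not just the version but also where it is loaded from; a minimal check, assuming the environment you are installing into is active:
# print the version and the location the interpreter actually imports
python -c "import setuptools; print(setuptools.__version__)"
python -c "import setuptools; print(setuptools.__file__)"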
That didn't work. Version seems to be correct though:
import setuptools as st
st.__version__
'19.4'
@cooperlab I am trying to reproduce this error. Were you trying this on one of the nodes on the oppenheimer cluster?
I tried it on SMI-1 a couple of days ago and was able to install HistomicsTK without errors, but when I did import histomicstk it threw a weird error from one of its imports, import sklearn.cluster, in GaussianVoting.
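A quick way to isolate whether the failure comes from scikit-learn itself rather than from HistomicsTK is to run the failing import on its own in the same environment (this is just a sanity check, not part of the install steps):
# if this fails by itself, the problem is the scikit-learn build, not HistomicsTK
python -c "import sklearn.cluster"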
Below is a shell script that I wrote to install Anaconda locally and then install HistomicsTK:
# Install anaconda
wget -O install_anaconda.sh https://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh
bash install_anaconda.sh -b -p ~/anaconda
export PATH=~/anaconda/bin:$PATH
rm install_anaconda.sh
# create conda environment
conda create --yes -n dsa python=2.7
source activate dsa
# git clone and install histomicstk
git clone https://github.com/DigitalSlideArchive/HistomicsTK.git
cd HistomicsTK
conda config --add channels https://conda.binstar.org/cdeepakroy
conda install --yes libgfortran==1.0 setuptools==19.4 ctk-cli==1.3.1 --file requirements_c_conda.txt
pip install -r requirements.txt -r requirements_c.txt
python setup.py build_ext --inplace
python setup.py install
python -c "import histomicstk"
The last statement, python -c "import histomicstk", results in the following error on one of the compute nodes (SMI-1):
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "histomicstk/__init__.py", line 7, in <module>
    from . import segmentation  # must be imported before features
  File "histomicstk/segmentation/__init__.py", line 17, in <module>
    from . import nuclear
  File "histomicstk/segmentation/nuclear/__init__.py", line 6, in <module>
    from .GaussianVoting import GaussianVoting
  File "histomicstk/segmentation/nuclear/GaussianVoting.py", line 3, in <module>
    import sklearn.cluster as cl
  File "/opt/lib/python2.7/site-packages/sklearn/cluster/__init__.py", line 6, in <module>
    from .spectral import spectral_clustering, SpectralClustering
  File "/opt/lib/python2.7/site-packages/sklearn/cluster/spectral.py", line 18, in <module>
    from ..manifold import spectral_embedding
  File "/opt/lib/python2.7/site-packages/sklearn/manifold/__init__.py", line 6, in <module>
    from .isomap import Isomap
  File "/opt/lib/python2.7/site-packages/sklearn/manifold/isomap.py", line 11, in <module>
    from ..decomposition import KernelPCA
  File "/opt/lib/python2.7/site-packages/sklearn/decomposition/__init__.py", line 11, in <module>
    from .sparse_pca import SparsePCA, MiniBatchSparsePCA
  File "/opt/lib/python2.7/site-packages/sklearn/decomposition/sparse_pca.py", line 9, in <module>
    from ..linear_model import ridge_regression
  File "/opt/lib/python2.7/site-packages/sklearn/linear_model/__init__.py", line 17, in <module>
    from .coordinate_descent import (Lasso, ElasticNet, LassoCV, ElasticNetCV,
  File "/opt/lib/python2.7/site-packages/sklearn/linear_model/coordinate_descent.py", line 29, in <module>
    from . import cd_fast
ImportError: /opt/lib/python2.7/site-packages/sklearn/linear_model/cd_fast.so: undefined symbol: PyFPE_jbuf
@cdeepakroy I was installing this in /opt on oppenheimer. Anything in that folder will be visible to all the nodes.
@cooperlab I just tried to do it with the Anaconda install in /opt/anaconda from an oppenheimer node, and it throws a few weird warnings when I create an environment or install something:
~$ conda create -n dsa python=2.7
Fetching package metadata .........
Solving package specifications: ..........

Package plan for installation in environment /home/dchitta/.conda/envs/dsa:

The following NEW packages will be INSTALLED:

    openssl:    1.0.2j-0
    pip:        8.1.2-py27_0  (soft-link)
    python:     2.7.12-1      (soft-link)
    readline:   6.2-2         (soft-link)
    setuptools: 27.2.0-py27_0 (soft-link)
    sqlite:     3.13.0-0      (soft-link)
    tk:         8.5.18-0      (soft-link)
    wheel:      0.29.0-py27_0 (soft-link)
    zlib:       1.2.8-3       (soft-link)

Proceed ([y]/n)? y

Linking packages ...
WARNING conda.lock:touch(53): Failed to create lock, do not run conda in parallel processes [errno 13]
WARNING conda.lock:touch(53): Failed to create lock, do not run conda in parallel processes [errno 13]
WARNING conda.lock:touch(53): Failed to create lock, do not run conda in parallel processes [errno 13]
| 22%
WARNING conda.lock:touch(53): Failed to create lock, do not run conda in parallel processes [errno 13]
WARNING conda.lock:touch(53): Failed to create lock, do not run conda in parallel processes [errno 13]
WARNING conda.lock:touch(53): Failed to create lock, do not run conda in parallel processes [errno 13]
WARNING conda.lock:touch(53): Failed to create lock, do not run conda in parallel processes [errno 13]
Do you also get these warnings? Or could it be because of my permissions?
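For reference, errno 13 is EACCES (permission denied), and conda takes its locks in the package cache under the install prefix, so lacking write access there would produce exactly these warnings. A quick check (the path assumes the shared install in /opt/anaconda):
# if this directory is not writable by your user, the lock warnings are a
# permissions issue rather than parallel conda runs
ls -ld /opt/anaconda/pkgs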
@cooperlab Can you try this when you get a chance?
# make anaconda's python the default
export PATH=/opt/anaconda/bin:$PATH
# create conda environment
conda create --yes -n dsa python=2.7
source activate dsa
# update histomicstk to latest
cd /opt/histomicstk/HistomicsTK
git checkout master
git pull
# install HistomicsTK
conda config --add channels https://conda.binstar.org/cdeepakroy
conda install --yes libgfortran==1.0 setuptools==19.4 ctk-cli==1.3.1 --file requirements_c_conda.txt
pip install -r requirements.txt -r requirements_c.txt
python setup.py build_ext --inplace
python setup.py install
python -c "import histomicstk"
I don't seem to have permissions to install anything to /opt/anaconda.
It can't find openslide at import.
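A quick way to tell whether it is the Python binding or the underlying C library that is missing (assuming openslide-python is the binding in use; these are diagnostics, not part of the install steps):
# does the binding import, and which libopenslide does the loader find?
python -c "import openslide; print(openslide.__library_version__)"
python -c "import ctypes.util; print(ctypes.util.find_library('openslide'))"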
Let's get you sudo access to these machines (will email shortly). Also, we should be putting these packages into Anaconda's site-packages instead of directly into /opt/.
Closing this issue as the installation problems on oppenheimer have been resolved.
@cdeepakroy @dgutman RequirementParseError is not available in some versions of setuptools. I got the following error when trying to install HistomicsTK on our systems:
Traceback (most recent call last):
  File "setup.py", line 14, in <module>
    from pkg_resources import parse_requirements, RequirementParseError
ImportError: cannot import name RequirementParseError
The problem remains after updating setuptools to the latest version with conda.
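One thing worth checking: conda update only touches the environment it runs in, so it helps to confirm which setuptools the interpreter that actually executes setup.py is importing. A minimal diagnostic, assuming that interpreter is the one on PATH:
# show which pkg_resources/setuptools this Python picks up and whether the
# name in question is importable from it
python -c "import pkg_resources; print(pkg_resources.__file__)"
python -c "import setuptools; print(setuptools.__version__)"
python -c "from pkg_resources import RequirementParseError; print('ok')"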