ChristianF88 closed this issue 7 years ago
The error is related to numpy and is fixed if you roll back your version of numpy. This worked for me (after installing PyHum):
sudo pip install numpy==1.11.3
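If it helps, here is a quick way to confirm which numpy the interpreter actually picks up after the rollback (a minimal sketch; the expected version assumes the pin above, and a stale build elsewhere on the path is a common cause of the C-API mismatch error further down this thread):

```python
# Verify the active numpy after the downgrade. __file__ shows which
# installation is actually being imported, in case several coexist.
import numpy

print(numpy.__version__)   # should report 1.11.3 after the pin above
print(numpy.__file__)
```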
Hi there, thanks for your advice. Unfortunately I now encounter a different error. I didn't use the sudo command, since I'm running on Windows, by the way.
The new error is:
(PyHumm) C:\Users\Sieglinde>python -c "import PyHum; PyHum.dotest()"
RuntimeError: module compiled against API version 0xb but this version of numpy is 0xa
Traceback (most recent call last):
File "
What version of Python are you using?
You may need to reinstall scipy. If you are on windows, you could try this:
http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy
(download, then "pip install name_of_file.whl")
sys.version: 2.7.13 | packaged by conda-forge | (default, May 2 2017, 13:28:48) [MSC v.1500 64 bit (AMD64)]
I did reinstall Scipy and now I get the following error:
(PyHumm) C:\Users\Sieglinde\Downloads>python -c"import PyHum;PyHum.dotest()"
Traceback (most recent call last):
File "
I can't uninstall MKL with pip, and when I try to do it with conda it wants to uninstall a bunch of stuff. I'm trying to reinstall the whole lot on a virtual machine. No luck yet, but I'll keep you posted.
I did follow your instructions exactly when trying to install PyHum on the VM but I can't get it installed. The pip install PyHum command returns with:
Command "C:\Anaconda2\python.exe -u -c "import setuptools, tokenize;__file__='c:\users\python2\appdata\local\temp\pip-build-ravfc7\pykdtree\setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record c:\users\python2\appdata\local\temp\pip-ugf_qa-record\install-record.txt --single-version-externally-managed --compile" failed with error code 1 in c:\users\python2\appdata\local\temp\pip-build-ravfc7\pykdtree\
I believe pykdtree still does not work on Windows. Looks like all other dependencies installed correctly, so could you try
pip install PyHum --no-deps
and see if that works?
Not sure if the anaconda version of numpy ships with MKL (doesn't look like it), so you could try downloading this one (assuming you are using Python 2.7 on 64-bit):
http://www.silx.org/pub/wheelhouse/numpy-1.13.0+mkl-cp27-cp27m-win_amd64.whl
I have had all of these problems before.
I solved the numpy MKL problem by uninstalling and reinstalling whl files from http://www.lfd.uci.edu/~gohlke/pythonlibs/ . I am pretty sure numpy should be installed before scipy.
The pykdtree problem arises because the most recent version of pyresample (1.1.6) doesn't play nicely with scipy. A workaround could be to downgrade pyresample to version 1.1.4 before you run the test again:
pip install pyresample==1.1.4
Well, I tried what you recommended. Unfortunately, I ended up with the initial error message again. To make it reproducible, I created a new anaconda env. Here is a summary of all my installation steps:
conda create --name Sonar python=2 -y
activate Sonar
conda install basemap -y
conda install -c conda-forge basemap-data-hires -y (otherwise I get an epsg error)
pip install simplekml
pip uninstall numpy -y
pip install numpy-1.13.0+mkl-cp27-cp27m-win_amd64.whl
pip install scipy-0.19.1-cp27-cp27m-win_amd64.whl
pip install scikit_image-0.13.0-cp27-cp27m-win_amd64.whl
pip install sklearn
pip install pandas
pip install dask
pip install joblib
pip install toolz
pip install cython
pip install PyHum --no-deps
python -c"import PyHum;PyHum.dotest()"
Output:
Directory not copied. Error: [Error 183] A file cannot be created if it already exists: 'C:\Users\Sieglinde\pyhum_test'
Input file is C:\Users\Sieglinde\pyhum_test\test.DAT
Son files are in C:\Users\Sieglinde\pyhum_test
cs2cs arguments are epsg:26949
Draft: 0.3
Celerity of sound: 1450.0 m/s
Port and starboard will be flipped
Transducer length is 0.108 m
Bed picking is auto
Chunks based on distance of 100 m
Data is from the 998 series
Bearing will be calculated from coordinates
Bearing will be filtered
Checking the epsg code you have chosen for compatibility with Basemap ...
... epsg code compatible
WARNING: Because files have to be read in byte by byte,
this could take a very long time ...
port sonar data will be parsed into 3.0, 99 m chunks
starboard sonar data will be parsed into 3.0, 99 m chunks
memory-mapping failed in sliding window - trying memory intensive version
low-freq. sonar data will be parsed into 3.0, 99 m chunks
high-freq. sonar data will be parsed into 3.0, 99 m chunks
memory-mapping failed in sliding window - trying memory intensive version
Traceback (most recent call last):
File "
Any other ideas?
Regarding the virtual machine: when I installed pyresample==1.1.4 and PyHum without dependencies, I got a bit further, but still got an error in the end.
python -c"import PyHum;PyHum.dotest()"
Input file is C:\Users\Python2\pyhum_test\test.DAT
Son files are in C:\Users\Python2\pyhum_test
cs2cs arguments are epsg:26949
Draft: 0.3
Celerity of sound: 1450.0 m/s
Port and starboard will be flipped
Transducer length is 0.108 m
Bed picking is auto
Chunks based on distance of 100 m
Data is from the 998 series
Bearing will be calculated from coordinates
Bearing will be filtered
Checking the epsg code you have chosen for compatibility with Basemap ...
... epsg code compatible
WARNING: Because files have to be read in byte by byte,
this could take a very long time ...
port sonar data will be parsed into 3.0, 99 m chunks
starboard sonar data will be parsed into 3.0, 99 m chunks
memory-mapping failed in sliding window - trying memory intensive version
low-freq. sonar data will be parsed into 3.0, 99 m chunks
high-freq. sonar data will be parsed into 3.0, 99 m chunks
memory-mapping failed in sliding window - trying memory intensive version
Processing took 38.9127845073 seconds to analyse
Done!
Input file is C:\Users\Python2\pyhum_test\test.DAT
Sonar file path is C:\Users\Python2\pyhum_test
Max. transducer power is 1000.0 W
pH is 7.0
Temperature is 10.0
Traceback (most recent call last):
File "
Here is a list of the installed modules if that helps:
(C:\Anaconda2) C:\Users\Python2>conda list
#
_license 1.1 py27_1
alabaster 0.7.10 py27_0
anaconda custom py27_0
anaconda-client 1.6.3 py27_0
anaconda-navigator 1.6.2 py27_0
anaconda-project 0.6.0 py27_0
asn1crypto 0.22.0 py27_0
astroid 1.4.9 py27_0
astropy 1.3.2 np112py27_0
babel 2.4.0 py27_0
backports 1.0 py27_0
backports_abc 0.5 py27_0
basemap 1.0.7 np112py27_0
beautifulsoup4 4.6.0 py27_0
bitarray 0.8.1 py27_1
blaze 0.10.1 py27_0
bleach 1.5.0 py27_0
bokeh 0.12.5 py27_1
boto 2.46.1 py27_0
bottleneck 1.2.1 np112py27_0
bzip2 1.0.6 vc9_3 [vc9]
cdecimal 2.3 py27_2
cffi 1.10.0 py27_0
chardet 3.0.3 py27_0
click 6.7 py27_0
cloudpickle 0.2.2 py27_0
clyent 1.2.2 py27_0
colorama 0.3.9 py27_0
comtypes 1.1.2 py27_0
conda 4.3.24 py27_0
conda-env 2.6.0 0
configobj 5.0.6
So!!! Finally I got some progress:
With the following installation I get this:
conda create --name Sonar2 python=2 -y
activate Sonar2
conda install basemap -y
conda install -c conda-forge basemap-data-hires -y
pip install simplekml
conda install scipy -y
conda install numpy==1.11.3 -y
pip install sklearn
pip install pandas
pip install dask
pip install joblib
pip install toolz
pip install cython
conda install scikit-image -y
pip install pyresample==1.1.4
pip install PyHum --no-deps
python -c "import PyHum;PyHum.dotest()"
Directory not copied. Error: [Error 183] A file cannot be created if it already exists: 'C:\Users\Sieglinde\pyhum_test'
Input file is C:\Users\Sieglinde\pyhum_test\test.DAT
Son files are in C:\Users\Sieglinde\pyhum_test
cs2cs arguments are epsg:26949
Draft: 0.3
Celerity of sound: 1450.0 m/s
Port and starboard will be flipped
Transducer length is 0.108 m
Bed picking is auto
Chunks based on distance of 100 m
Data is from the 998 series
Bearing will be calculated from coordinates
Bearing will be filtered
Checking the epsg code you have chosen for compatibility with Basemap ...
... epsg code compatible
WARNING: Because files have to be read in byte by byte,
this could take a very long time ...
port sonar data will be parsed into 3.0, 99 m chunks
starboard sonar data will be parsed into 3.0, 99 m chunks
memory-mapping failed in sliding window - trying memory intensive version
low-freq. sonar data will be parsed into 3.0, 99 m chunks
high-freq. sonar data will be parsed into 3.0, 99 m chunks
memory-mapping failed in sliding window - trying memory intensive version
Processing took 46.8383171255 seconds to analyse
Done!
Input file is C:\Users\Sieglinde\pyhum_test\test.DAT
Sonar file path is C:\Users\Sieglinde\pyhum_test
Max. transducer power is 1000.0 W
pH is 7.0
Temperature is 10.0
Traceback (most recent call last):
File "
Soooo, I think I fixed it: I simply commented out the lines with the plt.axis('normal'); plt.axis('tight') commands. I'm not sure how important these lines are... Can they stay commented out?
Thanks guys!
Nice work!
Those lines shouldn't be of any consequence; they just adjust the extents of the output figures. You might want to experiment with different matplotlib backends if you intend to use those plots for anything.
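For example, forcing a non-interactive backend before pyplot is first imported (a minimal sketch; "Agg" is just one backend choice, and the axis call stands in for the ones PyHum makes around its figures):

```python
# The backend must be selected before pyplot is imported for the first time,
# which is exactly what the matplotlib.use() warning in this thread is about.
import matplotlib
matplotlib.use("Agg")          # non-interactive backend: renders to files

import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1], [1, 0])
ax.axis("tight")               # same kind of call as the commented-out lines
fig.savefig("backend_check.png")
```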
D
closed #45
Hi,
I'm using Windows 10 and running PyHum via Anaconda and the Spyder IDE. When I import PyHum I get this warning:
C:\Users\Sieglinde\Anaconda2\envs\PyHumm\lib\site-packages\matplotlib\__init__.py:1405: UserWarning: This call to matplotlib.use() has no effect because the backend has already been chosen; matplotlib.use() must be called before pylab, matplotlib.pyplot, or matplotlib.backends is imported for the first time.
warnings.warn(_use_error_msg)
When I execute PyHum.dotest() I get the following:
Directory not copied. Error: [Error 183] A file cannot be created if it already exists: 'C:\Users\Sieglinde\pyhum_test'
Input file is C:\Users\Sieglinde\pyhum_test\test.DAT
Son files are in C:\Users\Sieglinde\pyhum_test
cs2cs arguments are epsg:26949
Draft: 0.3
Celerity of sound: 1450.0 m/s
Port and starboard will be flipped
Transducer length is 0.108 m
Bed picking is auto
Chunks based on distance of 100 m
Data is from the 998 series
Bearing will be calculated from coordinates
Bearing will be filtered
Checking the epsg code you have chosen for compatibility with Basemap ...
... epsg code compatible
WARNING: Because files have to be read in byte by byte,
this could take a very long time ...
port sonar data will be parsed into 3.0, 99 m chunks
starboard sonar data will be parsed into 3.0, 99 m chunks
memory-mapping failed in sliding window - trying memory intensive version
low-freq. sonar data will be parsed into 3.0, 99 m chunks
high-freq. sonar data will be parsed into 3.0, 99 m chunks
memory-mapping failed in sliding window - trying memory intensive version
Traceback (most recent call last):
File "", line 1, in
PyHum.dotest()
File "C:\Users\Sieglinde\Anaconda2\envs\PyHumm\lib\site-packages\PyHum\test.py", line 131, in dotest
PyHum.read(humfile, sonpath, cs2cs_args, c, draft, doplot, t, bedpick, flip_lr, model, calc_bearing, filt_bearing, chunk) #cog
File "C:\Users\Sieglinde\Anaconda2\envs\PyHumm\lib\site-packages\PyHum\_pyhum_read.py", line 667, in read
x, bed = humutils.auto_bedpick(ft, dep_m, chunkmode, port_fp, c)
File "C:\Users\Sieglinde\Anaconda2\envs\PyHumm\lib\site-packages\PyHum\utils.py", line 79, in auto_bedpick
imu.append(port_fp[k][np.max([0,int(np.min(bed))-buff]):int(np.max(bed))+buff,:])
File "C:\Users\Sieglinde\Anaconda2\envs\PyHumm\lib\site-packages\numpy\core\memmap.py", line 335, in __getitem__
res = super(memmap, self).__getitem__(index)
TypeError: slice indices must be integers or None or have an index method
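For what it's worth, that TypeError can be reproduced outside PyHum: functions like np.min return floats on float data, and newer numpy refuses float slice bounds. A minimal sketch of the failure and the int() cast that avoids it (the small array here is just a stand-in for the port_fp memmap):

```python
import numpy as np

a = np.arange(10)               # stand-in for the port_fp memmap
start = float(np.min([2.0, 5.0]))  # np.min on float data returns a float

try:
    a[start:5]                  # float slice bound raises on newer numpy
except TypeError as err:
    print(err)                  # "slice indices must be integers ..."

print(a[int(start):5])          # casting every bound to int works -> [2 3 4]
```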
Thanks a lot for your help.
Have a good one! Christian