jts / nanopolish

Signal-level algorithms for MinION data
MIT License

Makefile unpacking hiccup for 0.10.1 #444

Closed · devonorourke closed this issue 6 years ago

devonorourke commented 6 years ago

Hi Jared, I was struggling to get the make command to run during installation. I didn't have the HDF5 dependency installed on my machine, and make failed as follows:

if [ ! -e hdf5-1.8.14.tar.gz ]; then wget https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8/hdf5-1.8.14/src/hdf5-1.8.14.tar.gz; fi
--2018-08-20 15:07:30--  https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8/hdf5-1.8.14/src/hdf5-1.8.14.tar.gz
Resolving support.hdfgroup.org... 50.28.50.143
Connecting to support.hdfgroup.org|50.28.50.143|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [application/x-gzip]
Saving to: ‘hdf5-1.8.14.tar.gz’

hdf5-1.8.14.tar.gz                                              [                                              <=>                                                                                        ]  11.02M  1.24MB/s    in 9.8s    

2018-08-20 15:07:40 (1.12 MB/s) - ‘hdf5-1.8.14.tar.gz’ saved [11558346]

tar -xzf hdf5-1.8.14.tar.gz || exit 255
tar: This does not look like a tar archive
tar: Skipping to next header
tar: Exiting with failure status due to previous errors
make: *** [lib/libhdf5.a] Error 255
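For what it's worth, the "does not look like a tar archive" message usually means the bytes on disk aren't what tar expects; the gzip magic bytes (1f 8b) are an easy thing to check. Sketched here on a throwaway archive since the check is generic; pointing head at the real downloaded .tar.gz would inspect the actual file:

```shell
# Every gzip stream starts with the magic bytes 1f 8b; if a downloaded
# "tar.gz" doesn't, wget fetched something else (an error page, say).
printf 'demo' > sample.txt
tar -czf sample.tar.gz sample.txt
head -c 2 sample.tar.gz | od -An -tx1   # shows 1f 8b for a valid gzip file
rm -f sample.txt sample.tar.gz
```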

It looks like the error occurs at lines 84-90 of the Makefile:

#
# If this library is a dependency the user wants HDF5 to be downloaded and built.
#
lib/libhdf5.a:
    if [ ! -e hdf5-1.8.14.tar.gz ]; then wget https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8/hdf5-1.8.14/src/hdf5-1.8.14.tar.gz; fi
    tar -xzf hdf5-1.8.14.tar.gz || exit 255
    cd hdf5-1.8.14 && ./configure --enable-threadsafe --prefix=`pwd`/.. && make && make install

I could be completely mistaken, but perhaps a single letter was missing in the command that unpacks the .tar.gz file?

I amended the Makefile to gunzip and then untar in two separate steps, and the installation completed without issue.

lib/libhdf5.a:
        if [ ! -e hdf5-1.8.14.tar.gz ]; then wget https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8/hdf5-1.8.14/src/hdf5-1.8.14.tar.gz; fi
        gunzip hdf5-1.8.14.tar.gz
        tar -xvf hdf5-1.8.14.tar || exit 255
        cd hdf5-1.8.14 && ./configure --enable-threadsafe --prefix=`pwd`/.. && make && make install

Perhaps this was just a quirk of my machine, but I thought I'd pass along the observation in case it helps others.
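(For reference, the two-step workaround can also be written as a single pipeline that never touches tar's -z plumbing. This is only a sketch on a throwaway archive; in the Makefile the real tarball name would substitute in:)

```shell
# Build a small gzipped tar, then extract it by decompressing externally
# and feeding plain tar from stdin; no -z flag involved.
mkdir -p demo && echo data > demo/file.txt
tar -cf - demo | gzip > demo.tar.gz
rm -rf demo
gzip -dc demo.tar.gz | tar -xf -
cat demo/file.txt                 # prints "data"
rm -rf demo demo.tar.gz
```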

Thanks very much!

jts commented 6 years ago

tar -xzf should unpack gzipped tars. What does tar --version say on your machine?
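One quick sanity check is a tar round trip on a tiny archive; if this fails, the local tar/gzip toolchain itself is broken rather than the Makefile (the probe filenames here are arbitrary):

```shell
# Round-trip test of tar's built-in gzip handling.
echo hello > probe.txt
tar -czf probe.tar.gz probe.txt   # compress with -z
rm probe.txt
tar -xzf probe.tar.gz             # decompress with -z
cat probe.txt                     # prints "hello" if the round trip worked
rm -f probe.txt probe.tar.gz
```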

devonorourke commented 6 years ago

Fun with compute clusters! Might be time to update our tar package... It's from 2011... Perhaps that's the issue.

If I execute tar --version I get the following:

tar (GNU tar) 1.26
Copyright (C) 2011 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>.
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.

Written by John Gilmore and Jay Fenlason.

jts commented 6 years ago

tar has had gzip support forever, so I'm not sure that's the problem. Do you see an entry for the -z option in the man page?

devonorourke commented 6 years ago

Yes, it's listed in the manual:

Compression options:
       -a, --auto-compress
              use archive suffix to determine the compression program

       -I, --use-compress-program=PROG
              filter through PROG (must accept -d)

       -j, --bzip2
              filter the archive through bzip2

       -J, --xz
              filter the archive through xz

       --lzip filter the archive through lzip

       --lzma filter the archive through lzma

       --lzop filter the archive through lzop

       --no-auto-compress
              do not use archive suffix to determine the compression program

       -z, --gzip, --gunzip, --ungzip
              filter the archive through gzip

       -Z, --compress, --uncompress
              filter the archive through compress

Strange... The only thing I can think of is that I've occasionally seen (via other programs) our compute cluster have a broken path to pigz; those programs nevertheless fall back to other compression tools. In this case we're invoking tar itself, though, so I don't think that's the culprit.
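If a broken pigz on the PATH were involved, one way to test would be forcing tar's decompressor explicitly with -I / --use-compress-program (listed in the man page excerpt above). A sketch on a throwaway archive:

```shell
# Tell tar exactly which program to filter through, instead of letting
# -z pick one up from the environment; the program must accept -d,
# which gzip does.
echo probe > p.txt
tar -czf p.tar.gz p.txt && rm p.txt
tar --use-compress-program=gzip -xf p.tar.gz
cat p.txt                         # prints "probe" if extraction worked
rm -f p.txt p.tar.gz
```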

Thanks for your troubleshooting (and, you know, for creating nanopolish)!