Closed: hanamthang closed this issue 2 years ago
Hi Thang
There is a similar issue reported on the forum: https://odnature.naturalsciences.be/remsem/acolite-forum/viewtopic.php?f=4&t=301 Can you run the example code in that thread?
I haven't been able to reproduce this error myself.
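For reference, the test in that thread essentially boils down to opening one of the ERA5 OPeNDAP endpoints directly, which you can do in one line from a shell:

# open the remote ERA5 file via OPeNDAP; an exception here points at the netCDF/curl stack
python -c "import netCDF4; netCDF4.Dataset('https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc')"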
Quinten
I ran the example code, however I still receive the same error:
ds = netCDF4.Dataset(url)
Error:curl error: Unsupported protocol
curl error details:
Warning:oc_open: Could not read url
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "src/netCDF4/_netCDF4.pyx", line 2307, in netCDF4._netCDF4.Dataset.__init__
File "src/netCDF4/_netCDF4.pyx", line 1925, in netCDF4._netCDF4._ensure_nc_success
OSError: [Errno -68] NetCDF: I/O failure: b'https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc'
I ran tact.py independently and still got the same error:
Launching TACT processing.
/run/media/metal/d6d05414-194e-4f08-bb20-754d8b61789f/home/metal/Documents/softl/atmospheric/tact/tact/metadata_read.py:15: SyntaxWarning: "is not" with a literal. Did you mean "!="?
if len(split) is not 2: continue
Retrieving a/c parameters
Error:curl error: Unsupported protocol
curl error details:
Warning:oc_open: Could not read url
Traceback (most recent call last):
File "/run/media/metal/d6d05414-194e-4f08-bb20-754d8b61789f/home/metal/Documents/softl/atmospheric/tact/tact.py", line 45, in <module>
run_tact()
File "/run/media/metal/d6d05414-194e-4f08-bb20-754d8b61789f/home/metal/Documents/softl/atmospheric/tact/tact.py", line 41, in run_tact
tact.tact_landsat(args.input, output=args.output, limit=args.limit, export_geotiff=args.export_geotiff)
File "/run/media/metal/d6d05414-194e-4f08-bb20-754d8b61789f/home/metal/Documents/softl/atmospheric/tact/tact/tact_landsat.py", line 52, in tact_landsat
simst, lonc, latc = tact.tact_limit(isodate, sc_limit, c_time, verbosity=0, satsen=satsen, processes=4)
File "/run/media/metal/d6d05414-194e-4f08-bb20-754d8b61789f/home/metal/Documents/softl/atmospheric/tact/tact/tact_limit.py", line 71, in tact_limit
ds = netCDF4.Dataset(url)
File "src/netCDF4/_netCDF4.pyx", line 2307, in netCDF4._netCDF4.Dataset.__init__
File "src/netCDF4/_netCDF4.pyx", line 1925, in netCDF4._netCDF4._ensure_nc_success
OSError: [Errno -68] NetCDF: I/O failure: b'https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202103/e5.oper.an.pl.128_157_r.ll025sc.2021033100_2021033123.nc'
When I tried the link directly:
"https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc"
the following error appears:
Error {
code = 400;
message = "Unrecognized request";
};
So could the problem be with the link?
I am using Arch Linux and the same python environment.
I successfully compiled and ran ACOLITE with the TACT option for a Landsat image on the Lubuntu distro. However, I cannot do the same on the Arch-based Manjaro distro. It still throws an error when compiling libRadtran in a python27 environment:
compiling sofi.c
linking ../bin/sofi
/usr/bin/ld: ../lib/libRadtran_c.a(uvspecrandom.o):/run/media/metal/d6d05414-194e-4f08-bb20-754d8b61789f/home/metal/Documents/softl/atmospheric/acolite/external/libRadtran-2.0.2/libsrc_c/uvspecrandom.c:39: multiple definition of `uvspecrng'; ../lib/libRadtran_c.a(mystic.o):/run/media/metal/d6d05414-194e-4f08-bb20-754d8b61789f/home/metal/Documents/softl/atmospheric/acolite/external/libRadtran-2.0.2/libsrc_c/mystic.c:145: first defined here
collect2: error: ld returned 1 exit status
make[1]: *** [Makefile:234: ../bin/sofi] Error 1
make[1]: Leaving directory '/run/media/metal/d6d05414-194e-4f08-bb20-754d8b61789f/home/metal/Documents/softl/atmospheric/acolite/external/libRadtran-2.0.2/src'
make: *** [Makefile:39: all] Error 2
So, I think the problem might be with the distro itself.
Thanks for the update Thang. I opened a new issue for the compilation error.
The rda.ucar URL is not supposed to be accessed directly through a web browser, so the "Unrecognized request" error is not unexpected. Your cURL may be configured/compiled without SSL support. Could you check?
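e.g. the Protocols line should include https, and the Features line should list SSL:

# print curl version, supported protocols and features
curl -V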
Quinten
I checked the version and I think curl supports SSL (https is listed under Protocols?):
curl 7.79.0 (x86_64-conda-linux-gnu) libcurl/7.79.0 OpenSSL/1.1.1l zlib/1.2.11 libssh2/1.10.0 nghttp2/1.43.0
Release-Date: 2021-09-15
Protocols: dict file ftp ftps gopher gophers http https imap imaps mqtt pop3 pop3s rtsp scp sftp smb smbs smtp smtps telnet tftp
Features: alt-svc AsynchDNS GSS-API HSTS HTTP2 HTTPS-proxy IPv6 Kerberos Largefile libz NTLM NTLM_WB SPNEGO SSL TLS-SRP UnixSockets
Does accessing via ncdump work?
e.g.
ncdump -h https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc
It seems not to work. I tried your suggestion but nothing happened:
ncdump -h https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc
Perhaps there is an issue with your NetCDF installation?
When running the ncdump -h I get:
netcdf e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623 {
dimensions:
    time = UNLIMITED ; // (24 currently)
    latitude = 721 ;
    level = 37 ;
    longitude = 1440 ;
variables:
    double latitude(latitude) ;
        latitude:long_name = "latitude" ;
        latitude:short_name = "lat" ;
        latitude:units = "degrees_north" ;
        latitude:_ChunkSizes = 721 ;
    double level(level) ;
        level:long_name = "pressure level" ;
        level:short_name = "plev" ;
        level:units = "hPa" ;
        level:alternate_units = "millibar" ;
        level:_ChunkSizes = 37 ;
    double longitude(longitude) ;
        longitude:long_name = "longitude" ;
        longitude:short_name = "lon" ;
        longitude:units = "degrees_east" ;
        longitude:_ChunkSizes = 1440 ;
    int time(time) ;
        time:long_name = "time" ;
        time:units = "hours since 1900-01-01 00:00:00" ;
        time:calendar = "gregorian" ;
        time:_ChunkSizes = 1024 ;
    int utc_date(time) ;
        utc_date:long_name = "UTC date yyyy-mm-dd hh:00:00 as yyyymmddhh" ;
        utc_date:units = "Gregorian_year month day hour" ;
        utc_date:_ChunkSizes = 1024 ;
    float R(time, level, latitude, longitude) ;
        R:long_name = "Relative humidity" ;
        R:short_name = "r" ;
        R:units = "%" ;
        R:original_format = "WMO GRIB 1 with ECMWF local table" ;
        R:ecmwf_local_table = 128 ;
        R:ecmwf_parameter = 157 ;
        R:_FillValue = 9.999e+20f ;
        R:missing_value = 9.999e+20f ;
        R:minimum_value = -10.28709f ;
        R:maximum_value = 178.1241f ;
        R:grid_specification = "0.25 degree x 0.25 degree from 90N to 90S and 0E to 359.75E (721 x 1440 Latitude/Longitude)" ;
        R:rda_dataset = "ds633.0" ;
        R:rda_dataset_url = "https:/rda.ucar.edu/datasets/ds633.0/" ;
        R:rda_dataset_doi = "DOI: 10.5065/BH6N-5N20" ;
        R:rda_dataset_group = "ERA5 atmospheric pressure level analysis [netCDF4]" ;
        R:number_of_significant_digits = 7 ;
        R:_ChunkSizes = 1, 37, 721, 1440 ;

// global attributes:
        :DATA_SOURCE = "ECMWF: https://cds.climate.copernicus.eu, Copernicus Climate Data Store" ;
        :NETCDF_CONVERSION = "CISL RDA: Conversion from ECMWF GRIB 1 data to netCDF4." ;
        :NETCDF_VERSION = "4.7.3" ;
        :CONVERSION_PLATFORM = "Linux r13i7n31 4.12.14-95.37.1.18642.1.PTF.1155508-default #1 SMP Mon Nov 4 13:03:21 UTC 2019 (9cc1377) x86_64 x86_64 x86_64 GNU/Linux" ;
        :CONVERSION_DATE = "Tue Jun 30 13:35:49 MDT 2020" ;
        :Conventions = "CF-1.6" ;
        :NETCDF_COMPRESSION = "NCO: Precision-preserving compression to netCDF4/HDF5 (see \"history\" and \"NCO\" global attributes below for specifics)." ;
        :history = "Tue Jun 30 13:36:06 2020: ncks -4 --ppc default=7 e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.unc.nc e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc" ;
        :NCO = "netCDF Operators version 4.7.9 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)" ;
        :DODS_EXTRA.Unlimited_Dimension = "time" ;
}
ncdump [-c|-h] [-v ...] [[-b|-f] [c|f]] [-l len] [-n name] [-p n[,n]] [-k] [-x] [-s] [-t|-i] [-g ...] [-w] [-Ln] file
[-c] Coordinate variable data and header information
[-h] Header information only, no data
[-v var1[,...]] Data for variable(s) <var1>,... only
[-b [c|f]] Brief annotations for C or Fortran indices in data
[-f [c|f]] Full annotations for C or Fortran indices in data
[-l len] Line length maximum in data section (default 80)
[-n name] Name for netCDF (default derived from file name)
[-p n[,n]] Display floating-point values with less precision
[-k] Output kind of netCDF file
[-s] Output special (virtual) attributes
[-t] Output time data as date-time strings
[-i] Output time data as date-time strings with ISO-8601 'T' separator
[-g grp1[,...]] Data and metadata for group(s) <grp1>,... only
[-w] With client-side caching of variables for DAP URLs
[-x] Output XML (NcML) instead of CDL
[-Xp] Unconditionally suppress output of the properties attribute
[-Ln] Set log level to n (>= 0); ignore if logging not enabled.
file Name of netCDF file (or URL if DAP access enabled)
netcdf library version 4.8.1 of Oct 1 2021 20:01:11 $
I am not sure whether something has happened to my netCDF, since nothing goes wrong when I read, import, or export netCDF files locally.
Ah, I got a result, but it failed. Once again, still curl:
ncdump -h https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc
Error:curl error: Timeout was reached
curl error details:
Warning:oc_open: Could not read url
ncdump: https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc: NetCDF: I/O failure
Are you using the ncdump from conda-forge, or another version?
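You could also test the endpoint at the curl level, bypassing the NetCDF library entirely. Appending .dds to an OPeNDAP URL is the standard DAP way to request the dataset descriptor, so something like this should print a Dataset { ... } block if curl and the network are fine:

# request the OPeNDAP dataset descriptor (DDS) directly with curl
curl -s "https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc.dds"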
The ncdump is part of the netcdf package and was installed with the software manager in Manjaro. I tried to install libnetCDF from the conda-forge channel, but it still does not work. When running ncdump from the python environment, I still get the same errors:
ncdump -h https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc
Error:curl error: Unsupported protocol
curl error details:
Warning:oc_open: Could not read url
ncdump: https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc: NetCDF: I/O failure
Can you try the one from conda-forge?
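e.g., with your conda environment active, something like:

# libnetcdf from conda-forge provides the ncdump binary
conda install -c conda-forge libnetcdf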
I tried the netCDF4 from conda-forge and tested your code:
import netCDF4
url = 'https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc'
ds = netCDF4.Dataset(url)
but got errors:
url = 'https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc'
>>> ds = netCDF4.Dataset(url)
Error:curl error: Unsupported protocol
curl error details:
Warning:oc_open: Could not read url
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "src/netCDF4/_netCDF4.pyx", line 2307, in netCDF4._netCDF4.Dataset.__init__
File "src/netCDF4/_netCDF4.pyx", line 1925, in netCDF4._netCDF4._ensure_nc_success
OSError: [Errno -68] NetCDF: I/O failure: b'https://rda.ucar.edu/thredds/dodsC/files/g/ds633.0/e5.oper.an.pl/202004/e5.oper.an.pl.128_157_r.ll025sc.2020040600_2020040623.nc'
There is no netcdf package in Anaconda Cloud. And it seems we cannot call ncdump from netCDF4.
I still cannot figure out what happened in this case.
Thang
Can you print the NetCDF version and location?
import netCDF4
print(netCDF4.__version__)
print(netCDF4.__file__)
Can you locate your ncdump? (which ncdump) If it is not the one from your conda environment you can also call that one directly (use the right path and env name): /path/to/your/miniconda3/envs/your_env/bin/ncdump
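To dig further, you could also check which libcurl that binary actually loads; a libcurl built without https support would explain the "Unsupported protocol" error. A sketch (adjust the path to your environment):

# list the curl libraries the conda ncdump links against
ldd /path/to/your/miniconda3/envs/your_env/bin/ncdump | grep -i curl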
If you are interested I have a Docker image available that can run TACT, perhaps it is worth testing that as well: https://hub.docker.com/repository/docker/acolite/acolite
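Getting started would be something like the following; the exact run options (mounted volumes, settings file) depend on your setup, see the Hub page:

# pull the image (name taken from the Docker Hub URL above)
docker pull acolite/acolite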
Quinten
Here is the info:
>>> import netCDF4
>>> print(netCDF4.__version__)
1.5.8
>>> print(netCDF4.__file__)
/home/metal/anaconda3/envs/thang/lib/python3.9/site-packages/netCDF4/__init__.py
which ncdump showed:
/home/metal/anaconda3/envs/thang/lib/python3.9/site-packages/netCDF4/__init__.py
Thank you for your suggestion. I can still run TACT on the Lubuntu distro, but the Manjaro laptop is stronger computationally, so I am trying to run TACT on Manjaro. And I was not familiar with Docker before :-))))
Hi Thang
The issue may lie with your local configuration or network connection, as I cannot replicate your issue even on Manjaro:
I installed a Manjaro VM using a minimal xfce image (manjaro-xfce-21.1.6-minimal-211017-linux54.iso) in VirtualBox, and updated the system with sudo pacman -Syu.
I installed miniconda using the latest Miniconda image (http://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh), created and activated the acolite environment, and installed the dependencies from conda-forge as in the README to this repository.
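For reference, the environment setup was along these lines (a sketch; the full conda-forge package list is in the README):

# create and activate a fresh environment, then install dependencies
conda create -n acolite python=3.9
conda activate acolite
conda install -c conda-forge netcdf4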
Both ncdump and the Python code work as they should, and show that the data can be retrieved from rda.ucar.edu.
Let me know if you could run Docker or were able to fix this on your Manjaro installation!
Quinten
Thank you Quinten for your update. I will make a new environment for acolite and try TACT again. Or, if that is not possible, I will test the Docker image on Manjaro.
Many thanks, Thang
Hi Quinten,
I just created fresh environments in anaconda, with both Python 3.9 and 3.10, under the name acolite. Surprisingly, both fresh versions worked with your sample code in retrieving the netcdf header information. So, there might be something "wrong" with my current anaconda environment.
Can you think of places where I should start checking, or should I just delete the current environment and start with a fresh one? I use this env for the acolite implementation as well as my daily remote sensing work.
Thang
Hi Thang
Thanks for the update; so it was caused by local software and configuration. I will close this issue.
Since it is easy to manage different environments I would keep the one for your daily work as is, and use a separate new one for ACOLITE/TACT.
Quinten
Could you please help me with this? Is it related to libRadtran or to curl/libcurl?
Many thanks, Thang