Closed ghost closed 1 year ago
In the meantime, until this can properly be resolved, a couple of thoughts for potential workarounds occurred to me:

```python
import xarray as xr

url = 'http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc'
ds = xr.open_dataset(url)
```

If someone could check for an approach which might not lead to the same error, that would be very much appreciated!
Using the current netcdf-c master, I cannot duplicate the error. Try changing the URL to this form to get some more information:
http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch
Update: I am pretty certain now that this issue is related to my local certification chains here. So I am going to close it for now. If I sort out the certification trouble and this issue persists, I will reopen it.
And thank you @DennisHeimbigner for the suggestion. I tried it various ways but didn't seem to get much new information. Here are the results:
```
P:\>conda activate EQ

(EQ) P:\>ncdump -h http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch
Error:curl error: SSL connect error
curl error details:
Warning:oc_open: Could not read url
ncdump: http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log: NetCDF: I/O failure
'show' is not recognized as an internal or external command,
operable program or batch file.
```
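A likely explanation for the last two lines: an unquoted `&` ends the command in cmd.exe (and most shells), so everything after `#log` was dropped from the URL and `show=fetch` was run as a separate command. Quoting the URL keeps it intact; a sketch, with `echo` standing in for `ncdump`:

```shell
# The '&' in the URL must be protected from the shell; otherwise the
# command line is split at '&' and 'show=fetch' runs as a separate command.
URL='http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch'
echo "$URL"
# ncdump -h "$URL"   # the real invocation, with the URL quoted
```

In cmd.exe the equivalent is double quotes around the URL argument.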
```python
import xarray as xr

url = 'http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch'
ds = xr.open_dataset(url)
```
```
Note:Caching=1
Note:fetch: http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds
Traceback (most recent call last):
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Codiga_D\AppData\Local\Temp\ipykernel_7324\1695649982.py", line 1, in <module>
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\api.py", line 499, in open_dataset
    **kwargs,
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\netCDF4_.py", line 559, in open_dataset
    autoclose=autoclose,
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\netCDF4_.py", line 379, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\netCDF4_.py", line 327, in __init__
    self.format = self.ds.data_model
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\netCDF4_.py", line 388, in ds
    return self._acquire()
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\netCDF4_.py", line 382, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\lib\site-packages\xarray\backends\file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "src\netCDF4\_netCDF4.pyx", line 2353, in netCDF4._netCDF4.Dataset.__init__
  File "src\netCDF4\_netCDF4.pyx", line 1963, in netCDF4._netCDF4._ensure_nc_success
OSError: [Errno -68] NetCDF: I/O failure: b'http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch'
Error:curl error: SSL connect error
curl error details:
Warning:oc_open: Could not read url
Note:fetch complete.
```
Reopening this. I got help sorting out some certification issues so those are resolved, but this issue is still a problem. We think it may be related to certificate chains.
How can I check which certificate file/chain is being used by ncdump? What about curl?
It may be that I need to configure ncdump and/or curl to use a different certificate file from the one(s) currently being used, and/or to append to the certificate file they currently use.
First review the relevant curl documentation and see if anything strikes you as relevant to your situation.
Second, you can force-set CAINFO and/or CAPATH by creating the file ~/.ncrc and putting a `HTTP.SSL.CAINFO=...` and/or `HTTP.SSL.CAPATH=...` line in it. This assumes a non-Windows environment.
Will look in to this. Thanks for the quick reply Dennis. I am on windows though.
Ok, then you need to put the .ncrc file in %USERPROFILE% instead of $HOME.
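For anyone following along, a quick way to confirm where that file should go (a sketch; it falls back to the POSIX home directory when `USERPROFILE` is unset):

```python
import os

# netCDF-c reads .ncrc from the home directory; on Windows that is %USERPROFILE%.
home = os.environ.get("USERPROFILE", os.path.expanduser("~"))
print(os.path.join(home, ".ncrc"))
```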
I suspect it's something related to how netCDF-c is compiled (in particular within the netCDF4-python release). I just did these experiments on a freshly started ubuntu:20.04 docker image:

- `python3 -m pip install netcdf4==1.6.0` does not work (fails with `Warning:oc_open: Could not read url`). It comes with netCDF-c @ 4.9.0.
- `python3 -m pip install "netcdf4<1.6.0"` resolves to 1.5.8 and does work. It comes with netCDF-c @ 4.7.4.

So it's probably not a general issue with the version of netCDF-c, but maybe related to the build process. However, I didn't figure out where the netCDF-c which is packaged with Python comes from or how it is built...
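To reproduce that comparison, the bundled library versions can be read off the module itself (a sketch; it requires netcdf4-python to be importable):

```python
# Report which netCDF-c (and HDF5) library a netcdf4-python install bundles.
try:
    import netCDF4
    print("netcdf4-python:", netCDF4.__version__)
    print("bundled netCDF-c:", netCDF4.__netcdf4libversion__)
    print("bundled HDF5:", netCDF4.__hdf5libversion__)
except ImportError:
    print("netCDF4 is not installed in this environment")
```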
@DennisHeimbigner I made a .ncrc file with HTTP.SSL.CAPATH in it, and put it in to C:\Users\Codiga_D (this is the correct %USERPROFILE% I hope?), but I still get the same error from ncdump. However, I am unsure of the syntax within the .ncrc file (quote/unquote path? forward/backward slashes?)... here is what I used (single quotes and backslashes):
HTTP.SSL.CAPATH='C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\Lib\site-packages\certifi\cacert.pem'
Guidance?
@d70-t I am in a conda env (not using pip at all) on windows, so I'm not sure if your tests are a close match to my case... but just to try something I pinned my netcdf4 to 1.5.8 (it was 1.6.0) and that did not solve the problem either.
Backward slash should be ok, but do not quote the value (i.e text after the equal sign).
Ok I removed the quotes. Error persists. Not 100% sure the .ncrc file is being used-- is there any kind of test I could do, to confirm that it is?
I was looking around and found this:
The CURLOPT_CAPATH option apparently does not work on Windows due to some limitation in OpenSSL.
So try changing HTTP.SSL.CAPATH=... to HTTP.SSL.CAINFO=... but keeping the same value.
Also, you may be able to test .ncrc is being used by adding this line:
HTTP.VERBOSE=1
If you see a bunch of curl debug output, then .ncrc is being read.
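Putting these suggestions together, a minimal `.ncrc` for this debugging session might look like the following (the cert path is illustrative; values are unquoted, one KEY=VALUE per line):

```
HTTP.VERBOSE=1
HTTP.SSL.CAINFO=C:\Users\me\miniconda3\envs\EQ\Lib\site-packages\certifi\cacert.pem
```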
Thanks Dennis. Tried that too (CAINFO instead of CAPATH), still no joy. On the bright side, including HTTP.VERBOSE=1 did cause additional messages to appear-- so that confirms the .ncrc file is being used. Here are the additional messages, in case helpful:
```
(EQ) PS C:\Users\Codiga_D> ncdump -h http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc
*   Trying 140.172.38.12:80...
* Connected to psl.noaa.gov (140.172.38.12) port 80 (#0)
> GET /thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds HTTP/1.1
Host: psl.noaa.gov
User-Agent: oc4.8.1
Accept: */*
* Mark bundle as not supporting multiuse
< HTTP/1.1 301 Moved Permanently
< Server: nginx/1.23.1
< Date: Fri, 05 Aug 2022 19:40:36 GMT
< Content-Type: text/html
< Content-Length: 169
< Connection: keep-alive
< Location: https://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds
< X-Frame-Options: SAMEORIGIN
<
* Ignoring the response-body
* Connection #0 to host psl.noaa.gov left intact
* Clear auth, redirects to port from 80 to 443
* Issue another request to this URL: 'https://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds'
*   Trying 140.172.38.12:443...
* Connected to psl.noaa.gov (140.172.38.12) port 443 (#1)
* schannel: disabled automatic use of client certificate
* ALPN: offers http/1.1
* schannel: added 142 certificate(s) from CA file 'C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\Lib\site-packages\certifi\cacert.pem'
* schannel: CertGetCertificateChain trust error CERT_TRUST_REVOCATION_STATUS_UNKNOWN
* Closing connection 1
Error:curl error: SSL peer certificate or SSH remote key was not OK
curl error details:
Warning:oc_open: Could not read url
C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\Library\bin\ncdump.exe: http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc: NetCDF: I/O failure
```
Error:curl error: SSL peer certificate or SSH remote key was not OK
This looks suspicious to me. Perhaps, the key in your known-hosts file for this site is out-of-date?
If you put this url in your browser, does it succeed?
https://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds
What it gives me is shown below. I believe that is success?
Yes, that shows that the cert chain used by your browser is working for reading from that site. Just to be sure, retry with HTTP.SSL.CAINFO and/or HTTP.SSL.CAPATH removed from .ncrc.
Okay, I removed the HTTP.SSL.CAINFO line from the .ncrc file and reran ncdump; the result is the same as shown above.
I guess I am out of ideas. BTW, the above says that certs are being added from:

> schannel: added 142 certificate(s) from CA file 'C:\Users\Codiga_D\AppData\Local\Continuum\miniconda3\envs\EQ\Lib\site-packages\certifi\cacert.pem'

Any idea where those are coming from?
That file contains the certs that are used by various other tools including the python package "requests", which is working fine.
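That path is certifi's copy of the Mozilla CA bundle; it can be located programmatically, which is handy when pointing `HTTP.SSL.CAINFO` at it:

```python
import certifi

# certifi ships the CA bundle used by requests and many other tools.
print(certifi.where())  # absolute path to cacert.pem
```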
So thank you for all the help. At this point is there a way to summarize what we do and don't know about the situation? For example: "cert chain to the target site is working successfully in browser, other python packages such as requests are able to function fine, but something about cert chain(s) used by ncdump and/or curl causes the error". Is that more or less accurate? (I have minimal proficiency with certification issues. Just want to be able to explain what's happening, when seeking further help.)
The problem is not ncdump or other stand-alone use of the netcdf-c library; they work fine for me. The bottom line is that the cert chains being used when Python invokes netcdf-c are wrong somehow.
I confirm this problem with netCDF4==1.6.0 (Ubuntu 22.04). After running `pip install netcdf4==1.5.7` the problem disappeared; the resulting environment (that works fine) is:

```
pip freeze
certifi==2022.6.15
cftime==1.6.1
cycler==0.11.0
fonttools==4.37.1
kiwisolver==1.4.4
matplotlib==3.5.3
netCDF4==1.5.7
numpy==1.23.2
packaging==21.3
pandas==1.4.3
Pillow==9.2.0
pyparsing==3.0.9
pyproj==3.3.1
python-dateutil==2.8.2
pytz==2022.2.1
six==1.16.0
xarray==2022.6.0
```

Regards, Markus
After re-reading this chain, I note that I was wrong and that HTTP.SSL.CAINFO apparently does not take the same argument as HTTP.SSL.CAPATH. CAPATH points to the containing directory and CAINFO points to the actual cert file. So you might set up .ncrc with the correct HTTP.SSL.CAINFO and see what happens.
I tried some other permutations of CAPATH (path only) and CAINFO (filename only, or path with filename) but still no joy. I also tried HTTP.SSL.VERIFYPEER=0 and HTTP.SSL.VALIDATE=0 but no success. Thank you for circling back with the suggestion @DennisHeimbigner.
Sorry that we could not solve this. I am out of ideas.
I came to this thread because I was having this same issue accessing Daymet using xarray in a conda environment (`xr.open_dataset('https://thredds.daac.ornl.gov/thredds/dodsC/daymet-v4-agg/na.ncml')`). Many other OPeNDAP endpoints worked, but not Daymet.
I'm working on WSL2 Ubuntu 22.04 and getting the same curl and netcdf error as @DanCodigaMWRA was experiencing in his conda example above using both netcdf4 v 1.6.1 and 1.6.0. Finally - 1.5.8 worked.
However, using curl in my terminal outside of my conda env I was able to access the Daymet endpoint. Still curious, I added my SSL cert to conda with `conda config --set ssl_verify <path/to/cert.crt>` and now I can access it using netcdf4 v1.6.1.
Adding my experience in case someone else bumps into this thread.
Thanks @rmcd-mscb ! A good datapoint.
This week I ran into the same issue using pytest on GitHub for automated testing. I created a demo test in my package test directory (`tests/test_github_action.py`) as follows:
```python
from pydap.client import open_url
from netCDF4 import Dataset
import xarray as xr

url = 'https://icdc.cen.uni-hamburg.de/thredds/dodsC/ftpthredds/hamtide//m2.hamtide11a.nc'

def test_get_pydap():
    ds = open_url(url)
    print("OPENDAP", ds)

def test_open_w_nc4():
    ds = Dataset(url)
    print("NETCDF4", ds)

def test_open_w_xr():
    ds = xr.open_dataset(url)
    print("XARRAY", ds)
```
and my test environment is set up using:
```yaml
name: demo
on: [push]
jobs:
  mytest:
    name: demo-test
    runs-on: ubuntu-latest
    steps:
      - name: clone repository
        uses: actions/checkout@v3
      - name: install Python
        uses: actions/setup-python@v4
      - name: install pkgs
        run: |
          pip install netCDF4 xarray pytest pydap
      - name: bare curl
        run: |
          curl \
            -O https://icdc.cen.uni-hamburg.de/thredds/dodsC/ftpthredds/hamtide/m2.hamtide11a.nc \
            -o /tmp/by_curl.nc
          pwd
          ls
      - name: pytest
        run: |
          pytest tests/test_github_action.py
```
Using curl from the terminal and `pydap` from within the demo test I can download the `.nc` file, but using either `xarray` or `netCDF4.Dataset` fails. This is a snippet of the GitHub action output:
```
During handling of the above exception, another exception occurred:

    def test_open_w_xr():
>       ds = xr.open_dataset(url)

tests/test_github_action.py:16:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../.local/lib/python3.8/site-packages/xarray/backends/api.py:539: in open_dataset
    backend_ds = backend.open_dataset(
../../../.local/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:555: in open_dataset
    store = NetCDF4DataStore.open(
../../../.local/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:384: in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
../../../.local/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:332: in __init__
    self.format = self.ds.data_model
../../../.local/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:393: in ds
    return self._acquire()
../../../.local/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:387: in _acquire
    with self._manager.acquire_context(needs_lock) as root:
/usr/lib/python3.8/contextlib.py:113: in __enter__
    return next(self.gen)
../../../.local/lib/python3.8/site-packages/xarray/backends/file_manager.py:189: in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
../../../.local/lib/python3.8/site-packages/xarray/backends/file_manager.py:207: in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
src/netCDF4/_netCDF4.pyx:2463: in netCDF4._netCDF4.Dataset.__init__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   OSError: [Errno -68] NetCDF: I/O failure: b'https://icdc.cen.uni-hamburg.de/thredds/dodsC/ftpthredds/hamtide//m2.hamtide11a.nc'

src/netCDF4/_netCDF4.pyx:2026: OSError
----------------------------- Captured stderr call -----------------------------
Note:Caching=1
Error:curl error: Problem with the SSL CA cert (path? access rights?)
curl error details:
Warning:oc_open: Could not read url
```
I don't have any issues on my local machine! If I pin the netCDF4 version to <1.6, the test goes through!
I might remove this branch, but currently this is where I'm testing: https://github.com/noaa-ocs-modeling/adcircpy/tree/test/hamtide_fail
and the failing action: https://github.com/noaa-ocs-modeling/adcircpy/actions/runs/3251212919/jobs/5335909325
and this is the one with pinned version: https://github.com/noaa-ocs-modeling/adcircpy/actions/runs/3251735536/jobs/5337126739
I put this url into my web browser:
http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dods
This is equivalent to what ncdump is doing internally to get the data to dump.
I got back this message:
```
Error {
    code = 403;
    message = "Request too big=1133.064428 Mbytes, max=500.0";
};
```
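For reference, `.dods` is one of the standard DAP2 sub-requests a client derives from the dataset URL, and it is the one that returns the actual data (hence the size check above). A sketch of the convention:

```python
base = "http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc"

# DAP2 clients append suffixes to the dataset URL for each request type:
requests = {
    ".dds":  "dataset descriptor structure (variable names, shapes, types)",
    ".das":  "dataset attribute structure (metadata)",
    ".dods": "the binary data response itself",
}
for suffix, meaning in requests.items():
    print(base + suffix, "->", meaning)
```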
Thanks all for the input on this. Some minor updates to my situation: I am now using netcdf4 1.6.1, curl/libcurl 7.85, and certifi 2022.9.24.
@rmcd-mscb:
@SorooshMani-NOAA:
@DennisHeimbigner: I got the same response to that ".dods" URL in my browser. How does that relate?
> After saying `pip install netcdf4==1.5.7` the problem disappeared; the resulting environment (that works fine) is:

I was running into this issue on Google Colab and it was driving me bonkers. Installing the above version of netcdf4 (prior to xarray[io]) did the trick. Thank you!
I am baffled. When I try this command using the current netcdf-c master ncdump, it works fine for me. I see no curl errors or cert errors.
ncdump -h 'http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc'
Could one of you having the problem try this ncdump command and post the resulting output?
ncdump -h 'http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch'
Sure thing, @DennisHeimbigner, see at bottom below! And thanks for continuing to try remedies.
@pauliniguez, I am 99% sure I have tried rolling netcdf4 back to 1.5.7 without it resolving the problem. But I may try again...
Separately, I have made some progress on a workaround to avoid using netcdf4, which seems possible using pydap per the suggestions from @SorooshMani-NOAA above.
```
(EQ) PS C:\Users\codiga_d> ncdump -h 'http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch'
Note:fetch: http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds
*   Trying 140.172.38.12:80...
* Connected to psl.noaa.gov (140.172.38.12) port 80 (#0)
> GET /thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds HTTP/1.1
Host: psl.noaa.gov
User-Agent: oc4.8.1
Accept: */*
* Mark bundle as not supporting multiuse
< HTTP/1.1 301 Moved Permanently
< Server: nginx/1.23.2
< Date: Mon, 07 Nov 2022 15:23:08 GMT
< Content-Type: text/html
< Content-Length: 169
< Connection: keep-alive
< Location: https://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds
< X-Frame-Options: SAMEORIGIN
<
* Ignoring the response-body
* Connection #0 to host psl.noaa.gov left intact
* Clear auth, redirects to port from 80 to 443
* Issue another request to this URL: 'https://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds'
*   Trying 140.172.38.12:443...
* Connected to psl.noaa.gov (140.172.38.12) port 443 (#1)
* schannel: disabled automatic use of client certificate
* ALPN: offers http/1.1
* schannel: added 143 certificate(s) from CA file 'C:\Users\codiga_d\Anaconda3\envs\EQ\Lib\site-packages\certifi\cacert.pem'
* schannel: CertGetCertificateChain trust error CERT_TRUST_REVOCATION_STATUS_UNKNOWN
* Closing connection 1
Error:curl error: SSL peer certificate or SSH remote key was not OK
curl error details:
Warning:oc_open: Could not read url
Note:fetch complete.
C:\Users\codiga_d\Anaconda3\envs\EQ\Library\bin\ncdump.exe: http://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc#log&show=fetch: NetCDF: I/O failure
```
> schannel: CertGetCertificateChain trust error CERT_TRUST_REVOCATION_STATUS_UNKNOWN

Has anyone tried fixing this problem? I did a Google search on "curl trust error CERT_TRUST_REVOCATION_STATUS_UNKNOWN" and it brought up some suggestions about how to fix it.
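One commonly suggested diagnostic for `CERT_TRUST_REVOCATION_STATUS_UNKNOWN` is to have plain curl skip schannel's revocation check. This weakens security, so it is only useful to confirm the diagnosis, and whether netCDF-c's internal curl handle can be configured the same way is a separate question:

```shell
# --ssl-no-revoke applies to curl's schannel (Windows) backend and skips
# certificate revocation checks. Diagnostic use only:
#
#   curl --ssl-no-revoke -I https://psl.noaa.gov/thredds/dodsC/Datasets/NARR/monolevel/uwnd.10m.2000.nc.dds
#
# Verify your curl build recognizes the flag:
curl --help all | grep -- --ssl-no-revoke
```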
Another data point: I have the same behavior through the netCDF-C installed with RNetCDF 2.6.1, which is on netCDF-C 4.9.0.
https://cran.r-project.org/package=RNetCDF

```r
> RNetCDF::open.nc("https://cida.usgs.gov/thredds/dodsC/prism#log&show=fetch")
#> Note:Caching=1
#> Note:fetch: https://cida.usgs.gov/thredds/dodsC/prism.dds
#> Error:curl error: SSL peer certificate or SSH remote key was not OK
#> curl error details:
#> Warning:oc_open: Could not read url
#> Note:fetch complete: 0.163 secs
#> Error in RNetCDF::open.nc("https://cida.usgs.gov/thredds/dodsC/prism#log&show=fetch") :
#>   NetCDF: I/O failure
```

My system netCDF install works fine, but this one, which is pre-compiled and installed in an R library, is causing the issue.
> I confirm this problem with netCDF4==1.6.0 (Ubuntu 22.04). After saying `pip install netcdf4==1.5.7` the problem disappeared.
Thanks, man, thanks. You saved my code; the lib is legacy code.
I just came to this thread. I am trying the following toolchain:
- python==3.8
- certifi==2022.6.15
- xarray==2022.11.0
- netcdf4==1.5.7
I am getting the same behavior on this dataset: `ncdump https://opendap.co-ops.nos.noaa.gov/thredds/dodsC/NOAA/WCOFS/MODELS/2023/01/13/nos.wcofs.regulargrid.n015.20230113.t03z.nc`, as well as with netcdf4 and xarray of course. I am able to open other OPeNDAP data sources.
Eli
We finally seem to have got to the bottom of this internally here and determined that it was a firewall issue, solved by SSL inspection exemptions I believe. Thanks for all the help trying to figure it out!
Firewall issues can be pernicious, glad you were able to track that down!
I'm in a conda environment on Windows and ncdump is giving me a "could not read url" error (oc_open, SSL connect) when accessing a file on a remote server:
Others report success with this operation so the trouble seems to be with my own system. These seem relevant:
The error first occurred for me using xarray, but has been shown to occur in netcdf4 and ncdump as well.
At least several months ago I did not have this problem. I don't have details on all the changes to my system since then, but one recent change had to do with certificate chains, so I was guessing it might be related to that.
These issues might be related:
- https://github.com/Unidata/netcdf-c/issues/1393
- https://github.com/Unidata/netcdf-c/issues/1833
- https://github.com/Unidata/netcdf4-python/issues/755
- https://github.com/pydata/xarray/issues/4925

Earlier posts I have made on this, with some additional relevant information:
- https://github.com/pydata/xarray/issues/6766
- https://github.com/pydata/xarray/discussions/6742
Thank you in advance.