Closed · Nicolas-Kolodziejczyk closed this issue 9 months ago
@Nicolas-Kolodziejczyk
This fetch seems to work on my side. Can you check that you have the latest argopy version installed? https://argopy.readthedocs.io/en/latest/install.html
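To check which argopy version is installed, a stdlib-only sketch like the following can be used (the `installed_version` helper is hypothetical, not part of argopy; argopy's docs also describe an `argopy.show_versions()` utility for a fuller report):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str):
    """Return the installed version string for pkg, or None if it is missing."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# Prints e.g. "0.1.14", or None if argopy is not installed.
print(installed_version("argopy"))
```

Comparing the printed version against the latest release on the install page above tells you whether an upgrade is needed.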
I can fetch data for this region. This is the "status" of all chunked regions in "standard" mode, where red is "no data" and green is "found data". This is for 2012 Jan-June:
Hello, I am trying to extract a (large) dataset in the Arctic with the following code:
```python
from argopy import DataFetcher
import xarray as xr

t1 = '2012'
t2 = '2012'
box = [-180., 180., 60., 89., 0, 10., t1 + '-01-01', t2 + '-12-31']
loader_par = DataFetcher(src='erddap', parallel=True, progress=True).region(box)
ds = loader_par.to_xarray()
ds.to_netcdf('ARGO_112023Arctic' + t1 + '_' + t2 + '.nc')
```

I got this error message:
```
100%|█████████████████████████████████████████| 180/180 [01:48<00:00, 1.66it/s]
Traceback (most recent call last):
  File "fetch_argopy.py", line 10, in <module>
    ds = loader_par.to_xarray()
  File "/home/nicolas/anaconda3/lib/python3.8/site-packages/argopy/fetchers.py", line 428, in to_xarray
    xds = self.fetcher.to_xarray(**kwargs)
  File "/home/nicolas/anaconda3/lib/python3.8/site-packages/argopy/data_fetchers/erddap_data.py", line 471, in to_xarray
    ds = self.fs.open_mfdataset(
  File "/home/nicolas/anaconda3/lib/python3.8/site-packages/argopy/stores/filesystems.py", line 663, in open_mfdataset
    raise ValueError("Errors happened with all URLs, this could be due to an internal impossibility to read returned content.")
ValueError: Errors happened with all URLs, this could be due to an internal impossibility to read returned content.
```
Is the requested dataset too large? Thanks, Nicolas
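One common workaround for very large regions is to split the request yourself into smaller time windows and concatenate the results, so a single failing chunk does not abort the whole fetch. Below is a minimal sketch: the `month_chunks` helper is hypothetical (not part of argopy), and the commented fetch loop assumes argopy's point-mode `N_POINTS` dimension when concatenating:

```python
from datetime import date

def month_chunks(year: int):
    """Split one calendar year into (start, end) ISO date pairs, one per month.

    Each chunk's end date is the first day of the next month, so chunks
    tile the year without gaps.
    """
    chunks = []
    for m in range(1, 13):
        start = date(year, m, 1)
        end = date(year, m + 1, 1) if m < 12 else date(year + 1, 1, 1)
        chunks.append((start.isoformat(), end.isoformat()))
    return chunks

# Hypothetical usage with argopy (not run here): fetch each month
# separately, then concatenate along the point dimension.
# import xarray as xr
# from argopy import DataFetcher
# parts = []
# for t_start, t_end in month_chunks(2012):
#     box = [-180., 180., 60., 89., 0, 10., t_start, t_end]
#     parts.append(DataFetcher(src='erddap').region(box).to_xarray())
# ds = xr.concat(parts, dim='N_POINTS')
```

Fetching month by month also makes it easier to spot which specific time window (if any) triggers the "Errors happened with all URLs" failure.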