pacificclimate / pdp

The PCIC Data Portal - Server software to run the entire web application
GNU General Public License v3.0

How to download the full raster domain #21

Closed: hydropatch closed this issue 9 years ago

hydropatch commented 9 years ago

Hello,

I am having trouble downloading data from the PCIC data portal "Gridded Hydrologic Model Data". Specifically, I need to download the entire dataset for one variable (SWE) for each of 8 model runs, for each of the A2 and B1 scenarios, for 1950-2100, across the entire area (i.e. the Fraser, Peace, Upper Columbia, and Campbell River basins). Unfortunately, I have run into some problems using the portal: when I try to select the entire modelled dataset (i.e. a single dataset for each model/scenario for the entire geographic area), I receive an error message and am forced to zoom in and download a smaller geographic area.

Would you be able to make this data available for the entire geographical extent (or for the Peace, Upper Columbia, Fraser and Campbell basins individually) or is there a way to use the Data Portal to get what I need?

jameshiebert commented 9 years ago

Hi Patrick,

Downloading the full domain for one of the climate rasters is fairly straightforward.

First of all, to get the URLs for your datasets of interest (A2 and B1 scenarios), just make a request to our data catalog: http://tools.pacificclimate.org/dataportal/hydro_model_out/catalog/

With its response, it's easy to filter and select the data sets you want:

hiebert@aether:~$ curl http://tools.pacificclimate.org/dataportal/hydro_model_out/catalog/ | grep "\(A2\|B1\)"
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  3904  100  3904    0     0  11369      0 --:--:-- --:--:-- --:--:-- 11381
    "5var_day_HadCM_B1_run1_19500101-20981231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_HadCM_B1_run1_19500101-20981231.nc",
    "5var_day_GFDL2.1_B1_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_GFDL2.1_B1_run1_19500101-20991231.nc",
    "5var_day_CGCM3_A2_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CGCM3_A2_run1_19500101-20991231.nc",
    "5var_day_ECHAM5_B1_run1_19500101-20981231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_ECHAM5_B1_run1_19500101-20981231.nc",
    "5var_day_MIROC3.2_B1_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_MIROC3.2_B1_run1_19500101-20991231.nc",
    "5var_day_HadCM_A2_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_HadCM_A2_run1_19500101-20991231.nc",
    "5var_day_CSIRO35_A2_run1_19500101-20981231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CSIRO35_A2_run1_19500101-20981231.nc",
    "5var_day_CCSM3_B1_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_B1_run1_19500101-20991231.nc",
    "5var_day_HadGEM1_A2_run1_19500101-20981231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_HadGEM1_A2_run1_19500101-20981231.nc",
    "5var_day_CGCM3_B1_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CGCM3_B1_run1_19500101-20991231.nc",
    "5var_day_CSIRO35_B1_run1_19500101-20981231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CSIRO35_B1_run1_19500101-20981231.nc",
    "5var_day_GFDL2.1_A2_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_GFDL2.1_A2_run1_19500101-20991231.nc",
    "5var_day_MIROC3.2_A2_run1_19500101-20981231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_MIROC3.2_A2_run1_19500101-20981231.nc",
    "5var_day_ECHAM5_A2_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_ECHAM5_A2_run1_19500101-20991231.nc",
    "5var_day_CCSM3_A2_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_A2_run1_19500101-20991231.nc"

The response is JSON, so you can process it in almost any programming language as well.
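
For example, this rough sketch pulls just the A2/B1 data URLs out of that JSON (it assumes you have jq installed; the grep approach above works just as well):

# Fetch the catalog and print only the data URLs whose names mention A2 or B1
curl -s http://tools.pacificclimate.org/dataportal/hydro_model_out/catalog/ \
  | jq -r 'to_entries[] | select(.key | test("A2|B1")) | .value'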

With each of those URLs, it's easy to request the full domain. If you want all of the hydrologic variables, you can just request those URLs directly. If you really only want SWE, just add .nc?swe to each base URL. For example: http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_A2_run1_19500101-20991231.nc.nc?swe
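
As a rough sketch of that pattern (the two dataset names below are just examples taken from the catalog listing above; if the download requires login you may also need to pass your portal session cookie):

# Download the SWE-only version of a couple of the catalogued datasets
base=http://tools.pacificclimate.org/dataportal/hydro_model_out/data
for ds in 5var_day_CCSM3_A2_run1_19500101-20991231 \
          5var_day_HadCM_B1_run1_19500101-20981231; do
    # quoting the URL keeps the shell from interpreting the ? character
    curl -o "${ds}_swe.nc" "${base}/${ds}.nc.nc?swe"
done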

If, at some point in the future, you want a custom hyperslab of the data, you can request the raster shape of each dataset by adding .dds to the catalog URL.

hiebert@aether:~$ curl http://tools.pacificclimate.org/dataportal/hydro_model_out/catalog/5var_day_CCSM3_A1B_run1_19500101-20991231.nc.dds?swe
Dataset {
    Grid {
        Array:
            Float32 swe[time = 54787][lat = 163][lon = 215];
        Maps:
            Int32 time[time = 54787];
            Float64 lat[lat = 163];
            Float64 lon[lon = 215];
    } swe;
} 5var_day_CCSM3_A1B_run1_19500101-20991231%2Enc;

This shows you the number of time steps and the number of grid cells on each axis. Then if you just want a subdomain, you would request something like this: http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_A2_run1_19500101-20991231.nc.nc?swe[0:100][0:160][0:200]

That would give you the first 101 time steps, and a 161x201 spatial domain.
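
If you make that kind of request from the command line, quote the URL so the shell does not try to expand the square brackets; a sketch of the same request:

# Same subdomain request via curl; the quotes stop the shell from globbing the [..] ranges
curl -o CCSM3_A2_swe_subset.nc \
  "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_A2_run1_19500101-20991231.nc.nc?swe[0:100][0:160][0:200]"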

Hope that helps.

hydropatch commented 9 years ago

Thanks James,

That should work well!

Patrick

hydropatch commented 9 years ago

Thanks for these links James

I was able to begin downloads for 10 out of 15 of the model/scenario outputs; however, a few of them gave me a "500 Internal Server Error". For some reason, these 5 gave the error and would not download. Any ideas?

http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_B1_run1_19500101-20991231.nc.nc?swe
http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_HadGEM1_A2_run1_19500101-20981231.nc.nc?swe
http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_GFDL2.1_A2_run1_19500101-20991231.nc.nc?swe
http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_MIROC3.2_A2_run1_19500101-20981231.nc.nc?swe
http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_ECHAM5_A2_run1_19500101-20991231.nc.nc?swe

Thanks,

Patrick

hydropatch commented 9 years ago

Hi James,

Unfortunately, now only 3 out of the 15 are downloading. I had paused the downloads for all but 3 in order to let those 3 download faster. Now when I try to restart the downloads I get the same 500 Internal Server Error with the sea-level-rise house picture. So, now the ones that are working are:

"5var_day_HadCM_B1_run1_19500101-20981231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_HadCM_B1_run1_19500101-20981231.nc <http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_HadCM_B1_run1_19500101-20981231.nc>.nc?swe",
"5var_day_GFDL2.1_B1_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_GFDL2.1_B1_run1_19500101-20991231.nc <http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_GFDL2.1_B1_run1_19500101-20991231.nc>.nc?swe”
"5var_day_CCSM3_A2_run1_19500101-20991231": "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_A2_run1_19500101-20991231.nc <http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_A2_run1_19500101-20991231.nc>.nc?swe"

But it looks like all the rest are now inaccessible?

Thanks for your help

Patrick

basilveerman commented 9 years ago

Due to the computation required and our current hardware limitations, our setup only allows up to 10 simultaneous downloads. Typically this is not an issue because your browser, depending on its configuration, will queue requests after 3-6 of them. When all 10 slots are hit at once, further data requests stall. The server should simply recover after those are complete, but this does not appear to have happened. I will investigate and let you know how/when to proceed.

-Basil

hydropatch commented 9 years ago

Thanks Basil,

I see. Sorry about that, I will avoid starting more than 4 or 5 downloads in the future!

Please let me know when the system is back up and running.

Cheers Patrick

hydropatch commented 9 years ago

FYI (you may already know this), all downloading has ceased now.

Cheers Patrick

basilveerman commented 9 years ago

Hi Patrick,

I have restarted the server processes and downloads are now functional.

With your requests, we can serve data at about 10 MB/s (100 Mbit) per connection. That would get you the ~7 GB file in just over 10 minutes. I'm going to guess that the network connection between us would become a bottleneck long before we reach that maximum. If you start downloading one file and you are getting 10 MB/s, then try 2. If you are not getting 10 MB/s, then adding additional downloads will not actually get you data any faster.

Hope all goes well, Basil

hydropatch commented 9 years ago

Thanks Basil,

The network connection in my building allows a maximum download of 3 MB/s (it was reduced a few weeks ago after I crashed the system here downloading so much data from you guys!), so we definitely won't reach your maximum of 10 MB/s. The only reason I like to download more than one file at a time is so that I don't have to check and start new downloads as frequently. I will stick to downloading 2 or 3 at a time.

Thanks for your help.

Patrick

jameshiebert commented 9 years ago

I suggest you use curl or wget to script your serial downloads. See http://tools.pacificclimate.org/dataportal/docs/pcds.html?highlight=cookie#advanced-programmatic-usage for details on how to manage the authentication with such tools.
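
A rough sketch of what that can look like (the dataset names are just examples from the catalog, and the cookie value is whatever beaker.session.id your browser holds after you log in to the portal):

# Serial downloads with wget, passing the portal session cookie explicitly
# Replace SESSION_ID with the beaker.session.id value from your logged-in browser
SESSION_ID=your-session-id-here
for ds in 5var_day_CCSM3_A2_run1_19500101-20991231 \
          5var_day_HadCM_B1_run1_19500101-20981231; do
    wget --output-document="${ds}_swe.nc" \
         --header "Cookie: beaker.session.id=${SESSION_ID}" \
         "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/${ds}.nc.nc?swe"
done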

hydropatch commented 9 years ago

Hi James and Basil,

We are also interested in obtaining shape files (polygons) that delineate the boundary of the modelled data (i.e. the Campbell, Fraser, Upper Columbia and Peace watershed boundaries). Are you able to make this data available to me?

Thanks Patrick

AreliaTW commented 9 years ago

Hi Patrick,

I can provide those shape files. I'll get them to you by next Monday.

Arelia

hydropatch commented 9 years ago

Thanks Arelia.

AreliaTW commented 9 years ago

Actually, here they are: http://www.pacificclimate.org/~wernera/VICGEN1_ShapeFiles/

There is one point shape file with the station locations and four polygon shape files (including all modelled sub-basins of the Campbell, Upper Columbia, Fraser and Peace).

Arelia

hydropatch commented 9 years ago

Awesome. Thanks Arelia.

hydropatch commented 9 years ago

Hello again

For some reason I got the 500 Internal Server Error for two of the files. They dropped out mid-download and when I went to restart, one of them seems to be downloading fine (starting from scratch) but the other continues to give the 500 Internal Server Error. So far 4 files have downloaded completely.

This one gives the 500 error: http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CGCM3_A2_run1_19500101-20991231.nc.nc?swe
This one failed mid-download, but now seems to be downloading fine: http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_MIROC3.2_B1_run1_19500101-20991231.nc.nc?swe

FYI I am downloading files one at a time now.

Cheers Patrick

basilveerman commented 9 years ago

Hi Patrick,

I think we're hitting a bug in our data serving software, separate from the issue of high load. I'm looking at the current activity and we have worker processes available to serve requests, but large requests for this dataset in particular are failing. Other datasets with files as large as 330GB appear to transfer fine.

This is a new issue for us and will likely take some time to track down.

In the meantime, I've set up a development site with this data here: http://atlas.pcic.uvic.ca/dataportal/hydro_model_out/map/

You'll need to log in on the map page, get the new catalog here:

http://atlas.pcic.uvic.ca/dataportal/hydro_model_out/map/

And retry those downloads using the same URL patterns as before.

Being a development server, it is not as high performance as our production server and is more prone to crashing. Please download only one file at a time. If this crashes in the same manner, this data may be unavailable through our web portal until we figure out the issue.

hydropatch commented 9 years ago

Hi Basil,

Thanks for looking into this.

I think you pasted the map page twice. Do you have a different link for the new catalog page?

Thanks Patrick

basilveerman commented 9 years ago

Whoops: http://atlas.pcic.uvic.ca/dataportal/hydro_model_out/catalog/

hydropatch commented 9 years ago

Thanks Basil

hydropatch commented 9 years ago

Hi Basil and James,

I have been preoccupied with another project that had a deadline of today, so I haven't had a chance until today to use the .nc files that I downloaded from you. I have a few days to work on this project, but now it looks like there is a problem with the data I downloaded.

Unfortunately the files seem to be corrupt. They will not open in R (nothing wrong with my code; I tested it with some other NetCDF files) or with ncdump.

Do you have any idea why this might be? I downloaded only the SWE variable. Do you think it has something to do with adding .nc?swe (e.g. http://atlas.pcic.uvic.ca/dataportal/data/hydro_model_out/5var_day_CGCM3_B1_run1_19500101-20991231.nc.nc?swe)?

The files that are downloaded have a .nc.nc ending, which seemed strange (e.g. 5var_day_MIROC3.2_A2_run1_19500101-20981231.nc.nc). Is this unusual?

Can you suggest how to proceed?

Thanks Patrick

basilveerman commented 9 years ago

Hi Patrick,

I'm thinking this is related to the bug I mentioned earlier. I'll do some tests and confirm, and if so, we are currently testing a fix and will get that out as soon as possible.

hydropatch commented 9 years ago

Hi Basil,

FYI I just downloaded a random geographical slab from the web portal map, and the file that I downloaded is not corrupt.

Thanks

hydropatch commented 9 years ago

Hi Basil,

I realized that I am really only interested in 3 days per year of SWE from this data. Can you help me write out a URL to download a custom hyperslab?

For example: suppose I wanted to download SWE for the entire province for March 1 1950, April 1 1950, May 1 1950, March 1 1951, April 1 1951, May 1 1951, etc., and I know that these dates correspond to time steps 60, 91, 121, 425, 456, 486.

Something like…

http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_A2_run1_19500101-20991231.nc.nc?swe[60,91,121,425,456,486][0:163][0:215]

I realize I’m getting the 60,91,121,425,456,486 part wrong - can you let me know how I would do this? Or is it possible?

Thanks Patrick

basilveerman commented 9 years ago

The data server conforms to the OPeNDAP specification. As such, you can query and slice using standard operators. Here is a basic guide: http://docs.opendap.org/index.php/QuickStart. AFAIK, you won't be able to request individual days, but you could request something like [60:62:54786]. Take a look at the section about 'stride' values.
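
For instance, a strided request on the time axis could look like this (a sketch only: the stride of 365 is purely illustrative, and the spatial index ranges come from the .dds output shown earlier in the thread):

# OPeNDAP hyperslab syntax is [start:stride:stop]; this grabs every 365th time step
# starting at index 60, over the full spatial grid (lat indices 0-162, lon indices 0-214)
curl -o CCSM3_A2_swe_strided.nc \
  "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CCSM3_A2_run1_19500101-20991231.nc.nc?swe[60:365:54786][0:162][0:214]"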

hydropatch commented 9 years ago

Hi Basil,

I have been trying to download the data using a wget script (see below). I am interested in only one date per year for this batch. Unfortunately, the script seems to work for only a few years (loops) and then the servers at PCIC seem to crash: the downloading stops working, and when I load the Hydrologic Model Data portal map page in a browser it can't be found. I notice that after a few minutes it seems to restart itself.

Any thoughts?

Patrick

dayslist='91 456 822 1187 1552 1917 2283 2648 3013 3378 3744 4109 4474 4839 5205 5570 5935 6300 6666 7031 7396 7761 8127 8492 8857 9222 9588 9953 10318 10683 11049 11414 11779 12144 12510 12875 13240 13605 13971 14336 14701 15066 15432 15797 16162 16527 16893 17258 17623 17988 18354 18719 19084 19449 19815 20180 20545 20910 21276 21641 22006 22371 22737 23102 23467 23832 24198 24563 24928 25293 25659 26024 26389 26754 27120 27485 27850 28215 28581 28946 29311 29676 30042 30407 30772 31137 31503 31868 32233 32598 32964 33329 33694 34059 34425 34790 35155 35520 35886 36251 36616 36981 37347 37712 38077 38442 38808 39173 39538 39903 40269 40634 40999 41364 41730 42095 42460 42825 43191 43556 43921 44286 44652 45017 45382 45747 46113 46478 46843 47208 47574 47939 48304 48669 49035 49400 49765 50130 50496 50861 51226 51591 51957 52322 52687 53052 53418 53783 54148 54513 54879'

currentyear=1950

for numdays in $dayslist; do

wget --output-document=CSIROAPR1$currentyear.nc --header "Cookie: beaker.session.id=0fd0339b566c4d9b940ed5321c7bf872" http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CSIRO35_A2_run1_19500101-20981231.nc.nc?swe[$numdays][0:163][0:215] 2> /dev/null

currentyear=$(($currentyear+1))

done

hydropatch commented 9 years ago

Hi again,

I just wanted to inform you that I downloaded the entire dataset "http://tools.pacificclimate.org/dataportal/hydro_model_out/data/5var_day_CSIRO35_A2_run1_19500101-20981231.nc.nc?swe" and the data is corrupt.

Thanks for your help sorting this out. Please keep me in the loop regarding options to get this data.

Thanks

Patrick

basilveerman commented 9 years ago

Hi Patrick,

That script would have worked for the first 10 years, and then it would have hit the bug in our system that we have been talking about. We developed a fix and have updated the portal to version 2.2.5, which resolves this issue.

Regarding corrupted data: I'll recreate your requests and take a look into it. If I see the same results I will open a separate issue to address that.

Cheers, Basil