The goal is to get discharge at weekly intervals for these sites from 2019-2021.
A bit of background on how bcfishpass does an annual estimate for all stream crossing locations where the data is available:
https://github.com/smnorris/bcfishpass/tree/main/model/02_habitat_linear/discharge
The data used comes from https://www.pacificclimate.org/data/gridded-hydrologic-model-output
This Makefile does most of the work of downloading the data and pulling out info using cdo:
https://github.com/smnorris/bcfishpass/blob/main/model/02_habitat_linear/discharge/Makefile
It might be a bit of gymnastics to get weekly values, but I'll touch base with Norris and see what he says.
We have estimates of discharge for the entire Fraser, but the dataset built from historic (real-world) inputs to the VIC model only runs to the end of 2012.
We can get estimated discharge for 2019-2021 from the projected data. If we wanted to use this projected, modelled data for the time period of our existing water temperature dataset, we would need to pick one of the global climate models and an emissions scenario. Attached (derived with cdo, a command line program installed with brew on mac) is what the data looks like for the second generation Canadian Earth System Model (CanESM2) and the RCP 4.5 climate change scenario.
We can compute weekly averages for both the historic and projected datasets (see the sketch after the cdo commands below). Perhaps neither of these options is really that great.
cdo info allwsbc.CanESM2_rcp45_r1i1p1.1945to2099.BASEFLOW.nc.nc > info_baseflow.txt
cdo info allwsbc.CanESM2_rcp45_r1i1p1.1945to2099.RUNOFF.nc.nc > info_runoff.txt
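For the weekly averaging step: once a daily series has been extracted for a point, something like the below should work. This is a minimal R sketch with made-up data, just to show the aggregation; cdo also has time-averaging operators that might do this directly on the NetCDF, but I haven't worked that out yet.

library(dplyr)
library(lubridate)

# made-up daily discharge series standing in for values extracted from the model output
daily <- tibble(
  date = seq(as.Date("2019-01-01"), as.Date("2021-12-31"), by = "day"),
  discharge = runif(1096)
)

# collapse to weekly means (weeks starting Monday)
weekly <- daily |>
  mutate(week = floor_date(date, unit = "week", week_start = 1)) |>
  group_by(week) |>
  summarise(discharge_mean = mean(discharge), .groups = "drop")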
Just to record this here, climate scenarios are explained in:
https://journals.ametsoc.org/view/journals/clim/28/3/jcli-d-14-00636.1.xml?tab_body=pdf
Selected ACCESS1-0 as it has only one model contributor. Apparently over the next 50 years the emissions scenario shouldn't make much difference, so chose rcp85.
CanESM2 is from the Canadian Centre for Climate Modelling and Analysis but has 5 model contributors.
With the help of ChatGPT-4 and the PCIC help pages, I used this call to get the URL:
wget --output-document=- https://data.pacificclimate.org/portal/hydro_model_out/catalog/catalog.json 2> /dev/null | jq -r 'to_entries[] | select(.key | test("BASEFLOW";"i") and test("CanESM2_rcp45";"i")) | .value'
https://data.pacificclimate.org/data/hydro_model_out/allwsbc.CanESM2_rcp45_r1i1p1.1945to2099.BASEFLOW.nc
The same thing in R is:
library(jsonlite)
library(stringr)
# Read the JSON data into R
json_data <- fromJSON("https://data.pacificclimate.org/portal/hydro_model_out/catalog/catalog.json")
# Filter the data
filtered_data <- json_data[str_detect(names(json_data), "BASEFLOW") & str_detect(names(json_data), "CanESM2_rcp45")]
The result is:
$`BASEFLOW_day_VICGL_CanESM2_rcp45_r1i1p1_19450101-20991231_columbia`
[1] "https://data.pacificclimate.org/data/hydro_model_out/allwsbc.CanESM2_rcp45_r1i1p1.1945to2099.BASEFLOW.nc"
Tried to get some info about the dataset on the command line that might help figure out how to describe the bounding box:
wget --output-document=- https://data.pacificclimate.org/portal/hydro_model_out/allwsbc.CanESM2_rcp45_r1i1p1.1945to2099.BASEFLOW.nc 2> /dev/null
Likely because this is the portal (not the data, and also not the Dataset Descriptor Structure), this is not giving us info about the variables (didn't read the instructions carefully) - but it does give some info, including this little gem:
bc_basemap_url: 'https://services.pacificclimate.org/tiles/bc-albers-lite/${z}/${x}/${y}.png
cool.
Tried modifying it to look like the help doc example, like so:
$ curl 'https://data.pacificclimate.org/data/hydro_model_out/allwsbc.CanESM2_rcp45_r1i1p1.1945to2099.BASEFLOW.nc.dds'
Dataset {
Grid {
Array:
Int16 BASEFLOW[time = 56613][lat = 367][lon = 496];
Maps:
Float64 time[time = 56613];
Float64 lat[lat = 367];
Float64 lon[lon = 496];
} BASEFLOW;
Float64 lat[lat = 367];
Float64 lon[lon = 496];
Float64 time[time = 56613];
} allwsbc%2ECanESM2_rcp45_r1i1p1%2E1945to2099%2EBASEFLOW%2Enc;
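Side note on the DDS above: it implies the dataset can be subset with OPeNDAP constraint expressions appended to the URL, which is probably the way to see how the indices map to dates and lat/long. A rough R sketch, assuming the server supports the standard .ascii and .das responses (not fully verified):

# OPeNDAP endpoint identified above
base <- "https://data.pacificclimate.org/data/hydro_model_out/allwsbc.CanESM2_rcp45_r1i1p1.1945to2099.BASEFLOW.nc"

# coordinate vectors as plain text - shows what each index corresponds to
time_txt <- readLines(paste0(base, ".ascii?time"))
lat_txt  <- readLines(paste0(base, ".ascii?lat"))
lon_txt  <- readLines(paste0(base, ".ascii?lon"))

# attribute metadata (e.g. the time units needed to turn time indices into dates)
das_txt  <- readLines(paste0(base, ".das"))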
Tried the following to dig a bit deeper to see how these indices map to date and lat/long
airvine at Allans-MacBook-Pro in ~/Projects/repo
$ wget --output-document=- https://data.pacificclimate.org/portal/hydro_model_out/catalog/BASEFLOW_day_VICGL_CanESM2_rcp45_r1i1p1_19450101-20991231_columbia.nc.dds
WARNING: timestamping does nothing in combination with -O. See the manual
for details.
--2024-02-05 23:13:56-- https://data.pacificclimate.org/portal/hydro_model_out/catalog/BASEFLOW_day_VICGL_CanESM2_rcp45_r1i1p1_19450101-20991231_columbia.nc.dds
Resolving data.pacificclimate.org (data.pacificclimate.org)... 206.12.89.147
Connecting to data.pacificclimate.org (data.pacificclimate.org)|206.12.89.147|:443... connected.
HTTP request sent, awaiting response...
HTTP/1.1 404 Not Found
server: gunicorn/19.8.1
date: Tue, 06 Feb 2024 07:13:56 GMT
content-type: text/html; charset=UTF-8
content-length: 159
2024-02-05 23:13:56 ERROR 404: Not Found.
I would have expected this was due to an error with my syntax, but the example in the help doc gives a 404 as well.
Time to switch to R...
Getting close - https://github.com/NewGraphEnvironment/fish-passage-22/blob/read-discharge-pcic/read-discharge-pcic.R
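For the record, the general approach I'm aiming for is roughly the below - a minimal sketch only (not necessarily what the linked script ends up doing), assuming an ncdf4 build with OPeNDAP support and using made-up site coordinates:

library(ncdf4)

# OPeNDAP endpoint for the BASEFLOW file identified above
url <- "https://data.pacificclimate.org/data/hydro_model_out/allwsbc.CanESM2_rcp45_r1i1p1.1945to2099.BASEFLOW.nc"
nc <- nc_open(url)

# coordinate vectors - this is how the DDS indices map to real values
lats  <- ncvar_get(nc, "lat")
lons  <- ncvar_get(nc, "lon")
times <- ncvar_get(nc, "time")
time_units <- ncatt_get(nc, "time", "units")$value  # likely "days since ..." - check before converting

# nearest grid cell to an example site (made-up coordinates)
site_lat <- 51.0
site_lon <- -122.0
i_lat <- which.min(abs(lats - site_lat))
i_lon <- which.min(abs(lons - site_lon))

# pull the daily series for that single cell only; check nc$var$BASEFLOW$dim
# for the actual dimension order before trusting start/count
baseflow <- ncvar_get(nc, "BASEFLOW",
                      start = c(i_lon, i_lat, 1),
                      count = c(1, 1, -1))
nc_close(nc)

From there, converting times with the units attribute gives real dates, and the weekly means can be computed as in the sketch further up.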
TBC
points_to_get_discharge.xlsx