HARPgroup / HARParchive

This repo houses HARP code development items, resources, and intermediate work products.

Script to Export river h5s #312

Open rburghol opened 2 years ago

rburghol commented 2 years ago

Note: This has been updated to a generic script: export_hsp_h5.R

juliabruneau commented 2 years ago

Suggestions From HSPF Manual:

(image: suggested export variables from the HSPF manual)

I'm most unsure about ps_mgd. The IVOL variable is the only one in the HYDR table that has anything to do with water entering the river segment.

These are also in the HYDR table:

O1 - zeros
O2 - zeros
O3 - values (rates of outflow from individual exits)

OVOL1 - zeros
OVOL2 - zeros
OVOL3 - values (volume of outflow from individual exits)

juliabruneau commented 2 years ago

Data Harvesting Script for River Segments

Separate harvesting script created for river segments: batch_harvest_river.bat

Use (from /opt/model/p53/p532c-sova): bash ~/HARParchive/HARP-2022-Summer/AutomatedScripts/batch_harvest_river.bat hsp2_2022 OR1_7700_7980

H5 file path = $CBP_EXPORT_DIR/river/$scenario_name/h5/$basin'.h5'
Output file path = $CBP_EXPORT_DIR/river/$scenario_name/hydr/$basin

Ex:

juliasb@deq2:/opt/model/p53/p532c-sova$ bash ~/HARParchive/HARP-2022-Summer/AutomatedScripts/batch_harvest_river.bat hsp2_2022 OR1_7700_7980
Loading /opt/model/p53/p532c-sova/hspf.config
CBP_ROOT: /opt/model/p53/p532c-sova
CBP_EXPORT_DIR: /media/model/p532/out
Warning message:
In H5Dread(did, bit64conversion = "double") :
  integer precision lost while converting 64-bit integer from HDF5 to double in R.
Choose bit64conversion='bit64' to avoid data loss and see the vignette 'rhdf5' for more details about 64-bit integers.
hydr csv created

Found in:

juliasb@deq2:/media/model/p532/out/river/hsp2_2022/hydr$ ls
OR1_7700_7980_hydr.csv
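To make the layout concrete, the path templates above can be sketched in shell. The variable values are taken from the hspf.config log in this comment; this is a sketch, not the actual batch_harvest_river.bat code:

```shell
# Sketch of how the harvest paths fit together (values from the log above;
# not the actual batch_harvest_river.bat code).
CBP_EXPORT_DIR=/media/model/p532/out
scenario_name=hsp2_2022
basin=OR1_7700_7980
h5_file="$CBP_EXPORT_DIR/river/$scenario_name/h5/$basin.h5"
out_dir="$CBP_EXPORT_DIR/river/$scenario_name/hydr/$basin"
echo "$h5_file"
echo "$out_dir"
```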
rburghol commented 2 years ago

@juliabruneau can you paste a sample of the first few lines of the output CSV file into the issue main body, so we can see the columns available and some sample data? The head command is useful for that; something like head /media/model/p532/out/river/hsp2_2022/hydr/OR1_7700_7980_hydr.csv should print the first 10 lines. (FYI: tail does the opposite, showing the last few lines of a file.)

Also, I think:

glenncampagna commented 2 years ago

Findings about point source in river h5 file

rburghol commented 2 years ago

OK, there but empty is good. One thing we'll need to do is ensure that we have a segment that SHOULD have point sources, or withdrawals for that matter. I'll try to find one! For now, get the aliases in there and focus on the batch script.

glenncampagna commented 2 years ago

Testing our R script which will perform analysis and send exports from hydr table to VAhydro

Script: hsp_hydr.R https://github.com/HARPgroup/HARParchive/blob/master/HARP-2022-Summer/AutomatedScripts/hsp_hydr.R

rburghol commented 2 years ago

Two things:

Keep me posted!

glenncampagna commented 2 years ago

We have our hsp_hydr.R script working and theoretically sending constants and graphs to VAhydro soon, but we aren't sure where to check on these, because we don't see an existing river feature in VAHydro for OR1_7700_7980. This was why we'd originally thought to save the river segment, @rburghol. We only see the features A51011_OR1...(HUC8) and A51037_OR1...(HUC8)

rburghol commented 2 years ago

Awesome work today. To find the River segment OR1_7700_7980, check with @megpritch and @nicoledarling as they just went through this process.

Since we have to wait to get the point source and withdrawals sorted out, let's move forward on the flow analysis -- if you haven't already done that! The R script above, "waterSupplyModelNode.R", has many metrics that it calculates as a function of the Qout variable. If you can replicate all of those, that would be excellent.

glenncampagna commented 2 years ago

We modified our hsp_hydr.R script to create a model run within OR1_7700_7980 - Club Creek. We had to change the type from vahydro to cbp532 to do this. Should we have called the feature with hydrocode vahydrosw_wshed_OR1_7700_7980 and the vahydro ftype instead?

(image: screenshot of the model run created in VAhydro)

We are in the process of incorporating the Qout metrics into our script and testing

juliabruneau commented 2 years ago

Update on hsp_hydr.R flow analysis:

Based on waterSupplyModelNode.R:

Exported Constants:

  • Mean Qout - Qout_mean
  • 30-day Low Flow - l30_Qout
  • 90-day Low Flow - l90_Qout
  • August Low Flow - alf
  • September 10th Percentile - sept_10
  • 7-day Low Flow Over 10 Years - x7q10

Output File Path - /media/model/p532/out/river/hsp2_2022/hydr/[river_segment_name]_hydr.csv
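As a rough illustration of the low-flow arithmetic: an n-day low flow (l30_Qout, l90_Qout) is the minimum of the n-day moving average of Qout. The awk sketch below is NOT the actual hsp_hydr.R implementation, and the flow values are made up:

```shell
# Illustration of the n-day low-flow arithmetic only (made-up values, n=2 for brevity).
metrics=$(printf '%s\n' 10 20 30 40 50 | awk -v n=2 '
  { q[NR] = $1; sum += $1 }
  END {
    mean = sum / NR
    low = 1e18
    for (i = n; i <= NR; i++) {          # slide an n-day window over the series
      s = 0
      for (j = i - n + 1; j <= i; j++) s += q[j]
      if (s / n < low) low = s / n       # keep the smallest window mean
    }
    printf "Qout_mean=%.1f l%d_Qout=%.1f", mean, n, low
  }')
echo "$metrics"
```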

Planned Exported Plots: Qout vs Time

Potential Exported Constants (after calculating wd/ps_cumulative):

  • Mean Withdrawals - wd_mgd
  • Mean Point Source - ps_mgd
  • net_consumption_mgd = wd_cumulative - ps_cumulative
  • Qbaseline = Qout + (wd_cumulative - ps_cumulative)
  • consumptive_use_frac = 1.0 - (Qout / Qbaseline)
  • daily_consumptive_use_fraction = mean(consumptive_use_frac)
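A single-day illustration of that consumptive-use arithmetic, with made-up numbers (not model output; in the real script Qout would need to be in the same units as wd/ps before combining, e.g. via the 1.547 cfs-per-MGD factor, which is omitted here):

```shell
# One-day example of the consumptive-use arithmetic above (all inputs made up;
# unit conversion between cfs and MGD deliberately omitted for brevity).
cu=$(awk 'BEGIN {
  Qout = 90.0; wd_cumulative = 20.0; ps_cumulative = 10.0
  net_consumption_mgd = wd_cumulative - ps_cumulative
  Qbaseline = Qout + net_consumption_mgd              # Qout + (wd - ps)
  consumptive_use_frac = 1.0 - (Qout / Qbaseline)
  printf "net=%.1f Qbaseline=%.1f frac=%.2f", net_consumption_mgd, Qbaseline, consumptive_use_frac
}')
echo "$cu"
```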

glenncampagna commented 2 years ago

We added a container for Qout in VAhydro for better organization:

(image: Qout container in VAhydro)

Here is the figure of Qout we generated:

(image: plot of Qout over time)

It might help us to remove the existing scenario hsp2_2022 from this river seg, because there are some duplicates from testing; then we can re-run and have everything in the right place with no duplicates. Link to scenario: http://deq1.bse.vt.edu/d.dh/om-model-info/6850637/dh_properties

juliabruneau commented 2 years ago

@rburghol Should we export the updated constants and graph into the cbp532 or vahydro ftype?

glenncampagna commented 2 years ago

Updated use for script to analyze river seg hydr files:

Script: hsp_hydr.R https://github.com/HARPgroup/HARParchive/blob/master/HARP-2022-Summer/AutomatedScripts/hsp_hydr.R
Use: Rscript hsp_hydr.R [river seg] [scenario name] [scenario folder path to locate hydr csv]
Example: Rscript HARParchive/HARP-2022-Summer/AutomatedScripts/hsp_hydr.R OR1_7700_7980 hsp2_2022 /media/model/p532/out/river/hsp2_2022

rburghol commented 2 years ago

@juliabruneau @glenncampagna Couple things -- the adding of the Qout aliases etc. should be done separately from the analysis step, so can you:

  • split that out into another script that 1) reads the file, 2) creates the aliases/unit conversion, and 3) saves the file
  • move the analysis to a totally separate script
  • change the unit conversion for cfs to MGD to 1.547, rather than 1.55; it's what we use everywhere else, so we can be consistent

The separate-scripts ask is because I'd like the Qout and other columns to always be there for any script to operate on, so we won't have to do multiple conversions if we access them with different analytical routines.

juliabruneau commented 2 years ago

@juliabruneau @glenncampagna Couple things -- the adding of the Qout aliases etc. should be done separately from the analysis step, so can you:

  • split that out into another script that 1) reads the file, 2) creates the aliases/unit conversion, and 3) saves the file
  • move the analysis to a totally separate script
  • change the unit conversion for cfs to MGD to 1.547, rather than 1.55; it's what we use everywhere else, so we can be consistent

The separate-scripts ask is because I'd like the Qout and other columns to always be there for any script to operate on, so we won't have to do multiple conversions if we access them with different analytical routines.
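A minimal sketch of applying that 1.547 factor to a CSV column (1 MGD = 1.547 cfs, so MGD = cfs / 1.547). The column names and sample values here are hypothetical, not the real hydr.csv layout:

```shell
# Sketch of the cfs -> MGD conversion with the 1.547 factor.
# Hypothetical column names and made-up sample values.
cat > sample_hydr.csv <<'EOF'
index,Qout_cfs
1984-01-01,15.47
1984-01-02,30.94
EOF
awk -F, 'NR == 1 { print $0 ",Qout_mgd"; next }   # extend the header row
         { printf "%s,%.2f\n", $0, $2 / 1.547 }   # append the converted column
        ' sample_hydr.csv > sample_hydr_mgd.csv
cat sample_hydr_mgd.csv
```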

Working on creating the two scripts - one for creating the aliases and one for the analysis.

Just to clarify, @rburghol:

juliabruneau commented 2 years ago

Two Rscripts for River Segments:

Name: hsp_hydr_conversion.R
Use: Rscript hsp_hydr_conversion.R [river seg] [scenario name] [scenario folder path to locate hydr csv]
Ex: Rscript HARParchive/HARP-2022-Summer/AutomatedScripts/hsp_hydr_conversion.R OR1_7700_7980 hsp2_2022 /media/model/p532/out/river/hsp2_2022
Output: CSV file with an added column for the wanted unit conversion (proof below; ROVOL_cfs at the end):

OR1_7700_7980_hydr.csv:
"index","DEP","IVOL","O1","O2","O3","OVOL1","OVOL2","OVOL3","PRSUPY","RO","ROVOL","SAREA","TAU","USTAR","VOL","VOLEV","date","week","month","year","ROVOL_cfs"
1984-01-01 01:00:00,0.24150724709034,8.84726428985596,0,0,2.05518269538879,0,0,0.122939802706242,0,2.05518269538879,0.122939802706242,67.4736633300781,0.0234425533562899,0

Name: hsp_hydr_analysis.R
Use: Rscript hsp_hydr_analysis.R [river seg] [scenario name] [scenario folder path to locate hydr csv]
Ex: Rscript HARParchive/HARP-2022-Summer/AutomatedScripts/hsp_hydr_analysis.R OR1_7700_7980 hsp2_2022 /media/model/p532/out/river/hsp2_2022
Output: Constants and graphs into VAHydro

rburghol commented 2 years ago

Hey @glenncampagna @juliabruneau I am trying to run the summary script for river segments, but it is not working for Rockfish. I want to integrate these into the run_river.csh script, but I will hold off until I can get this to work. I pulled the HARParchive repository at 9:15 AM, so I think it is the most recent code.

Tried:

cd /opt/model/p53/p532c-sova
. hspf_config
Rscript $CBP_ROOT/run/export/hsp_hydr_analysis.R JL2_6850_6890 hsp2_2022 $CBP_EXPORT_DIR/river/hsp2_2022

Help?

Integrated:

cp ../../HARParchive/HARP-2022-Summer/AutomatedScripts/hsp_hydr_analysis.R ./run/export/
juliabruneau commented 2 years ago

Hey @glenncampagna @juliabruneau I am trying to run the summary script for river segments, but it is not working for Rockfish. I want to integrate these into the run_river.csh script, but I will hold off until I can get this to work. I pulled the HARParchive repository at 9:15 AM, so I think it is the most recent code.

Tried:

cd /opt/model/p53/p532c-sova
. hspf_config
Rscript $CBP_ROOT/run/export/hsp_hydr_analysis.R JL2_6850_6890 hsp2_2022 $CBP_EXPORT_DIR/river/hsp2_2022

Help?

Integrated:

cp ../../HARParchive/HARP-2022-Summer/AutomatedScripts/hsp_hydr_analysis.R ./run/export/

We ran the Rscript like you did, and found the error 'no rows to aggregate'. This is because we want to do our analysis in cfs, which means we rely on the conversion script that generates Qout (cfs). The river seg didn't have the conversion script run on it for some reason.

We ran the conversion on the hydr.csv now and that should fix the issue.

Note: the way the scripts are structured, the hydr conversion script should be run before the hydr analysis script is run
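That ordering can be captured in a small wrapper sketch. This is hypothetical (no such wrapper exists in the repo as far as this thread shows); it only prints the two commands in the required order rather than executing them:

```shell
# Hypothetical wrapper enforcing the ordering: conversion first, then analysis.
# It prints the commands in order instead of executing them.
run_hydr() {
  seg="$1"; scen="$2"; outdir="$3"
  echo "Rscript hsp_hydr_conversion.R $seg $scen $outdir"
  echo "Rscript hsp_hydr_analysis.R $seg $scen $outdir"
}
ordered=$(run_hydr OR1_7700_7980 hsp2_2022 /media/model/p532/out/river/hsp2_2022)
echo "$ordered"
```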

rburghol commented 2 years ago

Oh!

The river seg didn't have the conversion script run on it for some reason.

That's because I didn't include that in the run_river.csh workflow - thanks!