HARPgroup / HARParchive

This repo houses HARP code development items, resources, and intermediate work products.

REST / RomDataSource, metrics and analysis #1354

Open rburghol opened 1 month ago

rburghol commented 1 month ago

Analysis Steps

Overview

Data Model & REST

Data Model Outline

Detail View of REST Feature Query
```r
feature <- RomFeature$new(
  ds,
  list(
    hydrocode=coverage_hydrocode,
    ftype=coverage_ftype,
    bundle=coverage_bundle
  ),
  TRUE
)
```

Using om_vahydro_metric_grid to retrieve data


```r
# GET VAHydro stormVol_prism precip_annual_max_in DATA for watershed features
mdf <- data.frame(
  'model_version' = c('cbp-6.1'),
  'runid' = c('stormVol_prism'),
  'metric' = c('precip_annual_max_in'),
  'runlabel' = c('precip_annual_max_in')
)
met_data <- om_vahydro_metric_grid(
  metric = metric, runids = mdf, bundle="watershed", ftype="usgs_full_drainage",
  base_url = paste(site,'entity-model-prop-level-export',sep="/"),
  ds = ds
)
```
nathanielf22 commented 5 days ago

Here is the current function, which I updated a bit tonight. It works for any PRISM, daymet, or NLDAS2 data, such as: `read.csv("http://deq1.bse.vt.edu:81/met/stormVol_prism/precip/usgs_ws_01613900-PRISM-all.csv")`

When calculating a metric, you can run something like this to return the lowest 90-day precipitation: `summary_analytics(prism)$l90_precip_in`
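For anyone skimming: the l90 metric here is the minimum of a 90-day right-aligned rolling mean of daily precipitation. A self-contained base-R sketch with synthetic data (variable names are illustrative only; the actual function uses `zoo::rollapply` on daily sums):

```r
# Illustration of l90 on one year of synthetic daily precip totals (inches).
set.seed(42)
daily_precip <- runif(365, min = 0, max = 0.5)

# right-aligned 90-day rolling mean, mirroring rollapply(..., align = "right")
roll_mean_90 <- sapply(
  90:length(daily_precip),
  function(i) mean(daily_precip[(i - 89):i])
)

# l90 = the lowest 90-day average precipitation in the record
l90_precip_in <- min(roll_mean_90)
```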

```r
library(tidyr)
library(sqldf)
library(zoo)

# For ANY data
summary_analytics <- function(df){
  df <- separate(data = df, col = obs_date, into = c("obs_date","obs_time"), sep = " ")
  df <- separate(data = df, col = obs_date, into = c("obs_year","obs_month","obs_day"), sep = "-")

  # creating a yearly summary with each year and its total precip
  yearly.summary <- sqldf(
    "SELECT obs_year, SUM(precip_in) AS total_precip
     FROM df
     GROUP BY obs_year"
  )

  # summary analytics
  precip_annual_max_in <- max(yearly.summary$total_precip)
  precip_annual_max_year <- yearly.summary$obs_year[which.max(yearly.summary$total_precip)]
  precip_annual_mean_in <- mean(yearly.summary$total_precip)

  # For min values and years, we can exclude the first and last rows since
  # the current year and the first year are incomplete data
  precip_annual_min_in <- min(yearly.summary$total_precip[c(-nrow(yearly.summary), -1)])
  precip_annual_min_year <- yearly.summary$obs_year[which.min(yearly.summary$total_precip[c(-nrow(yearly.summary), -1)])]

  # Create daily summary to use for all data. This makes hourly data daily sums.
  daily.summary <- sqldf(
    "SELECT obs_year, obs_month, obs_day, SUM(precip_in) AS total_precip
     FROM df
     GROUP BY obs_year, obs_month, obs_day"
  )
  precip_daily_max_in <- max(daily.summary$total_precip)

  # if/else evaluates the number of unique hours; if 24, the hourly max is
  # taken. If not, hourly max is NA.
  # As an alternative to an NA value for hourly precip in daily data, we
  # could look at the rainfall distribution table for a type II storm and
  # multiply by the max P(t)/P(24) value. This function uses NA for daily data.
  if (length(unique(df$hr)) == 24) {
    precip_hourly_max_in <- max(df$precip_in)
  } else {
    precip_hourly_max_in <- NA
  }

  # l90 done using the zoo package; IHA did not work:
  # Qout_zoo <- zoo(as.numeric(df$precip_in), order.by = df$tstime)
  # Qout_g2 <- data.frame(group2(Qout_zoo))
  l90_precip_in <- min(
    rollapply(daily.summary$total_precip, width = 90, FUN = mean, fill = NA, align = "right"),
    na.rm = TRUE
  )

  # makes a data frame with all 8 metrics
  metrics <- data.frame(
    precip_annual_max_in = precip_annual_max_in,
    precip_annual_max_year = precip_annual_max_year,
    precip_annual_mean_in = precip_annual_mean_in,
    precip_annual_min_in = precip_annual_min_in,
    precip_annual_min_year = precip_annual_min_year,
    precip_daily_max_in = precip_daily_max_in,
    precip_hourly_max_in = precip_hourly_max_in,
    l90_precip_in = l90_precip_in
  )
  return(metrics)
}
```
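Putting it together, a usage sketch along the lines described above (not run here, since it needs network access plus the tidyr/sqldf/zoo packages; the URL and column layout are taken from the comment above):

```r
# Assumes the CSV layout described above: obs_date, precip_in, and hr columns.
prism <- read.csv("http://deq1.bse.vt.edu:81/met/stormVol_prism/precip/usgs_ws_01613900-PRISM-all.csv")
metrics <- summary_analytics(prism)
metrics$l90_precip_in         # minimum 90-day mean precipitation (inches)
metrics$precip_annual_max_in  # maximum annual precipitation total
```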

I can run through this in detail next week since I know this is a lot to look at. I'll be out of town tomorrow afternoon through Sunday and likely out of service, but I will do my best to respond before I leave or after I get back this weekend. In the meantime, I wanted to make sure this is stored here.

rburghol commented 4 days ago

@nathanielf22 hey Nate, thanks a bunch for pushing this up before you headed out. Very much look forward to working with this, and congratulations on the progress.