Closed bevingtona closed 9 months ago
Current usage logging allows processing speeds to be monitored.
Processing time is roughly linear in watershed area (though it also depends on the number of cutblocks, roads, etc. in the watershed):
- Intercept = 0.306740 minutes
- Slope = 0.000257 minutes/km²
```
Call:
lm(formula = time ~ area_km2, data = dfp)

Residuals:
     Min       1Q   Median       3Q      Max
-0.77953 -0.15724 -0.06444  0.16259  0.87388

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) 3.067e-01  3.948e-02    7.77 4.47e-11 ***
area_km2    2.570e-04  1.491e-05   17.24  < 2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
```
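Using the fitted coefficients, the expected runtime for a watershed of a given area can be back-of-envelope estimated (a minimal sketch; `est_minutes` is a hypothetical helper, with coefficients copied from the `lm()` summary above):

```r
# Rough per-watershed runtime estimate from the fitted linear model:
# time (minutes) = intercept + slope * area (km2)
est_minutes <- function(area_km2) {
  0.306740 + 0.000257 * area_km2
}

est_minutes(1000)  # ~0.56 minutes for a 1,000 km2 watershed
```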
```r
library(DBI)
library(dplyr)
library(stringr)
library(ggplot2)

# Read the usage log and normalize processing_time to minutes
df <- dbReadTable(conn, "usage")

dfp <- df %>%
  as_tibble() %>%
  mutate(time = case_when(
    str_detect(processing_time, "minute") ~ as.numeric(str_replace(processing_time, " minutes elapsed", "")),
    str_detect(processing_time, "sec") ~ as.numeric(str_replace(processing_time, " sec elapsed", "")) / 60
  )) %>%
  filter(action == "processing")

# Processing time vs. watershed area
dfp %>% ggplot(aes(area_km2, time)) +
  geom_point() +
  geom_smooth(method = "lm") +
  theme_bw() + theme(aspect.ratio = 1) +
  scale_x_continuous(n.breaks = 10) +
  scale_y_continuous(n.breaks = 10) +
  labs(x = bquote(Area~(km^2)),
       y = "Minutes to Process")

# Fit the linear model reported above
summary(lm(time ~ area_km2, data = dfp))
```
Currently logging activity by session id to the database... perhaps add https://appsilon.github.io/shiny.telemetry/ for richer, structured telemetry.
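If shiny.telemetry is adopted, wiring it in might look roughly like the sketch below. This is an assumption based on the package's documented pattern (`Telemetry$new()`, `use_telemetry()`, `telemetry$start_session()`); the exact API and default storage backend should be checked against the current shiny.telemetry docs before use:

```r
library(shiny)
library(shiny.telemetry)

# Assumed setup: default backend writes telemetry events locally
telemetry <- Telemetry$new()

ui <- fluidPage(
  use_telemetry(),  # injects the JS needed to track browser-side events
  # ... existing app UI ...
)

server <- function(input, output, session) {
  telemetry$start_session()  # tracks inputs/session by default
  # ... existing server logic ...
}

# shinyApp(ui, server)
```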