All of these Bayesian packages would allow us to do some more sophisticated analyses such as combining information from multiple data sources, imposing an explicit hierarchy for parameter variability, and gapfilling missing data with uncertainty.
I spent a little bit of time playing with this today. Here's a first pass at a Nimble model (which, with minor tweaking, should also work with JAGS):
# Process model -- random walk
gpp[1] ~ dnorm(0, 0.1)
for (t in 2:nT) {
  gpp[t] ~ dnorm(gpp[t-1], tau_rw)
}

# Data model
for (t in 1:nT) {
  # Scale observed chamber respiration (r_chamber, passed in as data) up to total respiration
  r[t] <- r_chamber[t] * (1 + Rho / Beta - Rho)
  nep_pred[t] <- gpp_finality[t] - r[t]
  nep[t] ~ dnorm(nep_pred[t], tau_nep)
  gpp_finality[t] ~ dnorm(gpp[t], tau_finality)
}

# Priors
# (tau_nep and tau_finality are assumed to be passed in as known precisions;
#  they could instead get dgamma priors like tau_rw)
tau_rw ~ dgamma(0.1, 0.1)
Rho ~ dbeta(rho_a, rho_b)
Beta ~ dnorm(Beta_a, Beta_b)
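For what it's worth, here's a rough sketch of how this could be wired up and run end-to-end with nimbleMCMC. The flux_data object, the hyperparameter values, the observation precisions, and the iteration counts are all placeholders I made up, not actual choices:

library(nimble)

# Same model block as above, wrapped in nimbleCode() so this sketch is self-contained
rw_code <- nimbleCode({
  # Process model -- random walk
  gpp[1] ~ dnorm(0, 0.1)
  for (t in 2:nT) {
    gpp[t] ~ dnorm(gpp[t-1], tau_rw)
  }
  # Data model
  for (t in 1:nT) {
    r[t] <- r_chamber[t] * (1 + Rho / Beta - Rho)
    nep_pred[t] <- gpp_finality[t] - r[t]
    nep[t] ~ dnorm(nep_pred[t], tau_nep)
    gpp_finality[t] ~ dnorm(gpp[t], tau_finality)
  }
  # Priors
  tau_rw ~ dgamma(0.1, 0.1)
  Rho ~ dbeta(rho_a, rho_b)
  Beta ~ dnorm(Beta_a, Beta_b)
})

# flux_data: placeholder data frame with nep and r_chamber columns
nT <- nrow(flux_data)
constants <- list(nT = nT, r_chamber = flux_data$r_chamber,
                  rho_a = 1, rho_b = 1,           # placeholder Rho hyperparameters
                  Beta_a = 1, Beta_b = 0.01,      # placeholder Beta hyperparameters
                  tau_nep = 1, tau_finality = 1)  # placeholder observation precisions
data <- list(nep = flux_data$nep)
inits <- list(gpp = rep(1, nT), gpp_finality = rep(1, nT),
              tau_rw = 1, Rho = 0.5, Beta = 1)

samples <- nimbleMCMC(code = rw_code, constants = constants, data = data, inits = inits,
                      monitors = c("gpp", "Rho", "Beta", "tau_rw"),
                      niter = 5000, nburnin = 1000)

By default this returns a matrix of posterior samples with one column per monitored node (gpp[1], gpp[2], ..., Rho, Beta, tau_rw), which is enough to plot the latent GPP time series with credible intervals.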
Here's a slightly more sophisticated version that runs way slower, but reformulates our entire model in terms of observation operators, which is more in line with a typical Bayesian state-space framework:
# Process model -- random walk, truncated so GPP and respiration stay non-negative
gpp[1] ~ T(dnorm(0, 0.1), 0, Inf)
r[1] ~ T(dnorm(0, 0.1), 0, Inf)
for (t in 2:nT) {
  gpp[t] ~ T(dnorm(gpp[t-1], tau_rw), 0, Inf)
  r[t] ~ T(dnorm(r[t-1], tau_rw), 0, Inf)
}

# Observation operators: map the latent states onto the measured quantities
nep[1:nT] <- gpp[1:nT] - r[1:nT]
rb_term <- 1 + Rho / Beta - Rho
rsoil[1:nT] <- r[1:nT] / rb_term

# Data model
for (t in 1:nT) {
  rsoil_obs[t] ~ dnorm(rsoil[t], tau_rsoil)
  nep_obs[t] ~ dnorm(nep[t], tau_nep)
}

# Priors
# (tau_rsoil and tau_nep are assumed to be passed in as known observation precisions)
tau_rw ~ dgamma(0.1, 0.1)
Rho ~ dbeta(rho_a, rho_b)
Beta ~ dnorm(Beta_a, Beta_b)
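On the speed point: if the one-call nimbleMCMC route is too slow for this version, the longer nimble workflow (nimbleModel -> configureMCMC -> buildMCMC -> compileNimble -> runMCMC) gives more control over which samplers get assigned to which nodes, which might help the mixing on the latent states. A minimal sketch, assuming ss_code is the state-space block above wrapped in nimbleCode({ ... }), and that constants, data, and inits are set up along the same lines as the sketch after the first model (rsoil_obs and nep_obs as data; tau_rsoil, tau_nep, and the prior hyperparameters as constants):

library(nimble)

# Build the model graph, then compile it to C++
ss_model <- nimbleModel(ss_code, constants = constants, data = data, inits = inits)
c_model <- compileNimble(ss_model)

# Configure the MCMC; this is where samplers can be swapped, e.g.
#   mcmc_conf$removeSamplers(c("Rho", "Beta"))
#   mcmc_conf$addSampler(target = c("Rho", "Beta"), type = "RW_block")
mcmc_conf <- configureMCMC(ss_model, monitors = c("gpp", "r", "Rho", "Beta", "tau_rw"))

# Build, compile, and run the MCMC
ss_mcmc <- buildMCMC(mcmc_conf)
c_mcmc <- compileNimble(ss_mcmc, project = ss_model)
samples <- runMCMC(c_mcmc, niter = 20000, nburnin = 5000)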
In both versions, I use a random walk as the process model, but we could consider something slightly more mechanistic, like a simple light use efficiency model; for instance, what I did for a class project in grad school: https://github.com/EcoForecast/GPP/blob/master/gpp.lue.simple.bug.
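To make that concrete (this isn't the code at that link, just the general shape of the idea, with made-up names and priors), the light use efficiency version would swap the random walk on gpp for a driver-based process model, something like:

# Hypothetical LUE process model -- par[t] is observed light passed in as data,
# and lue and tau_gpp are made-up names with placeholder priors
for (t in 1:nT) {
  gpp_pred[t] <- lue * par[t]
  gpp[t] ~ T(dnorm(gpp_pred[t], tau_gpp), 0, Inf)
}
lue ~ dunif(0, 1)
tau_gpp ~ dgamma(0.1, 0.1)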
Here's a full (semi-functional) script with the second model (the first one is in the previous commit): https://github.com/ashiklom/rubisco-gpp/blob/nimble/howland-ts-nimble.R
Doing all this would be somewhat more involved, but could ultimately lead to more interesting results. Food for thought, anyway -- maybe this could be a follow-up paper (once we actually finish this one, which we still have to do!).