kenkoonwong / everyday_is_a_school_day


blog/frontdoor/ #3


utterances-bot commented 1 year ago

Front-door Adjustment | Everyday Is A School Day

Front-door adjustment: a superhero method for handling unobserved confounding by using mediators (if present) to estimate causal effects accurately

https://www.kenkoonwong.com/blog/frontdoor/
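For a quick, concrete version of the idea: under the post's DAG (X -> Z -> Y, with an unobserved U -> X and U -> Y), the front-door estimate can be sketched with two plain regressions. This is only a minimal sketch, assuming the post's simulated data frame df with columns x, z, and y (the same object used in the comment below).

# Front-door estimate via two regressions (sketch; assumes the post's `df`)
b_zx <- coef(lm(z ~ x, data = df))["x"]      # X -> Z: no open back-door path between X and Z
b_yz <- coef(lm(y ~ z + x, data = df))["z"]  # Z -> Y: conditioning on X blocks Z <- X <- U -> Y
b_zx * b_yz                                  # estimated causal effect of X on Y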

joscani commented 1 year ago

Great post. In a Bayesian approach, the unobserved confounder U can be treated as missing data: every value of U becomes a parameter to estimate.


library(cmdstanr)
library(rethinking)

dat <- list(
    N = nrow(df),
    Y = df$y,
    X = df$x,
    Z = df$z
)

flbi <- ulam(
    alist(
        # Y model: only the arrows pointing directly into Y (Z and U)
        Y ~ normal(mu_y, sigma_y), 
        mu_y <-  a_y + b_z * Z + k_1 * U[i], 

        # Z model 
        Z ~ normal(mu_z, sigma_z), 
        mu_z <- a_z + b_x * X, 

        # X model
        X ~ normal(mu_x, sigma_x), 
        mu_x <-  a_x + k_2 * U[i], 

        # Unmeasured confounder: U is not in the data list, so the model
        # estimates one latent U parameter per observation (1000 here)
        vector[N]:U ~ normal(0,1),

        c(a_y,b_z, k_1, a_z, b_x, a_x, k_2) ~ normal( 0 , 1 ),
        c(sigma_y,sigma_z, sigma_x) ~ exponential( 1 )

    ), data=dat , chains=4 , cores=4 , warmup = 500, iter=2500 , cmdstan=TRUE )

precis(flbi)
1000 vector or matrix parameters hidden. Use depth=2 to show them.
         mean   sd  5.5% 94.5% n_eff Rhat4
k_2     -0.47 0.28 -1.02 -0.19     7  1.64
a_x      0.00 0.04 -0.06  0.05  1145  1.00
b_x      0.64 0.03  0.60  0.69  1402  1.00
a_z      0.02 0.03 -0.04  0.07  3104  1.00
k_1     -0.78 0.36 -1.21 -0.25     4  1.94
b_z     -0.48 0.04 -0.54 -0.42   930  1.00
a_y     -0.01 0.04 -0.07  0.05   598  1.01
sigma_x  0.96 0.22  0.47  1.13    10  1.55
sigma_z  1.03 0.02  0.99  1.07  2475  1.00
sigma_y  0.79 0.36  0.25  1.20     3  2.33
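Note the low n_eff and the Rhat values well above 1 for k_1, k_2, sigma_x, and sigma_y: the chains mix poorly, presumably because the latent U and its two loadings are only weakly identified. One way to look closer, using rethinking's built-in diagnostics (a sketch; the pars argument just selects the worst-mixing coefficients):

precis(flbi, depth = 2)                                       # also show the 1000 latent U parameters
trankplot(flbi, pars = c("k_1", "k_2", "sigma_x", "sigma_y")) # rank histograms; poor mixing shows as uneven ranks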

A do-analysis is an intervention: we set X to a value and propagate it through the DAG using the posterior samples.

post <-  extract.samples(flbi)

# Intervention: set X = 0 and propagate it through the DAG using the posterior
Z_x_0 <- with( post ,  a_z + b_x * 0 )

Y_x_0 <-  with(post, a_y + b_z * Z_x_0 )

# Intervention: set X = 1
Z_x_1 <- with( post ,  a_z + b_x * 1 )

Y_x_1 <-  with(post, a_y + b_z * Z_x_1 )

# difference to get causal effect 
Effect_x_over_y <- Y_x_1 - Y_x_0
quantile(Effect_x_over_y)
        0%        25%        50%        75%       100% 
-0.4188472 -0.3281747 -0.3093903 -0.2907799 -0.2187151
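Because the model is linear, the contrast above reduces to the product of the two path coefficients (Y_x_1 - Y_x_0 = b_z * (Z_x_1 - Z_x_0) = b_z * b_x), so the same effect can be read directly from the posterior:

# Equivalent shortcut under the linear model
quantile(with(post, b_z * b_x))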

Plot the posterior of the effect with its highest-density intervals:

library(ggplot2)  # needed for labs(); plotting bayestestR objects also uses the 'see' package
plot(bayestestR::hdi(Effect_x_over_y, ci = c(0.80, 0.95))) +
    labs(title = "Front door, in bayesian way")
kenkoonwong commented 1 year ago

Interesting! Thanks for sharing. I'll have to take a look!

joscani commented 1 year ago

A small fix: use the same coefficient k_1 for the effect of U on both Y and X, and give it an exponential(1) prior so that k_1 is positive. This specification converges quickly, and Rhat is 1 for all coefficients.



flbi <- ulam(
    alist(
        # Y model: only the arrows pointing directly into Y (Z and U)
        Y ~ normal(mu_y, sigma_y), 
        mu_y <-  a_y + b_z * Z + k_1 * U[i], 

        # Z model 
        Z ~ normal(mu_z, sigma_z), 
        mu_z <- a_z + b_x * X, 

        # X model
        X ~ normal(mu_x, sigma_x), 
        mu_x <-  a_x + k_1 * U[i], 

        # Unmeasured confounder: U is not in the data list, so the model
        # estimates one latent U parameter per observation (1000 here)
        vector[N]:U ~ normal(0,1),

        c(a_y,b_z, a_z, b_x, a_x) ~ normal( 0 , 1 ),
        c(k_1, sigma_y,sigma_z, sigma_x) ~ exponential( 1 )

    ), data=dat , chains=4 , cores=4 , warmup = 500, iter=2500 , cmdstan=TRUE )

precis(flbi)
1000 vector or matrix parameters hidden. Use depth=2 to show them.
         mean   sd  5.5% 94.5% n_eff Rhat4
a_x      0.00 0.04 -0.06  0.05  9417     1
b_x      0.64 0.03  0.60  0.69 13210     1
a_z      0.02 0.03 -0.04  0.07 15549     1
b_z     -0.48 0.04 -0.54 -0.42  2171     1
a_y     -0.01 0.04 -0.07  0.06  8633     1
sigma_x  1.00 0.03  0.94  1.05  1593     1
sigma_z  1.03 0.02  0.99  1.07 18179     1
sigma_y  1.10 0.03  1.05  1.16  2610     1
k_1      0.52 0.05  0.43  0.60  1037     1
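The do-analysis from the earlier comment can be repeated with this better-behaved fit; here is the same linear shortcut as above, as a sketch (no output shown):

post2 <- extract.samples(flbi)
Effect_x_over_y <- with(post2, b_z * b_x)   # posterior of the causal effect of X on Y
quantile(Effect_x_over_y)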