vdorie / dbarts

Discrete Bayesian Additive Regression Trees Sampler

rngSeed doesn't work for binary BART with multiple threads #39

Open ngreifer opened 3 years ago

ngreifer commented 3 years ago

Hi Vincent. Thank you so much for taking measures to make dbarts more reproducible. Unfortunately, I found a bug where setting the seed doesn't seem to work with binary outcomes when using multiple threads. In the example below, yc is a continuous outcome and yb is a binary outcome. Using the same value for rngSeed in the call to bart2() yields the same predictions with yc, but not with yb. Is this something you can fix? Thanks!

library("dbarts")
packageVersion("dbarts")
#> [1] '0.9.19'

set.seed(12120)
x <- rnorm(1000)
yc <- rnorm(1000)
yb <- rbinom(1000, 1, .3)

d <- data.frame(x, yc, yb)

bc1 <- bart2(yc ~ x, data = d, n.chains = 2, n.threads = 2, n.trees = 5,
             rngSeed = 1234, verbose = FALSE)
bc2 <- bart2(yc ~ x, data = d, n.chains = 2, n.threads = 2, n.trees = 5,
             rngSeed = 1234, verbose = FALSE)
all.equal(bc1$yhat.train, bc2$yhat.train)
#> [1] TRUE

bb1 <- bart2(yb ~ x, data = d, n.chains = 2, n.threads = 2, n.trees = 5,
             rngSeed = 1234, verbose = FALSE)
bb2 <- bart2(yb ~ x, data = d, n.chains = 2, n.threads = 2, n.trees = 5,
             rngSeed = 1234, verbose = FALSE)
all.equal(bb1$yhat.train, bb2$yhat.train)
#> [1] "Mean relative difference: 0.185394"

Created on 2021-01-21 by the reprex package (v0.3.0)

vdorie commented 3 years ago

Thanks! I did something stupid with how k is tracked when it has a hyperprior. I'll try to get a fix checked in in the next few days.

vdorie commented 3 years ago

Hopefully fixed in 56fa57871c47c8fe76cf126e4392962341c83292. It'll probably be a bit before I can push to CRAN, since I just submitted earlier this month.
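For anyone hitting this before the patched version reaches CRAN, one option (assuming the fix is on the repository's default branch and the remotes package is installed) is to install the development version directly from GitHub:

```r
# Sketch of a workaround until the fix is on CRAN: install the
# development version of dbarts from GitHub. Assumes the `remotes`
# package is available and the default branch includes the commit above.
remotes::install_github("vdorie/dbarts")
```

After restarting R and reloading the package, rerunning the binary-outcome reprex above should then return TRUE from all.equal().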

bachlaw commented 3 years ago

Vincent, I came here to report this same issue, but after installing the updated version I can confirm that results are now reproducible for a binary outcome with multiple threads. Thanks very much.