richarddmorey / BayesFactor

BayesFactor R package for Bayesian data analysis with common statistical models.
https://richarddmorey.github.io/BayesFactor/

Inf values in ttest.tstat #42

Closed: nicebread closed this issue 9 years ago

nicebread commented 9 years ago

ttest.tstat(1000, 100, 100) returns -Inf.

Wouldn't +Inf be more appropriate, as the BF is very large in this case?

richarddmorey commented 9 years ago

This is a problem with integrate() incorrectly returning 0 for the marginal likelihood integral. I'm not sure why. This code, for instance, fails when integrating from -Inf to Inf, but not with a "reasonable" range of integration:

t = 466                       # t statistic at which integrate() starts to fail
N = 50                        # sample size term in the noncentrality parameter
df = 198                      # degrees of freedom
rscale = sqrt(2)/2            # scale of the Cauchy prior on delta
delta = seq(50, 80, len=100)  # grid covering the integrand's mass, for plotting
log.const = -13.94309         # constant subtracted on the log scale to rescale the integrand

# integrand: noncentral-t likelihood times Cauchy prior, vectorized in delta
f = Vectorize(function(delta, t, N, df, rscale, log.const=0)
   exp(dt(t, df, ncp=delta*sqrt(N), log=TRUE) + dcauchy(delta, scale=rscale, log=TRUE) - log.const),
   "delta")
plot(delta, f(delta, t, N, df, rscale, log.const), type='l')

# fails: reports an integral of 0
integrate(f, -Inf, Inf, t = t, N = N,
        df = df, rscale = rscale, log.const = log.const)

# succeeds over a range that brackets the mass
integrate(f, 50, 80, t = t, N = N,
        df = df, rscale = rscale, log.const = log.const)

whereas if you set t=465, the integral works both ways. I might have to ask about this on R-help or something. The function appears perfectly well behaved in both cases, at least to me.
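One way to see what integrate() is likely up against (a quick diagnostic sketch, reusing f and the constants above): over an infinite interval, integrate() transforms the range before applying adaptive quadrature, and it can miss an integrand whose mass is a narrow spike far from 0, reporting 0 with a small estimated error. Here the spike sits near delta = t/sqrt(N), well away from the origin:

# diagnostic sketch: the noncentral-t factor peaks roughly where the
# noncentrality parameter delta*sqrt(N) matches the observed t,
# i.e. near delta = t/sqrt(N), far from 0
delta.peak = t / sqrt(N)                    # about 65.9 for t = 466, N = 50
f(delta.peak, t, N, df, rscale, log.const)  # large: the spike is here
f(0, t, N, df, rscale, log.const)           # essentially 0 near the origin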

Edit: I asked over on Stack Overflow: http://stackoverflow.com/questions/27297339/problems-with-integrate-returning-integral-of-0

richarddmorey commented 9 years ago

I fixed this. Shifting the function so that its mass sits around 0 works well. The fix will be in the next version.
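
A minimal sketch of that shifting idea (the actual fix in the package may differ in detail): substitute z = delta - delta0, with delta0 = t/sqrt(N) approximating the spike's location, so the mass sits near 0, where integrate() handles infinite limits well:

# change of variables: the integral of f(delta) over delta equals the
# integral of f(z + delta0) over z, but the shifted spike now sits at z = 0
delta0 = t / sqrt(N)
g = function(z) f(z + delta0, t, N, df, rscale, log.const)
integrate(g, -Inf, Inf)  # now recovers the mass the unshifted call missed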