stephens999 / ashr

An R package for adaptive shrinkage
GNU General Public License v3.0

postsd much slower than postmean #134

Open william-denault opened 2 years ago

william-denault commented 2 years ago

Hello,

I was profiling some code, and I noticed that postsd was much slower than postmean (for normal mixture), which seems strange, given how these quantities are computed. Please find a minimal example below:

```r
library(ashr)
library(microbenchmark)

# Fit the adaptive shrinkage model with normal mixture components
Bhat <- rnorm(10000)
Shat <- runif(10000)
out  <- ash(Bhat, Shat, mixcompdist = "normal")

# New data and a data object on which to evaluate the posterior summaries
Bhat <- rnorm(10000)
Shat <- runif(10000)
m    <- set_data(Bhat, Shat)

microbenchmark(
  postmean = postmean(get_fitted_g(out), m),
  postsd   = postsd(get_fitted_g(out), m)
)
```
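For reference, here is a hand-rolled sketch (my own, not ashr's internal code) of how both quantities can be computed for a zero-centred normal-mixture prior; since the posterior sd reuses essentially all of the work needed for the posterior mean, I expected similar runtimes:

```r
# Sketch only (not ashr's implementation): prior g = sum_k pi_k N(0, sigma2_k),
# likelihood Bhat_i ~ N(b_i, Shat_i^2). Each component posterior is normal, so
# the posterior sd needs only the second moments on top of the mean.
post_moments_sketch <- function(pi_k, sigma2_k, Bhat, Shat) {
  s2 <- Shat^2
  # posterior responsibilities: pi_k * N(Bhat; 0, sigma2_k + s2), row-normalized
  loglik <- sapply(seq_along(pi_k), function(k)
    dnorm(Bhat, 0, sqrt(sigma2_k[k] + s2), log = TRUE) + log(pi_k[k]))
  w <- exp(loglik - apply(loglik, 1, max))
  w <- w / rowSums(w)
  # component-wise posterior variance and mean (sigma2_k = 0 gives Vk = 0 via 1/0 = Inf)
  Vk <- sapply(sigma2_k, function(v) 1 / (1 / s2 + 1 / v))
  Mk <- Vk * (Bhat / s2)
  pm  <- rowSums(w * Mk)              # posterior mean
  pm2 <- rowSums(w * (Vk + Mk^2))     # posterior second moment
  psd <- sqrt(pmax(pm2 - pm^2, 0))    # posterior sd
  list(mean = pm, sd = psd)
}
```

If the fitted `normalmix` object exposes `$pi` and `$sd` (and all components are centred at zero), this can be checked against `postmean`/`postsd` directly, e.g. `post_moments_sketch(g$pi, g$sd^2, Bhat, Shat)` with `g <- get_fitted_g(out)`.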

pcarbo commented 2 years ago

@william-denault Could you please share your benchmarking results?

william-denault commented 2 years ago

```
     expr      min       lq     mean   median       uq      max neval
 postmean 102.7708 113.6098 134.5917 117.8156 130.2786 294.1863   100
   postsd 225.2152 238.8804 277.1865 252.2738 325.9447 420.9524   100
```

pcarbo commented 2 years ago

Yes, indeed it is about 2–3x slower. I don't see that as being a big problem, do you? Were there other settings where the difference in runtime was much larger?

william-denault commented 2 years ago

I was profiling fsusie, which calls these two functions heavily at every iteration (2^S x number of covariates calls), so I thought this could be a place to gain speed. (I will have a look at the counterparts of postsd and postmean in ebnm, as Matthew suggested to me.)

pcarbo commented 2 years ago

Okay, let us know what you find out.

william-denault commented 2 years ago

I have run some benchmark comparisons, and ebnm is actually about 1.5x slower than ash when computing posterior quantities. I see the same pattern (the posterior sd computation being slower than the posterior mean). Furthermore, it seems that running ash with `outputlevel = 0` is almost as fast as ebnm with `prior_family = "normal_scale_mixture"`. To be fair, ebnm uses more mixture components than ash.
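For what it's worth, a sketch of that kind of comparison (assuming the `ebnm::ebnm(x, s, prior_family = ...)` interface; not necessarily the exact calls I ran) would look like:

```r
# Sketch of the ash-vs-ebnm timing comparison described above.
library(ashr)
library(ebnm)
library(microbenchmark)

set.seed(1)
Bhat <- rnorm(10000)
Shat <- runif(10000)

microbenchmark(
  ash_fit_only = ash(Bhat, Shat, mixcompdist = "normal", outputlevel = 0),
  ebnm_nsm     = ebnm(Bhat, Shat, prior_family = "normal_scale_mixture"),
  times = 20
)
```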

pcarbo commented 2 years ago

That's helpful to know, thanks William!