gamlss-dev / gamlss

gamlss: Generalized Additive Models for Location Scale and Shape
https://CRAN.R-project.org/package=gamlss

Mean of generalized gamma distribution #12

Open mr-infty opened 4 months ago

mr-infty commented 4 months ago

Problem

The documentation of GG claims that the mean of the distribution with parameters (\mu, \sigma, \nu) is equal to \mu, but that does not appear to be true.

Details

The formula for the probability density given in the documentation matches up with the generalized Gamma distribution defined in https://arxiv.org/pdf/1005.3274, under the following mapping of parameters:

(\mu, \sigma, \nu) \mapsto (\theta = \mu (\sigma^2 \nu^2)^{1 / \nu}, \alpha = 1/(\sigma^2 \nu^2), \beta = \nu).

The formula for the mean given in loc. cit. then shows that the mean of the distribution is

\theta \frac{\Gamma(\alpha + 1/\beta)}{\Gamma(\alpha)}

which does not appear to equal \mu unless \nu = 1.

Numerical evaluations seem to support this claim.
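
For instance (a minimal sketch using dGG() from gamlss.dist and numerical integration; the values mu = 1, sigma = 1.5, nu = 2 are arbitrary, any nu != 1 shows the effect):

library("gamlss.dist")

mu <- 1; sigma <- 1.5; nu <- 2

## E(Y) by numerical integration of y * dGG(y)
integrate(function(y) y * dGG(y, mu = mu, sigma = sigma, nu = nu), 0, Inf)$value

## mean implied by the mapping above:
## theta = mu * (sigma^2 * nu^2)^(1/nu), alpha = 1/(sigma^2 * nu^2), beta = nu
theta <- mu * (sigma^2 * nu^2)^(1/nu)
alpha <- 1 / (sigma^2 * nu^2)
theta * gamma(alpha + 1/nu) / gamma(alpha)

Both give the same value (roughly 0.52 for these parameters), which is clearly not mu = 1.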

zeileis commented 4 months ago

I agree. The \description{} in the Rd file (in gamlss.dist) seems to be incorrect and too simple:

https://github.com/gamlss-dev/gamlss.dist/blob/main/man/GG.Rd#L12-L16

The actual code for both $mean and $variance in the family is much more involved:

https://github.com/gamlss-dev/gamlss.dist/blob/main/R/GG.R#L89-L94
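
For reference, here is a self-contained sketch of the moments implied by the mapping above (not a copy of the package source; gg_mean() and gg_var() are ad-hoc names), with theta = 1/(sigma^2 * nu^2):

gg_mean <- function(mu, sigma, nu) {
  theta <- 1 / (sigma^2 * nu^2)
  mu * gamma(theta + 1/nu) / (gamma(theta) * theta^(1/nu))
}

gg_var <- function(mu, sigma, nu) {
  theta <- 1 / (sigma^2 * nu^2)
  ## Var(Y) = E(Y^2) - E(Y)^2 with E(Y^k) = mu^k * gamma(theta + k/nu) / (gamma(theta) * theta^(k/nu))
  mu^2 * (gamma(theta + 2/nu) / (gamma(theta) * theta^(2/nu)) -
          (gamma(theta + 1/nu) / (gamma(theta) * theta^(1/nu)))^2)
}

For nu = 1 the mean collapses to mu, as the current \description{} suggests, but not in general.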

These computations appear to be in sync with empirical estimates based on simulated data. Consider the following three theoretical distributions:

library("distributions3")
library("gamlss.dist")
d <- GAMLSS("GG", mu = c(0.5, 1, 2), sigma = c(1.5, 1, 0.5), nu = c(2, 0.5, 1))
cbind(mean(d), sqrt(variance(d)))
##           [,1]      [,2]
## [1,] 0.2577079 0.4284701
## [2,] 1.2500000 1.3110111
## [3,] 2.0000000 1.0000000

And the following simulations from them:

set.seed(0)
y <- random(d, 10000)
cbind(apply(y, 1, mean), apply(y, 1, sd))
##           [,1]      [,2]
## [1,] 0.2526009 0.4211723
## [2,] 1.2449258 1.2982611
## [3,] 2.0047618 1.0025138

These seem to agree reasonably well.
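
And, for what it's worth, the ad-hoc gg_mean()/gg_var() sketch from above should reproduce the same theoretical moments:

cbind(gg_mean(c(0.5, 1, 2), c(1.5, 1, 0.5), c(2, 0.5, 1)),
      sqrt(gg_var(c(0.5, 1, 2), c(1.5, 1, 0.5), c(2, 0.5, 1))))
## should match cbind(mean(d), sqrt(variance(d))) above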