zhenglei-gao / KineticEval

An R package for kinetic evaluations

Gerald's Case: DFOP - nonsense CIs #27

Open zhenglei-gao opened 10 years ago

zhenglei-gao commented 10 years ago

Gill PE, Murray W, Wright MH (1981). Practical Optimization. Academic Press.

zhenglei-gao commented 10 years ago

The asymptotic theory behind the formula for the standard error breaks down when parameters sit at boundaries. It assumes that you are minimizing the negative log-likelihood, that the optimum is in the interior of the region, and that the log-likelihood is sufficiently close to parabolic that the distribution of the maximum likelihood estimates (MLEs) is adequately approximated by a second-order Taylor series expansion about the MLEs. In this case, transforming the parameters will not solve the problem: if the maximum is at a boundary and you send that boundary to infinity with a transformation, then the second-order Taylor expansion of the log-likelihood about the MLEs is locally flat in some direction(s), so the Hessian cannot be inverted.
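To make the flat-Hessian point concrete, here is a toy sketch (in Python, with a made-up one-parameter objective whose minimum sits at the boundary k = 0). After the transformation k = exp(q), the boundary moves to q -> -inf, and the curvature of the transformed objective shrinks toward zero there, so a finite-difference "Hessian" becomes numerically singular and yields no usable standard error:

```python
import numpy as np

# Toy objective: minimized at the boundary k = 0 (hypothetical example,
# not the KineticEval objective).
def negloglik(k):
    return k

def negloglik_q(q):
    return negloglik(np.exp(q))  # reparameterize k = exp(q)

def second_diff(f, x, h=1e-4):
    # central second-order finite difference
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

# As q moves toward -inf (k toward the boundary 0), the curvature
# collapses toward 0, so it cannot be inverted to get a s.e.
curvatures = [second_diff(negloglik_q, q) for q in (0.0, -5.0, -10.0)]
print(curvatures)
```

The same collapse happens direction-by-direction in a multi-parameter model, which is why the full Hessian loses rank when any estimate runs to a (transformed) boundary.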

These days, the experts typically approach problems like this using Monte Carlo, often in the form of Markov Chain Monte Carlo (MCMC). One example of an analysis of this type of problem appears in section 2.4 of Pinheiro and Bates (2000) Mixed-Effects Models in S and S-Plus (Springer). https://stat.ethz.ch/pipermail/r-help/2008-June/165928.html

zhenglei-gao commented 10 years ago

There is really no way to get around this problem apart from having a good initial guess.

zhenglei-gao commented 10 years ago

http://cowles.econ.yale.edu/P/cp/p09b/p0988.pdf

Estimation When a Parameter Is on a Boundary

zhenglei-gao commented 10 years ago

Standard errors can be incorrect when a parameter estimate lies on a boundary.

zhenglei-gao commented 10 years ago

https://groups.google.com/forum/#!topic/comp.soft-sys.matlab/7luxU61mjVk

Or, if your likelihood was Gaussian, and you were just solving a nonlinear least squares problem, then it would be straightforward to compute the Jacobian (matrix of first derivatives) by finite differencing and then use the approximation H=J'*J. I would recommend against trying to compute the second derivatives directly by second-order finite differencing; in my experience it just doesn't work well in practice.
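A minimal sketch of the H = J'*J idea (in Python rather than MATLAB or R, with a made-up single-exponential model and simulated data; for simplicity the "converged" estimate is just taken at the true values):

```python
import numpy as np

# Gauss-Newton approximation for nonlinear least squares: build the
# residual Jacobian J by forward finite differences, then use
# cov(theta) ~ s^2 * (J'J)^{-1} for standard errors.

def model(theta, t):
    A, k = theta                      # hypothetical first-order decay
    return A * np.exp(-k * t)

def residuals(theta, t, y):
    return y - model(theta, t)

def fd_jacobian(theta, t, y, h=1e-6):
    r0 = residuals(theta, t, y)
    J = np.empty((len(r0), len(theta)))
    for j in range(len(theta)):
        step = np.zeros_like(theta)
        step[j] = h
        J[:, j] = (residuals(theta + step, t, y) - r0) / h
    return J

t = np.linspace(0, 10, 50)
rng = np.random.default_rng(0)
theta_true = np.array([100.0, 0.3])
y = model(theta_true, t) + rng.normal(0, 1.0, t.size)

theta_hat = theta_true                # stand-in for the fitted estimate
J = fd_jacobian(theta_hat, t, y)
r = residuals(theta_hat, t, y)
s2 = r @ r / (len(t) - len(theta_hat))   # residual variance
cov = s2 * np.linalg.inv(J.T @ J)        # approximate covariance
se = np.sqrt(np.diag(cov))               # standard errors
print(se)
```

Note that this inherits the same failure mode discussed above: if the estimate sits on a boundary, J'J is (near-)singular and the "standard errors" are nonsense.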

Another approach that might be more practical would be to sample from your likelihood (or more generally posterior) distribution using Markov Chain Monte Carlo methods. If the likelihood can be computed relatively quickly, then this can be a very effective technique.
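As an illustration of the MCMC route (plain Python, toy single-rate model, hand-rolled random-walk Metropolis rather than any particular package): interval estimates come from quantiles of the draws, which stay meaningful even when the parameter is near a boundary.

```python
import numpy as np

# Random-walk Metropolis sketch: sample a rate constant k >= 0 from a
# Gaussian-error likelihood and take percentile intervals of the draws.

def model(k, t):
    return 100.0 * np.exp(-k * t)   # amplitude fixed for simplicity

def loglik(k, t, y, sigma=1.0):
    if k < 0:                       # enforce the boundary constraint
        return -np.inf
    r = y - model(k, t)
    return -0.5 * np.sum(r**2) / sigma**2

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 50)
y = model(0.3, t) + rng.normal(0, 1.0, t.size)

k, ll = 0.3, loglik(0.3, t, y)
draws = []
for _ in range(5000):
    prop = k + rng.normal(0, 0.005)      # random-walk proposal
    ll_prop = loglik(prop, t, y)
    if np.log(rng.uniform()) < ll_prop - ll:
        k, ll = prop, ll_prop            # Metropolis accept step
    draws.append(k)

lo, hi = np.quantile(draws[1000:], [0.025, 0.975])  # drop burn-in
print(lo, hi)
```

The proposal scale and burn-in length here are guesses; in practice both need tuning (or an adaptive sampler), and convergence should be checked before the quantiles are trusted.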

A third approach that you might consider is building a quadratic metamodel of the likelihood near the optimal parameter values.
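One way the quadratic-metamodel idea can be sketched (Python, a toy two-parameter objective with a known Hessian): evaluate the objective at random points near the optimum, fit a full quadratic surface by least squares, and read the curvature off the fitted coefficients.

```python
import numpy as np

# Quadratic metamodel: fit f ~ c + g'x + 0.5 x'Hx near the optimum and
# recover H from the regression coefficients.

def f(x):                      # toy objective with Hessian diag(2, 8)
    return x[0]**2 + 4.0 * x[1]**2

rng = np.random.default_rng(2)
X = rng.normal(0, 0.1, (200, 2))          # design points near the optimum (0, 0)
y = np.array([f(x) for x in X])

# regression basis: 1, x1, x2, x1^2, x2^2, x1*x2
B = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)

# for a*x1^2 + b*x2^2 + c*x1*x2: H11 = 2a, H22 = 2b, H12 = H21 = c
H = np.array([[2 * coef[3], coef[5]],
              [coef[5], 2 * coef[4]]])
print(H)
```

If f is a negative log-likelihood, inverting the recovered H gives an approximate covariance matrix; like the other curvature-based approaches, this only helps when the optimum is interior, since a boundary optimum leaves H (near-)singular.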

zhenglei-gao commented 10 years ago

http://sci.tech-archive.net/Archive/sci.math.num-analysis/2005-07/msg00024.html