Closed mdaeron closed 3 years ago
@mdaeron Please explain why this is a bug in the code, or close this issue and start a conversation on the mailing list, as (really, no kidding) clearly instructed.
It seems to me that, since you linked to a question on the same topic on Stack Overflow, you know that this is a question for discussion and clarification, which is exactly the situation for which we say "start a conversation on the mailing list, not here as an Issue in the libraries bug tracker". It is hard to believe that you did not understand this.
We've said this really so many times and in so many different ways that it is a HUGE demotivator when people refuse (yes, I do mean that word to convey a deliberate decision) to understand this.
Sure, I suppose we could just ignore all of our recommendations for how to ask questions. This is free software written and maintained entirely by volunteers who have other jobs and interests. We have maintained and documented the library and gone out of our way to set up forums to try to help people use these tools. You sort of ignored all of that.
@newville When I asked the question on SE, I was unsure whether this was a bug. Since then, I have discovered that `least_squares` provides the correct values. Because `least_squares` and `leastsq` appear (going by the lmfit documentation) to use the same approach to compute the variance-covariance matrix, the fact that the latter yields inaccurate results very reasonably suggests that this is indeed a bug, so I opened the present issue (adding the comparison between methods). If, on the other hand, this is an (undocumented) feature, or if you're just not interested, please just close the issue.
At any rate, I stand by my initial bug report. YMMV, of course. Volunteers (who have other jobs and interests) reporting bugs is, in my opinion, a feature of FOSS.
Thanks again for `lmfit`, and have a nice day.
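For reference, with `scale_covar=False` both methods should converge to the same unscaled variance-covariance matrix, (JᵀJ)⁻¹ with J the Jacobian of the model. For the toy straight-line problem used in this report, that matrix (and hence the expected OLS standard errors) can be computed directly; a minimal sketch with NumPy:

```python
import numpy as np

# Design matrix for the straight line y = a*x + b with x = [0, 1, 2]
X = np.array([0.0, 1.0, 2.0])
J = np.column_stack([X, np.ones_like(X)])  # columns: d(model)/da, d(model)/db

# Unscaled covariance, assuming unit-weight residuals: (J^T J)^{-1}
cov = np.linalg.inv(J.T @ J)
stderr = np.sqrt(np.diag(cov))
print(stderr)  # → [0.70710678 0.91287093]
```

These are the values the fit should report for the slope and intercept regardless of how small chi-square is, which is what `least_squares` returns and `leastsq` does not.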
First Time Issue Code
Yes, I read the instructions and I am sure this is a GitHub Issue.
Description
Standard errors for the best-fit parameters reported by `lmfit.minimize()` using `method='leastsq'` and `scale_covar=False` are inaccurate when chi-square is very close to zero (i.e., when the model fits the data almost perfectly). Below I perform a straight-line regression of `Y = [0,1,2]` against `X = [0,1,2]`, adding small, increasing amounts of (pseudo-)noise `f` to the data. Only when the noise is large enough are the reported uncertainties equal to those expected from ordinary least squares.
By contrast, using `method='least_squares'` yields the correct standard errors. As far as I know, the Levenberg-Marquardt (`'leastsq'`) and Trust Region Reflective (`'least_squares'`) methods differ in how they approach the chi-square minimum, but should use the same underlying variance-covariance calculation after that. I thus suspect there is a bug in the `leastsq` method's code.
A Minimal, Complete, and Verifiable example
Output:
Version information
Link(s)
https://stackoverflow.com/questions/66883326/inaccurate-parameter-uncertainties-from-lmfit-when-chi-square-is-close-to-zero