-
`scalar_minimize` does not calculate parameter uncertainties (Related to #169). As such, I would expect the `stderr` and `correl` attributes to be `None` if I do a fit using that method. Indeed, a `P…
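For context, derivative-free scalar minimizers return no curvature information, so there is nothing to turn into `stderr`/`correl` without extra work. A minimal sketch using plain `scipy.optimize` (the objective and values here are illustrative, not from the report):

```python
import numpy as np
from scipy.optimize import minimize

# A simple quadratic "chi-square" surface with its minimum at (1, -2).
def chi2(p):
    return (p[0] - 1.0) ** 2 + 3.0 * (p[1] + 2.0) ** 2

res_nm = minimize(chi2, x0=[0.0, 0.0], method="Nelder-Mead")
res_bfgs = minimize(chi2, x0=[0.0, 0.0], method="BFGS")

# The simplex result carries no Hessian information, so there is nothing
# from which to derive parameter uncertainties; BFGS, by contrast,
# returns an inverse-Hessian estimate on its result object.
print(hasattr(res_nm, "hess_inv"))    # False
print(hasattr(res_bfgs, "hess_inv"))  # True
```

This is why a wrapper would have to estimate a numerical Hessian itself before it could populate `stderr` for simplex-type fits.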
-
In `scipy.optimize.basinhopping`, if one specifies
`minimizer_kwargs["method"] = "anneal"`
and furthermore defines `accept_test` and `take_step` as needed, the resultant new sample draws of `x` make no re…
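For what it may be worth, `"anneal"` is not a valid *local* minimizer name for `minimizer_kwargs` (simulated annealing was a standalone scipy routine, not a `minimize` method), which could contribute to the odd draws. A minimal sketch of `basinhopping` with custom `take_step` and `accept_test` hooks and a valid local method; the objective and bounds here are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import basinhopping

def f(x):
    # 1-D test function with several local minima
    return np.cos(14.5 * x[0] - 0.3) + (x[0] + 0.2) * x[0]

class RandomStep:
    """Custom take_step: uniform random displacement of fixed size."""
    def __init__(self, stepsize=0.5, seed=0):
        self.stepsize = stepsize
        self.rng = np.random.default_rng(seed)

    def __call__(self, x):
        return x + self.rng.uniform(-self.stepsize, self.stepsize, x.shape)

def accept_test(f_new=None, x_new=None, f_old=None, x_old=None):
    # Reject any candidate that wanders outside [-2, 2]
    return bool(np.all(np.abs(x_new) <= 2.0))

res = basinhopping(f, x0=[1.0],
                   minimizer_kwargs={"method": "L-BFGS-B"},
                   take_step=RandomStep(),
                   accept_test=accept_test,
                   niter=100)
print(res.x, res.fun)
```

With a valid local method, each hop is `take_step`, then a local minimization, then `accept_test` on the minimized point, so the accepted samples track the function's basins rather than wandering arbitrarily.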
-
I have just run `optimize` with the Nelder-Mead method on a simple function with one maximum, and it yielded a wrong result. Code, with output compared against L-BFGS, below:
``` julia
using Optim
funct…
```
ahdxb updated
10 years ago
-
Currently we have to write a `require` for each method:
``` ruby
require 'minimization'
require 'nelder_mead'
require 'powell'
Minimization::Nelder_Mead.new()
Minimization::PowellMinimizer.new()
...
```
But…
-
I needed to make three minor fixes to lmfit, listed below, to get scalar minimization with the conjugate-gradient (cg) algorithm working. All changes are to `lmfit/minimizer.py`:
#1. Rationale: The w…
sdh4 updated
10 years ago
-
I just tried the following code from the help doc:
``` julia
using Optim
f(x) = 2x^2+3x+1
optimize(f, -2.0, 1.0)
```
and got the following error:
```
no method optimize(Function,Float64,Float64)
```
passin…
-
(Previously: **"Lambdax size mismatch" when profiling on flexLambda branch**)
This may already be known, but:
``` r
library(lme4)
fm1 fm1
```
-
Hi there,
great work. This interface really improves the handling of `scipy.optimize`.
The issue:
When passing `method="nelder"` to `minimize`, a least-squares fit is performed instead.
This is due to the code fragm…
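The truncated fragment presumably concerns how `minimize` maps the method string onto a backend. A hypothetical sketch of such alias-based dispatch (the table and function names are assumptions for illustration, not lmfit's actual code), where an over-eager fall-through to `leastsq` would reproduce the reported behavior:

```python
# Hypothetical alias table mapping user-facing method names to
# scipy.optimize.minimize solver names (assumed, not lmfit's code).
SCALAR_METHODS = {
    "nelder": "Nelder-Mead",
    "powell": "Powell",
    "cg": "CG",
    "bfgs": "BFGS",
}

def dispatch(method):
    """Return which backend a given method string should select."""
    key = method.lower()
    if key in SCALAR_METHODS:
        return "scalar_minimize:" + SCALAR_METHODS[key]
    # Default fall-through: anything unrecognized becomes a
    # least-squares fit; matching too eagerly before this point
    # (or not matching "nelder" at all) reproduces the reported bug.
    return "leastsq"

print(dispatch("nelder"))   # scalar_minimize:Nelder-Mead
print(dispatch("leastsq"))  # leastsq
```

The fix in a real wrapper is simply to ensure the scalar-method branch is checked (case-insensitively) before any least-squares default.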
ghost updated
10 years ago
-
e.g.
``` r
x = rgk(20, 3, 1, 2, 0.5)
gk.mle(x, theta0 = rep(5, 4))
```