-
I was checking the implementation of the `NelderMeadSimplex` algorithm while looking at a few different code implementations in order to better understand the algorithm, and noticed a few discrepancies…
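For reference, here is a minimal sketch of the standard Nelder-Mead update, using the common textbook coefficients (reflection α=1, expansion γ=2, contraction ρ=0.5, shrink σ=0.5). Implementations differ mainly in these coefficients, in tie-breaking, and in convergence tests, which can account for such discrepancies; the initial-simplex construction below is one illustrative choice, not any particular library's.

```python
import numpy as np

def nelder_mead(f, x0, steps=200, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimal Nelder-Mead with the common textbook coefficients."""
    n = len(x0)
    # Initial simplex: x0 plus a small perturbation along each axis.
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):
        v = simplex[0].copy()
        v[i] += 0.05 if v[i] == 0 else 0.05 * v[i]
        simplex.append(v)
    for _ in range(steps):
        simplex.sort(key=f)                       # best first, worst last
        centroid = np.mean(simplex[:-1], axis=0)  # centroid of all but worst
        worst = simplex[-1]
        xr = centroid + alpha * (centroid - worst)      # reflection
        if f(simplex[0]) <= f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        elif f(xr) < f(simplex[0]):                     # expansion
            xe = centroid + gamma * (xr - centroid)
            simplex[-1] = xe if f(xe) < f(xr) else xr
        else:                                           # contraction
            xc = centroid + rho * (worst - centroid)
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:                                       # shrink toward best
                simplex = [simplex[0]] + [simplex[0] + sigma * (v - simplex[0])
                                          for v in simplex[1:]]
    return min(simplex, key=f)

# Quick check on a shifted quadratic; minimum at (3, -1).
xmin = nelder_mead(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2, [0.0, 0.0])
```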
-
The default algorithm of the Optim minimizer is `NelderMead()`, which is a gradient-free method. I'd like to explore gradient-based algorithms, such as `GradientDescent()` or `LBFGS()`.
Motivation:…
-
Various time series where Nelder-Mead (NM) ends up in a singular matrix
-
In `ETSModel.fit()`, the optimization method is hardcoded as "lbfgs": https://github.com/statsmodels/statsmodels/blob/main/statsmodels/tsa/exponential_smoothing/ets.py#L1068
Could it be parameteriz…
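A sketch of what such a parameterization might look like (the wrapper name and signature here are hypothetical, not the statsmodels API): the method string would simply be exposed as an argument and threaded down to `scipy.optimize.minimize` instead of being hardcoded.

```python
from scipy.optimize import minimize

def fit_params(loss, start_params, method="lbfgs", **minimize_kwargs):
    """Hypothetical wrapper: 'method' is exposed instead of being hardcoded.

    SciPy spells the quasi-Newton method 'L-BFGS-B', so a small alias map
    keeps the familiar lowercase name working.
    """
    scipy_method = {"lbfgs": "L-BFGS-B"}.get(method, method)
    return minimize(loss, start_params, method=scipy_method, **minimize_kwargs)

# Default (lbfgs) and an alternative method on a toy 1-D loss with minimum at 2.
res = fit_params(lambda p: (p[0] - 2.0) ** 2, [0.0])
res2 = fit_params(lambda p: (p[0] - 2.0) ** 2, [0.0], method="Nelder-Mead")
```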
-
A nice suggestion by Simon Gascoin (@sgascoin): an example of a very simple algorithm that finds a horizontal shift between two DEMs by minimizing the NMAD. A good alternative to the Nuth & Kääb method.
`…
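A sketch of the idea under stated assumptions (this is not @sgascoin's code; the function names and the synthetic test surface are illustrative): shift one DEM by a candidate (row, col) offset, compute the NMAD of the elevation difference, and let a derivative-free optimizer minimize it.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from scipy.optimize import minimize

def nmad(x):
    """Normalized median absolute deviation, a robust spread estimate."""
    x = x[np.isfinite(x)]
    return 1.4826 * np.median(np.abs(x - np.median(x)))

def coregister_shift(ref_dem, tba_dem):
    """Estimate the (row, col) shift of tba_dem minimizing the NMAD of
    the elevation difference against ref_dem."""
    def cost(p):
        return nmad(ref_dem - nd_shift(tba_dem, p, order=1, mode="nearest"))
    # Unit-scale starting simplex so Nelder-Mead explores pixel-sized moves.
    sim = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    res = minimize(cost, x0=[0.0, 0.0], method="Nelder-Mead",
                   options={"initial_simplex": sim})
    return res.x

# Synthetic check: a smooth surface shifted by a known sub-pixel amount.
yy, xx = np.mgrid[0:80, 0:80]
ref = np.sin(xx / 7.0) + np.cos(yy / 9.0)
tba = nd_shift(ref, (-1.5, 2.0), order=1, mode="nearest")
est = coregister_shift(ref, tba)  # expect roughly (1.5, -2.0)
```

Because NMAD is median-based, the estimate is robust to the boundary pixels that the shift interpolation cannot fill correctly.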
-
Currently, optimas provides BO/MF-BO, random and grid sampling generators:
https://optimas.readthedocs.io/en/latest/api/generators.html
For many optimization jobs, it would also be helpful to have…
-
Nelder-Mead optimization algorithm. Level 2 sub-optimization of the main bi-level optimization. Minimizes the loss value from Lev2_Loss_Function.py and uses Lin_Solver.py or Nonlin_Solver.py based on IsoSelec…
-
Calling `set_all_parameterizations` to convert a full TP model to a GLND model changes some outcome probabilities non-trivially.
**To reproduce:**
`from pygsti.modelpacks import smq1Q_XY as std
da…
-
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    """The Rosenbrock function"""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)

def rosen_der(x):
…
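A self-contained sketch of how an example like this typically continues: the gradient below is derived term by term from the `rosen` definition above (each summand `i` contributes to the derivatives w.r.t. `x[i]` and `x[i+1]`), and the starting point is illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    """The Rosenbrock function."""
    return sum(100.0 * (x[1:] - x[:-1]**2.0)**2.0 + (1 - x[:-1])**2.0)

def rosen_grad(x):
    """Analytic gradient of rosen, derived from the sum above."""
    g = np.zeros_like(x)
    # d/dx[i] of term i:   -400*x[i]*(x[i+1]-x[i]^2) - 2*(1-x[i])
    g[:-1] += -400.0 * x[:-1] * (x[1:] - x[:-1]**2) - 2.0 * (1.0 - x[:-1])
    # d/dx[i+1] of term i:  200*(x[i+1]-x[i]^2)
    g[1:] += 200.0 * (x[1:] - x[:-1]**2)
    return g

# Gradient-based minimization; the global minimum is at (1, 1, ..., 1).
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, jac=rosen_grad, method="BFGS")
```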
-
**Is your feature request related to a problem? Please describe.**
This is a feature request for CMA-ES support in `scipy.optimize.minimize()`.
https://docs.scipy.org/doc/scipy/reference/generated…
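To illustrate the kind of ask-evaluate-update loop such support would wrap, here is a deliberately simplified (μ/μ_w, λ) evolution strategy with fixed isotropic sampling and a crude geometric step-size decay. This is not CMA-ES (no covariance adaptation, no cumulative step-size control); it is only a toy sketch of the algorithm family, with all names and hyperparameters chosen for illustration.

```python
import numpy as np

def simple_es(f, x0, sigma=0.5, lam=16, iters=150, seed=0):
    """Toy (mu/mu_w, lambda) evolution strategy: isotropic sampling,
    weighted recombination of the best half, geometric sigma decay."""
    rng = np.random.default_rng(seed)
    mean = np.asarray(x0, dtype=float)
    mu = lam // 2
    # Log-rank recombination weights, normalized to sum to one.
    weights = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    weights /= weights.sum()
    for _ in range(iters):
        pop = mean + sigma * rng.standard_normal((lam, mean.size))
        order = np.argsort([f(x) for x in pop])
        mean = weights @ pop[order[:mu]]  # recombine the best mu samples
        sigma *= 0.97                     # crude decay instead of CSA
    return mean

# Sphere test in 4-D with optimum at (1, 1, 1, 1).
best = simple_es(lambda x: np.sum((x - 1.0) ** 2), np.zeros(4))
```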