-
- Optimizer
- GradientDescentOptimizer
- AdagradOptimizer
- AdagradDAOptimizer
- MomentumOptimizer
- AdamOptimizer
- FtrlOptimizer
- RMSPropOptimizer
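For reference, a minimal sketch of how this list maps onto the TF1 `tf.train` API (assuming TensorFlow 1.x; the learning rates are arbitrary placeholders):

```python
import tensorflow as tf  # assuming TensorFlow 1.x

# One instance of each optimizer on the list above; AdagradDA
# additionally needs a global step variable.
global_step = tf.train.get_or_create_global_step()
optimizers = {
    "sgd": tf.train.GradientDescentOptimizer(learning_rate=0.01),
    "adagrad": tf.train.AdagradOptimizer(learning_rate=0.01),
    "adagrad_da": tf.train.AdagradDAOptimizer(learning_rate=0.01, global_step=global_step),
    "momentum": tf.train.MomentumOptimizer(learning_rate=0.01, momentum=0.9),
    "adam": tf.train.AdamOptimizer(learning_rate=0.001),
    "ftrl": tf.train.FtrlOptimizer(learning_rate=0.01),
    "rmsprop": tf.train.RMSPropOptimizer(learning_rate=0.001),
}
```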
# Reference
- [ ] [An over…
-
I don't know if this is a sherpa issue or a GPyOpt issue, but since sherpa calls GPyOpt and determines which library versions are installed, I think this is the place to start.
This minimal exampl…
-
Nonlinear optimization algorithms that use only gradient information (i.e., "first-order methods") can have trouble traversing the cost surface as the Hessian becomes ill-conditioned.…
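To make the ill-conditioning concrete, here is a small NumPy sketch (a standard toy quadratic, not from the original report) where gradient descent is throttled by the stiffest direction:

```python
import numpy as np

# Quadratic bowl f(x) = 0.5 * x^T A x whose Hessian A has condition number 100.
A = np.diag([1.0, 100.0])
x = np.array([1.0, 1.0])

lr = 0.019  # stability requires lr < 2 / lambda_max = 0.02
for _ in range(50):
    x = x - lr * (A @ x)

print(x)  # ~[0.38, 0.005]: the stiff coordinate converged, the flat one crawls
```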
-
Thanks for the nice library! This is very convenient and works very well for my jobs.
I was wondering if the search algorithm is sensitive to the CPU type. First, I set search spaces for two variables …
-
It is a classical idea to overlap the backward pass with the optimization step. PyTorch supports this overlap in DDP and FSDP. For example, here are the hooks in DDP https://github.com/pytorch/pytorch/…
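Outside DDP/FSDP, a minimal single-process sketch of the same idea, assuming PyTorch >= 2.1 (which added `Tensor.register_post_accumulate_grad_hook`):

```python
import torch

model = torch.nn.Linear(10, 10)

# One tiny optimizer per parameter, so each parameter can step on its own
# as soon as its gradient finishes accumulating during backward.
opts = {p: torch.optim.SGD([p], lr=0.1) for p in model.parameters()}

def step_when_ready(param: torch.Tensor) -> None:
    opts[param].step()
    opts[param].zero_grad()

for p in model.parameters():
    p.register_post_accumulate_grad_hook(step_when_ready)

loss = model(torch.randn(4, 10)).sum()
loss.backward()  # optimizer steps run inside this call, overlapped with backward
```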
-
### Motivation
Optuna currently supports only single-objective CMA-ES for sampling; it would be useful to have multi-objective CMA-ES, as many tasks need to consider multiple objectives to be op…
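For context, a sketch of what the existing single-objective sampler looks like today (the toy objective is made up for illustration):

```python
import optuna

def objective(trial):
    x = trial.suggest_float("x", -5.0, 5.0)
    y = trial.suggest_float("y", -5.0, 5.0)
    return x**2 + y**2  # single value; the request is to allow multiple

study = optuna.create_study(sampler=optuna.samplers.CmaEsSampler())
study.optimize(objective, n_trials=50)
```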
-
I've run into the scenario of using an optimization algorithm but not having enough time left in the day to let the optimizer run to completion and find a solution. I was curious if you thought about…
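A sketch of the pause/resume workflow I have in mind; the names here (`optimizer`, `save_state`, `resume`) are hypothetical, not this library's API:

```python
import pickle

def save_state(optimizer, path="optimizer.pkl"):
    # Persist whatever the optimizer object carries in memory
    # (history, population, RNG state, ...).
    with open(path, "wb") as f:
        pickle.dump(optimizer, f)

def resume(path="optimizer.pkl"):
    with open(path, "rb") as f:
        return pickle.load(f)

# End of the day: save_state(optimizer)
# Next morning:   optimizer = resume(), then keep iterating.
```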
-
First off, I'm enjoying this repo, and you have already solved a big chunk of what my team and I are in the process of doing, so I'm excited about contributing.
I've been looking for a process to connec…
-
## 🚀 Feature
Add support for constraint-aware optimization techniques within torch.optim. [Sequential Quadratic Programming](https://en.wikipedia.org/wiki/Sequential_quadratic_programming) is a very c…
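Until something like SQP is available, the usual workaround is plain projected gradient descent, projecting back onto the feasible set after every step; a sketch with a box constraint (not part of torch.optim; the toy problem is made up):

```python
import torch

# Minimize ||x - target||^2 subject to x >= 0.
target = torch.tensor([1.5, -2.0])
x = torch.nn.Parameter(torch.zeros(2))
opt = torch.optim.SGD([x], lr=0.1)

for _ in range(100):
    opt.zero_grad()
    ((x - target) ** 2).sum().backward()
    opt.step()
    with torch.no_grad():
        x.clamp_(min=0.0)  # projection onto the feasible set {x : x >= 0}

print(x)  # ~[1.5, 0.0], the constrained optimum
```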
-
Hello,
A great effort has been made in recent years to add new optimization algorithms to the software, for example pagmo, NLopt, bonmin, dlib, etc. However, these capabilities are not visible in…