djrakita opened this issue 5 years ago
@djrakita This is indeed a valid concern that we hadn't thought about, since we have mainly focused on the code-generation use case (e.g., using `opengen` in Python).
You are describing a case where cost functions and their gradients have an inner state.
The key question is whether you need to update that state inside OpEn (that is, throughout the iterations of the numerical solver), or outside of it. In the former case you can simply create a new `Fn` every time you call the solver. The latter case is more interesting and requires some modification in OpEn.
I think the most systematic way to go is to change the trait constraints on `CostType` and `GradientType` to:
```rust
Fn(&[f64], &mut f64, &mut [f64]) -> Result<(), SolverError>
```
that is, to provide another mutable argument, which one can use as a workspace. An even better approach would be to impose the trait constraints:
```rust
GradientType: Fn(&[f64], &mut [f64], &mut Workspace) -> Result<(), SolverError>,
CostType: Fn(&[f64], &mut f64, &mut [f64], &mut Workspace) -> Result<(), SolverError>,
```
where `Workspace` is a trait that the users can implement.
This will take some time to implement. In the meantime, would you consider using `opengen` in Python to generate Rust code for your problem? `opengen` uses CasADi to compute gradients using automatic differentiation and generates C code, which we co-compile with the core OpEn solver.
@djrakita I tried to change `Fn` to `FnMut`, but it doesn't work. There are cases in the solver [1] where we need to lend such functions, of type `Fn`, to some structures, and this cannot be done with `FnMut`.
We'll put forward the implementation of proper stateful cost, gradient and constraint functions. As a temporary solution you can either use `opengen` in Python or use part of `u` (which is mutable) as a workspace (I would recommend the first option).
[1]: `optimization_engine::LipschitzEstimator`
Yeah, `FnMut` will be an issue, so I'd also recommend having a "workspace". Quite an intriguing idea, as this would help make the solver even more generic.
Hi! I am very new to Rust, so I apologize if this ends up being a naive question.
So, for my robot motion optimization application, I have created an objective function and gradient function in the required format using closures, but the closures are `FnMut` rather than `Fn`. This leads to the compiler complaining, since the OpEn library requires the objective and gradient functions to be `Fn`. However, in my case it is necessary that the closures be able to mutate data captured from outside the closure. For example, the gradient is calculated using hyper-dual automatic differentiation, which has to be able to access and mutate parts of the objective function's state. Further, the objective is calculated by mutating a reference to a large robot manipulator kinematic and dynamic model, which would be extremely costly to reinitialize on each call just to meet the `Fn` requirement.
Would it be possible to accommodate closures of type `FnMut` for the objective and gradient? As far as I can tell, this shouldn't change anything at the optimization level: the objective will still just output an `f64`, and the gradient will still update accordingly. Any help would be appreciated. Thanks!