Open · dschwen opened 2 years ago
Can we handle a parameter that enters through a step function, e.g. `z > p ? 1 : 0`? I am asking because I think it could be better to run one separate residual evaluation per parameter to get the derivatives before doing anything further. Assuming everything can be handled locally can be dangerous; otherwise we could possibly call `computeResidual` multiple times without AD.
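For illustration, a toy forward-mode dual number (not MOOSE's `ADReal`) showing why a hard step dependence on a parameter is awkward for AD: the derivative of `z > p ? 1 : 0` with respect to `p` is zero almost everywhere, so a derivative seed on `p` never propagates through the branch.

```cpp
// Toy sketch, not MOOSE code: a minimal forward-mode dual number.
#include <iostream>

struct Dual
{
  double value;      // primal value
  double derivative; // d(value)/dp for the chosen parameter p
};

// The step function from the comment above: z > p ? 1 : 0.
// The comparison uses only primal values, and each branch returns a constant,
// so the result carries a zero derivative no matter how p is seeded.
Dual step(double z, const Dual & p)
{
  return (z > p.value) ? Dual{1.0, 0.0} : Dual{0.0, 0.0};
}

int main()
{
  Dual p{0.5, 1.0}; // parameter seeded with dp/dp = 1
  Dual r = step(0.7, p);
  std::cout << "value = " << r.value << ", d/dp = " << r.derivative << "\n";
  // Prints d/dp = 0: AD alone cannot see the sensitivity of a hard step,
  // which is why such parameters may need a smoothed form or separate
  // (e.g. finite-difference) residual evaluations.
  return 0;
}
```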
Have there been any updates on this issue? My group is interested in this particular use-case described by @dschwen.
@tegrubbs There is an optimization module that was merged that gives you an interface to the PETSc optimization solver TAO. We haven't done anything for these automatic derivatives yet; @oanaoana is working on this.
Reason
Optimization problems (as @lynnmunday is working on as a PI) require the derivatives of the residual vector R of the problem with respect to the input parameters p, i.e. ∂R/∂p. We would like to utilize automatic differentiation for this.
The compute workflow would be similar to how the Jacobian matrix ∂R/∂u is built, except we would only need ∂R/∂p at the end of a time step.
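As a sketch of the quantities involved (the n_dof/n_param notation below is an assumption for illustration, not taken from the module):

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Residual as a function of the solution u and the input parameters p
\[
  R = R\bigl(u(p),\, p\bigr) \in \mathbb{R}^{n_{\mathrm{dof}}},
\]
% The Jacobian is assembled (via AD) at every nonlinear iteration, whereas the
% parameter sensitivity is only needed once, at the end of a time step.
\[
  \frac{\partial R}{\partial u} \in \mathbb{R}^{n_{\mathrm{dof}} \times n_{\mathrm{dof}}},
  \qquad
  \frac{\partial R}{\partial p} \in \mathbb{R}^{n_{\mathrm{dof}} \times n_{\mathrm{param}}}.
\]
\end{document}
```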
Design
const ADReal & getOptimizableParam("name");
Here `name` is a parameter of type `Real`. It will be implicitly controllable. The method returns a const reference to a value managed by the framework. During the computation of the residual and Jacobian the underlying `ADReal` value will have no derivative vector entries. At the end of the time step we would then call `computeResidual`
again, but with empty residual vectors for all variables and populated residual vectors for all optimizable parameters.
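For illustration, a minimal sketch of how an object might consume the proposed interface, assuming a standard `ADKernel`-derived class. `ExampleParamForcing` and the `magnitude` parameter are hypothetical names, and `getOptimizableParam` is the method proposed above (it does not exist in the framework yet).

```cpp
// Hypothetical sketch against the proposed interface; not existing MOOSE code.
#include "ADKernel.h"

class ExampleParamForcing : public ADKernel
{
public:
  static InputParameters validParams()
  {
    InputParameters params = ADKernel::validParams();
    // Declared as a plain Real in the input file; per the design it would be
    // implicitly controllable.
    params.addRequiredParam<Real>("magnitude", "Optimizable forcing magnitude");
    return params;
  }

  ExampleParamForcing(const InputParameters & parameters)
    : ADKernel(parameters),
      // Const reference to a framework-managed ADReal. During ordinary
      // residual/Jacobian evaluation it carries no derivative entries; during
      // the end-of-step dR/dp pass the framework would seed it instead.
      _magnitude(getOptimizableParam("magnitude"))
  {
  }

protected:
  ADReal computeQpResidual() override
  {
    // The parameter enters the residual like any other AD quantity, so the
    // same kernel code could produce both dR/du and, when seeded, dR/dp.
    return -_magnitude * _test[_i][_qp];
  }

  const ADReal & _magnitude;
};
```

Because the parameter enters the residual as an `ADReal`, the same `computeQpResidual` code path could serve both the regular Jacobian assembly and the end-of-step ∂R/∂p evaluation.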
Impact
Enable correct gradients for non-linear optimization problems