Closed: liamoconnor9 closed this issue 2 years ago
The correct way to do this is to compute the Fréchet differential, not to use sym_diff.
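For reference, the Fréchet differential of an operator F at a point u, evaluated in a direction v, is the directional derivative

```latex
dF(u)[v] = \left.\frac{d}{d\epsilon}\, F(u + \epsilon v)\right|_{\epsilon = 0}
```

which is linear in v; this linearization is what gives the Jacobian action for Newton-type solvers like the NLBVP's.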
Whoops, didn't mean to close.
Ok, my mistake. I think we can recycle a fair amount of existing code to implement this in Dedalus. Would that be a useful feature, or should I add it to my own project outside of Dedalus?
This already works -- it's how the Jacobian is computed for the NLBVP: https://github.com/DedalusProject/dedalus/blob/579b32b774eac77db9abbd1db9f546f171ece743/dedalus/core/problems.py#L397
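As a rough sketch of that pattern, assuming Dedalus v3's public API (the field names and the u**2 residual below are illustrative, and this hasn't been checked against the linked revision):

```python
import numpy as np
import dedalus.public as d3

# Minimal sketch: build the symbolic Frechet differential of a scalar
# nonlinearity via sym_diff, as the NLBVP Jacobian construction does.
xcoord = d3.Coordinate('x')
dist = d3.Distributor(xcoord, dtype=np.float64)
xbasis = d3.RealFourier(xcoord, size=32, bounds=(0, 2*np.pi))

u = dist.Field(name='u', bases=xbasis)    # state variable
du = dist.Field(name='du', bases=xbasis)  # perturbation direction

F = u**2                  # nonlinear residual term F(u)
dF = F.sym_diff(u) * du   # Frechet differential: here 2*u*du, linear in du
print(dF)
```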
Thanks for pointing me in the right direction, I'll take a look at that.
It'd be nice if we could differentiate dot products with respect to their (vector) arguments. This would be particularly useful for the adjoint-looping optimization work I'm doing.
Here's an example of the functionality I'm looking for:
Given a TensorField u, taking u.sym_diff(u) returns 1. If it instead returned an identity tensor of the proper rank, then the test case above would pass.
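Something like the following hypothetical test illustrates the request (the original snippet isn't preserved in this thread, so the coordinate setup and names below are assumptions based on Dedalus v3's public interface):

```python
import numpy as np
import dedalus.public as d3

# Hypothetical reconstruction of the requested behavior.
coords = d3.CartesianCoordinates('x', 'y')
dist = d3.Distributor(coords, dtype=np.float64)
xbasis = d3.RealFourier(coords['x'], size=16, bounds=(0, 2*np.pi))
ybasis = d3.RealFourier(coords['y'], size=16, bounds=(0, 2*np.pi))

u = dist.VectorField(coords, name='u', bases=(xbasis, ybasis))

# Desired: d/du (u . u) = 2u, which needs u.sym_diff(u) to act as an
# identity tensor of the proper rank rather than the scalar 1.
energy = d3.dot(u, u)
print(u.sym_diff(u))       # currently returns 1
print(energy.sym_diff(u))  # built from that scalar, not an identity tensor
```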