-
This might be a documentation request or a feature request.
Currently the AutoMALA constructor takes the argument `default_autodiff_backend`, which it uses via LogDensityProblemsAD to differentiat…
-
-
Test autodiff for the operators with sensitivity maps (different settings with multiple coils)
-
Hi @j-fu!
As of this morning's release, DifferentiationInterface is starting to look like a solid replacement for SparseDiffTools in the realm of sparse Jacobians and Hessians. Would you be intereste…
-
**Describe the bug**
Wrong gradient when using taichi autodiff.grad and pytorch autodiff.function together.
**To…
-
In the notebooks on the documentation here:
https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html
The author describes topics that he would like to showcase in a future Autodiff co…
-
I found that in teqp one of the models was failing because the derivatives of ``pow(delta, int)`` are not calculated properly when ``delta`` is ``0``. Here is a Catch2 test snippet showing the problem…
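A common cause of this symptom (not confirmed for teqp itself, but typical of autodiff implementations) is that a generic `pow(x, n)` gets routed through `exp(n * log(x))`, whose propagated derivative divides by `x` and blows up at `x = 0`, while the integer power rule `n * x^(n-1)` is perfectly well defined there. A minimal Python sketch with a hand-rolled forward-mode dual number (all names hypothetical, not teqp's API) illustrates the difference:

```python
import math

class Dual:
    """Minimal forward-mode dual number: a value plus a derivative part."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

def pow_via_exp_log(x, n):
    # Generic power route: x**n = exp(n*log(x)).
    # The chain rule gives d/dx = (n/x) * exp(n*log(x)), which divides by x.
    v = math.exp(n * math.log(x.val)) if x.val > 0 else 0.0
    d = (n / x.val) * v * x.dot if x.val != 0 else math.nan  # NaN at x = 0
    return Dual(v, d)

def pow_int(x, n):
    # Integer power rule: d/dx x**n = n * x**(n-1), fine at x = 0 for n >= 1.
    return Dual(x.val ** n, n * x.val ** (n - 1) * x.dot)

x0 = Dual(0.0, 1.0)                 # seed the derivative at delta = 0
print(pow_via_exp_log(x0, 2).dot)   # nan
print(pow_int(x0, 2).dot)           # 0.0
```

The fix in any such library is to dispatch integer exponents to the power rule instead of the `exp`/`log` route.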
-
### 🐛 Describe the bug
# AssertionError: RuntimeError not raised
"test_0dim_tensor_overload_exception_xpu",
# RuntimeError: ceil is not supported for complex inputs
"test_autodiff_…
-
I tried using static vectors for parameters, which works nicely. But not when combined with autodiff:
```julia
julia> using Optimization, OptimizationOptimJL, StaticArrays, ForwardDiff
# don't sp…
```
-
Is it possible to do a try/catch on the evaluation of the Jac to turn on autodiff?
https://github.com/SciML/DifferentialEquations.jl/blob/master/src/ode_default_alg.jl#L49
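The pattern being asked about can be sketched language-agnostically: probe the user-supplied Jacobian once at the initial state, and fall back to an autodiff-generated one if the evaluation throws. The sketch below is in Python with illustrative names only; it mirrors the try/catch idea, not DifferentialEquations.jl's actual (Julia) internals.

```python
def choose_jacobian(user_jac, autodiff_jac, u0, t0):
    """Pick a Jacobian routine: trial-evaluate the user-supplied one and
    fall back to autodiff if it is missing or raises.

    All names here are hypothetical stand-ins for the default-algorithm
    selection logic, not an actual DifferentialEquations.jl API.
    """
    if user_jac is None:
        return autodiff_jac
    try:
        user_jac(u0, t0)          # trial evaluation at the initial state
        return user_jac
    except Exception:
        return autodiff_jac       # evaluation failed: turn on autodiff

def failing_jac(u, t):
    raise NotImplementedError("no analytical Jacobian available")

def working_jac(u, t):
    return [[-1.0]]               # toy 1x1 Jacobian

def ad_jac(u, t):
    return [[-1.0]]               # stand-in for an autodiff-generated Jacobian

assert choose_jacobian(working_jac, ad_jac, [1.0], 0.0) is working_jac
assert choose_jacobian(failing_jac, ad_jac, [1.0], 0.0) is ad_jac
```

One caveat with this approach: a Jacobian that succeeds at the initial state can still fail later along the trajectory, so a single up-front probe is a heuristic rather than a guarantee.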