-
This might be a documentation request, or a feature request.
Currently the AutoMALA constructor takes the argument `default_autodiff_backend`, which it uses via LogDensityProblemsAD to differentiat…
-
-
Test autodiff for the operators with sensitivity maps (different settings with multiple coils)
-
Hi @j-fu!
As of this morning's release, DifferentiationInterface is starting to look like a solid replacement for SparseDiffTools in the realm of sparse Jacobians and Hessians. Would you be intereste…
-
**Describe the bug**
Wrong gradient when using taichi autodiff.grad and pytorch autodiff.function together.
**To…
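Independent of either framework, a suspected wrong gradient like this can be cross-checked against central finite differences. The sketch below is a hypothetical stdlib-Python illustration of that check (no Taichi or PyTorch involved; the function and tolerances are made up for the example):

```python
# Hypothetical sketch: verifying a gradient against central finite
# differences. Useful when two autodiff systems (e.g. Taichi's autodiff
# and a PyTorch autograd.Function) disagree; neither library is used here.

def finite_difference_grad(f, x, eps=1e-6):
    """Central-difference gradient of scalar f at point x (list of floats)."""
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += eps
        xm = list(x); xm[i] -= eps
        grad.append((f(xp) - f(xm)) / (2 * eps))
    return grad

def f(x):
    # example function: f(x) = x0^2 + 3*x0*x1
    return x[0] ** 2 + 3 * x[0] * x[1]

x = [1.0, 2.0]
numeric = finite_difference_grad(f, x)
analytic = [2 * x[0] + 3 * x[1], 3 * x[0]]  # hand-derived gradient
assert all(abs(n - a) < 1e-4 for n, a in zip(numeric, analytic))
```

If the hand-derived (or framework-produced) gradient and the numeric one disagree beyond the finite-difference error, that points at which side of the Taichi/PyTorch boundary is wrong.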
-
In the notebooks in the documentation here:
https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html
The author describes topics that he would like to showcase in a future Autodiff co…
-
I tried using static vectors for parameters, which works nicely. But not when combined with autodiff:
```julia
julia> using Optimization, OptimizationOptimJL, StaticArrays, ForwardDiff
# don't sp…
-
Is it possible to do a try/catch on the evaluation of the Jacobian to turn on autodiff?
https://github.com/SciML/DifferentialEquations.jl/blob/master/src/ode_default_alg.jl#L49
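The fallback pattern being asked about can be sketched generically. This is hypothetical Python, not the DifferentialEquations.jl default-algorithm code, and the forward-difference Jacobian below merely stands in for an autodiff backend:

```python
# Hypothetical sketch of the proposed fallback: probe the user-supplied
# Jacobian once, and if evaluating it throws, switch to an automatically
# computed one. All names are illustrative, not DifferentialEquations.jl API.

def numerical_jacobian(f, x, eps=1e-6):
    """Forward-difference Jacobian, standing in for an autodiff backend."""
    fx = f(x)
    cols = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += eps
        fp = f(xp)
        cols.append([(a - b) / eps for a, b in zip(fp, fx)])
    # transpose so jac[row][col] = d f_row / d x_col
    return [list(row) for row in zip(*cols)]

def choose_jacobian(f, user_jac, x0):
    """Return a working Jacobian function, falling back if user_jac fails."""
    try:
        user_jac(x0)          # probe the user-supplied Jacobian once
        return user_jac
    except Exception:
        return lambda x: numerical_jacobian(f, x)

def f(x):
    return [x[0] * x[1], x[0] + x[1]]

def broken_jac(x):
    raise NotImplementedError("analytical Jacobian not provided")

jac = choose_jacobian(f, broken_jac, [1.0, 2.0])
J = jac([1.0, 2.0])  # falls back to the numerical Jacobian
```

The one-time probe keeps the per-step cost unchanged after the decision is made, at the price of one extra Jacobian evaluation up front.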
-
We could have possibly the following examples:
- [ ] A simple U-Net (possibly using data pulled from the fastMRI dataset), trained? @Lenoush can you handle this in your free time?
- [ ] A network built with DeepIn…
-
## Feature request
It would be great if Numba supported automatic differentiation. Maybe using [Enzyme](https://github.com/EnzymeAD/Enzyme) would be the easiest way as it operates directly on th…
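To make the request concrete, the semantics that such support would provide can be sketched with dual-number forward-mode AD in plain Python. This is only an illustration of what the derivatives mean; Enzyme itself operates on LLVM IR, and a real Numba integration would look nothing like this:

```python
# Minimal dual-number forward-mode AD, illustrating the semantics an
# Enzyme-style backend provides (Enzyme works on LLVM IR; this
# pure-Python sketch only shows what the computed derivatives mean).

class Dual:
    """A value together with its derivative with respect to one input."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def derivative(f, x):
    """Derivative of scalar f at x via one forward-mode pass."""
    return f(Dual(x, 1.0)).dot

# d/dx (3x^2 + 2x) at x = 4 is 6*4 + 2 = 26
assert derivative(lambda x: 3 * x * x + 2 * x, 4.0) == 26.0
```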