-
A MWE of the issue I mentioned in the zoom call today:
```julia
a = [1.0]
da = [0.0]
function f(x); y = copy(x); return sum(y); end
function g(x); y = deepcopy(x); return sum(y); end
Enzyme.autodif…
```
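Since the `autodiff` call above is cut off, here is a runnable sketch of the `copy` half of the comparison, assuming the standard reverse-mode entry point with a `Duplicated` input; swapping in the `deepcopy` version is the reported failure mode:

```julia
using Enzyme

a  = [1.0]
da = [0.0]

f(x) = (y = copy(x); sum(y))
g(x) = (y = deepcopy(x); sum(y))

# The copy version propagates the gradient into the shadow:
Enzyme.autodiff(Reverse, f, Active, Duplicated(a, da))
# da == [1.0]

# Swapping in g (deepcopy) reproduces the reported issue:
# Enzyme.autodiff(Reverse, g, Active, Duplicated(a, da))
```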
-
Hi,
I'm trying to differentiate a function with millions of parameters, of which only a few require gradients. However, Enzyme computes the gradients for all of them. I wondered whether it…
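One pattern that avoids differentiating the inert parameters is to wrap only the trainable ones in `Duplicated` and mark the rest `Const`, so no shadow is allocated or written for them. A sketch with made-up names (`active`, `frozen`, `loss` are not from the report):

```julia
using Enzyme

# Hypothetical split: `frozen` is large, `active` is the small trainable part.
loss(active, frozen) = sum(active .* frozen[1:length(active)])

active  = randn(3)
dactive = zeros(3)
frozen  = randn(1_000)   # stand-in for the millions of fixed parameters

# Const(frozen) tells Enzyme not to compute or store its gradient.
Enzyme.autodiff(Reverse, loss, Active, Duplicated(active, dactive), Const(frozen))
# dactive now holds ∂loss/∂active; no gradient work is done for `frozen`.
```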
-
For each adjoint method, it would be convenient to include a table specifying which methods are compatible with each kind of automatic differentiation.
-
Hello,
I am trying to understand exactly how auto-diff works with JAX. I have an example function:
```python
def fnc_jax(x1, x2):
    return (jnp.divide(x1,x2) - jnp.exp(x2))*(jnp.sin(jnp.divi…
```
-
Another zero-gradient-in-structs issue. I am on Julia 1.8.5 and Enzyme main (2ccf4b). This works:
```julia
using Enzyme
const n_threads = Threads.nthreads()
const n = 100
struct I
    x::Flo…
```
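For comparison, here is the usual route for immutable structs with scalar fields: `Active` on the struct itself, with the gradient coming back as a shadow struct. This is a minimal sketch with a hypothetical struct `P`, since the one above is truncated:

```julia
using Enzyme

struct P
    x::Float64
end

f(p::P) = p.x^2

# The returned shadow is a `P` whose fields hold the gradient.
grad = autodiff(Reverse, f, Active, Active(P(3.0)))[1][1]
# grad.x holds d/dx x^2 at x = 3, i.e. 6.0
```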
-
I am trying to implement a function to compute Hessians of an array of functions (it's actually a single function with vector output, but separate functions are created to output individual indexes to allow the…
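One way to phrase the per-index wrapping (shown here with ForwardDiff rather than whatever the original uses, and with a made-up function `F`): close over each output index to get a scalar function, then take its Hessian.

```julia
using ForwardDiff

# Hypothetical vector-valued function; we want one Hessian per output component.
F(x) = [x[1]^2 * x[2], sin(x[2])]

# Wrap each output index as a scalar function, then take its Hessian.
hessians(F, x) = [ForwardDiff.hessian(y -> F(y)[i], x) for i in eachindex(F(x))]

Hs = hessians(F, [1.0, 2.0])
# Hs[1] is the Hessian of x1^2 * x2 at (1, 2): [4 2; 2 0]
```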
-
Example:
```julia
using Enzyme, Functors, Optimisers
struct MyShift{T}
    a::T
end
Functors.@functor MyShift
(s::MyShift)(x) = x .+ s.a
s = MyShift(ones(2))
x = randn(2)
# `des…
```
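For context, `Optimisers.destructure` flattens a Functors-registered struct into a single parameter vector plus a rebuilder, which is often the bridge to AD over structs. A sketch continuing the setup above (the truncated comment may or may not refer to this):

```julia
using Functors, Optimisers

struct MyShift{T}
    a::T
end
Functors.@functor MyShift
(s::MyShift)(x) = x .+ s.a

s = MyShift(ones(2))
x = randn(2)

# Flatten the struct's trainable arrays into one vector; keep a rebuilder.
flat, re = Optimisers.destructure(s)
# flat == [1.0, 1.0]; re(flat) reconstructs an equivalent MyShift.
```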
-
I was trying to get Enzyme.jl to work nicely with Bessels.jl, which computes a bunch of different special functions (Bessel, Airy, Gamma, etc.). Some of the functions seemed to work very well (mainly t…
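As a quick sanity check for the cases that do work, Enzyme's reverse-mode derivative of `besselj0` can be compared against the analytic identity J0′(x) = −J1(x). A sketch, assuming `besselj0`/`besselj1` as exported by Bessels.jl:

```julia
using Enzyme, Bessels

x = 1.3
# Reverse-mode derivative of besselj0 at x
d = autodiff(Reverse, besselj0, Active, Active(x))[1][1]
# Analytic identity: d/dx J0(x) = -J1(x)
isapprox(d, -besselj1(x); atol=1e-10)
</imports>
```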
-
I just noticed that I get wrong gradients for some of my functions: when I modify a vector stored in a struct, the gradients aren't propagated. Here is a MWE:
```julia
struct Test{T}
    b…
```
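For reference, when the struct's vector is mutated in place, both the struct and the input need shadows (`Duplicated`); otherwise writes into the struct's storage are invisible to the reverse pass. A sketch with a hypothetical struct `Holder`, since the one above is truncated:

```julia
using Enzyme

struct Holder{T}
    b::Vector{T}
end

function f!(h::Holder, x)
    h.b .= x .* x        # mutate the vector stored in the struct
    return sum(h.b)
end

h  = Holder(zeros(2))
dh = Holder(zeros(2))    # shadow storage for the struct's vector
x  = [1.0, 2.0]
dx = zeros(2)

Enzyme.autodiff(Reverse, f!, Active, Duplicated(h, dh), Duplicated(x, dx))
# dx holds d(sum(x .^ 2))/dx, i.e. 2 .* x
```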
-
Found in https://github.com/SciML/LinearSolve.jl/pull/377, the Krylov methods have their own issue. MWE:
```julia
using Enzyme, ForwardDiff
using LinearSolve, LinearAlgebra, Test
n = 4
A = ra…
```