-
I have a strange issue with backward(). I have two generators, gen1 and gen2, and I compute the loss in three ways: loss_1, loss_2, and loss_3. All the computations for gen1 are fine.
Part 1.
let out = gen1.forward(inp…
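
For context, here is a minimal PyTorch sketch of the multi-loss setup described above; the generator architectures and the loss definitions are placeholders, since the original code is truncated.

```python
# A minimal sketch, assuming PyTorch; gen1/gen2 and the losses below are
# stand-ins for the issue's real (truncated) code.
import torch
import torch.nn as nn

gen1 = nn.Linear(8, 8)   # placeholder for the real gen1
gen2 = nn.Linear(8, 8)   # placeholder for the real gen2
inp = torch.randn(4, 8)

out1 = gen1(inp)
out2 = gen2(inp)

# Three hypothetical losses; loss_3 touches both generators.
loss_1 = out1.pow(2).mean()
loss_2 = out2.pow(2).mean()
loss_3 = (out1 - out2).abs().mean()

# Summing and calling backward() once accumulates all three gradients in a
# single pass. Calling backward() separately on losses that share parts of
# the graph requires retain_graph=True on all but the last call.
(loss_1 + loss_2 + loss_3).backward()
```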
-
Hello,
I have an issue with autodiff.py.
Could you help me please?
Fred
PS: this is my first issue, so I don't know the best practices very well.
Everything works except the last line:
newtonSolv…
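
The failing newtonSolv… call is truncated, but a Newton solver driven by automatic differentiation usually follows the pattern below (a sketch using torch.autograd; the actual autodiff.py interface is not shown in the issue).

```python
# A minimal sketch of Newton's method for f(x) = 0 with the derivative
# taken by autodiff; newton_solve and the example f are hypothetical.
import torch

def newton_solve(f, x0, tol=1e-10, max_iter=50):
    x = torch.tensor(float(x0), requires_grad=True)
    for _ in range(max_iter):
        y = f(x)
        (dy,) = torch.autograd.grad(y, x)   # f'(x) via autodiff
        with torch.no_grad():
            x -= y / dy                     # x_{k+1} = x_k - f(x_k) / f'(x_k)
        if abs(y.item()) < tol:
            break
    return x.detach()

root = newton_solve(lambda x: x**2 - 2.0, x0=1.0)   # converges to sqrt(2)
```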
-
## Feature request
It would be great if Numba supported automatic differentiation. Maybe using [Enzyme](https://github.com/EnzymeAD/Enzyme) would be the easiest way as it operates directly on th…
-
**Describe the bug**
Wrong gradient when using Taichi's autodiff (kernel grad) and PyTorch's torch.autograd.Function together.
**To…
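The general shape of the bridge the report refers to, wrapping an external kernel in torch.autograd.Function, is sketched below; the Taichi side is replaced by plain tensor ops here, since the original kernels are not shown.

```python
# A sketch of the torch.autograd.Function bridge pattern; ExternalSquare
# is hypothetical, standing in for a wrapped external (e.g. Taichi) kernel.
import torch

class ExternalSquare(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x                 # stand-in for the external forward kernel

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # If this hand-written rule disagrees with the external kernel's
        # true derivative, downstream gradients are silently wrong, which
        # is the class of bug the report describes.
        return grad_out * 2 * x

x = torch.randn(3, requires_grad=True)
ExternalSquare.apply(x).sum().backward()
print(torch.allclose(x.grad, 2 * x.detach()))   # sanity check: should be True
```

torch.autograd.gradcheck is the usual way to catch such a mismatch numerically.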
-
Implement automatic gradient calculation
-
Hi @j-fu!
As of this morning's release, DifferentiationInterface is starting to look like a solid replacement for SparseDiffTools in the realm of sparse Jacobians and Hessians. Would you be intereste…
-
Is it possible to do a try/catch on the evaluation of the Jac to turn on autodiff?
https://github.com/SciML/DifferentialEquations.jl/blob/master/src/ode_default_alg.jl#L49
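
A language-swapped sketch of that fallback idea (Python with torch standing in for the Julia code at the linked line): attempt the autodiff Jacobian first and fall back to finite differences if the evaluation throws.

```python
# A sketch only: try/except around an autodiff Jacobian, falling back to
# finite differences; jacobian_with_fallback is a hypothetical helper.
import torch

def jacobian_with_fallback(f, x):
    try:
        return torch.autograd.functional.jacobian(f, x)   # autodiff path
    except RuntimeError:
        # Finite-difference fallback for functions autograd cannot trace.
        eps = 1e-6
        fx = f(x)
        cols = []
        for i in range(x.numel()):
            xp = x.clone()
            xp.view(-1)[i] += eps
            cols.append((f(xp) - fx) / eps)
        return torch.stack(cols, dim=-1)
```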
-
I encountered unexpected behavior with a function and I am unsure what the issue is (compiled via `cargo +enzyme run --release`)
```rust
// this crashes
#[autodiff(df2f, Forward, Const, Dual, D…
```
-
Thank you for sharing this library.
In some cases forward-mode automatic differentiation can be useful, for example when solving non-linear least-squares problems, where having the Jacobian matrix o…
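
That use case sketched: a forward-mode Jacobian for a non-linear least-squares fit, here via JAX's jacfwd as a stand-in, since the library's own forward-mode API is not shown.

```python
# A sketch, assuming JAX; the exponential-decay model and data are made up.
import jax
import jax.numpy as jnp

def residuals(p, t, y):
    a, b = p
    return y - a * jnp.exp(-b * t)      # r_i = y_i - a * exp(-b * t_i)

t = jnp.linspace(0.0, 1.0, 20)
y = 2.0 * jnp.exp(-3.0 * t)
p = jnp.array([1.0, 1.0])

# Forward mode costs roughly one pass per input, so with 2 parameters and
# 20 residuals it is the cheap direction for this Jacobian.
J = jax.jacfwd(residuals)(p, t, y)      # shape (20, 2)

# One Gauss-Newton step: solve (J^T J) dp = -J^T r
r = residuals(p, t, y)
p_new = p + jnp.linalg.solve(J.T @ J, -J.T @ r)
```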