-
This is what Theano does when used in Deep Learning with Python. We could do the same by using packages from [JuliaDiff](https://github.com/JuliaDiff/) to compute the derivatives instead of using a derivative d…
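As a minimal sketch of that idea (the function here is a made-up stand-in for a model objective), forward-mode AD from `ForwardDiff.jl`, one of the JuliaDiff packages, returns a derivative without any hand-written derivative function:

```julia
using ForwardDiff

# Toy scalar function standing in for a model objective.
f(x) = x^3 - 2x + 1

# Forward-mode AD gives f'(1.5) with no manually derived formula.
ForwardDiff.derivative(f, 1.5)  # == 3 * 1.5^2 - 2 == 4.75
```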
-
For the samplers that use derivatives (HMC, NUTS, ...), is it possible to use AD instead of finite differences? Even doing it manually (calculating the derivative with e.g. `ForwardDiff.jl` and supplyi…
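A rough sketch of that manual route, assuming a sampler that consumes a log-density value and its gradient (the `logdensity` target here is hypothetical):

```julia
using ForwardDiff

# Hypothetical target: a standard normal log-density (up to a constant).
logdensity(θ) = -0.5 * sum(abs2, θ)

# Value and gradient computed by AD rather than finite differences;
# this pair is what a gradient-based sampler (HMC, NUTS, ...) consumes.
logdensity_and_gradient(θ) = (logdensity(θ), ForwardDiff.gradient(logdensity, θ))

logdensity_and_gradient([0.5, -1.0])  # (-0.625, [-0.5, 1.0])
```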
-
A hot topic for neural networks is the use of automatic differentiation. I have seen the recent announcement of [ForwardDiff](https://github.com/JuliaDiff/ForwardDiff.jl), which seems very promising in…
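As an illustrative sketch only (not taken from the announcement): forward-mode AD can differentiate a tiny model's loss, though its cost grows with the number of parameters, which is why reverse mode is usually preferred for large networks:

```julia
using ForwardDiff

# One-neuron model; the three parameters (two weights and a bias) are
# packed into one vector so the loss is a function of a single argument.
x, y = [1.0, 2.0], 0.5
loss(p) = (tanh(p[1] * x[1] + p[2] * x[2] + p[3]) - y)^2

# Gradient of the loss with respect to all parameters at once.
ForwardDiff.gradient(loss, [0.1, -0.2, 0.05])
```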
-
JAXsim currently focuses only on sampling performance, exploiting `jax.jit` and `jax.vmap`. Being written in JAX, the forward step of the simulation should be differentiable (also considering contact …
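JAXsim itself is JAX/Python, but to keep these sketches in one language, here is a hypothetical Julia analogue of the underlying idea (not JAXsim's API): when every simulation step is plain differentiable code, gradients flow through the whole rollout:

```julia
using ForwardDiff

# Toy "simulation": a damped point mass stepped with explicit Euler.
function rollout(p; x0 = 1.0, v0 = 0.0, dt = 0.01, steps = 100)
    k, c = p                  # stiffness and damping parameters
    x, v = x0, v0
    for _ in 1:steps
        a = -k * x - c * v
        v += dt * a
        x += dt * v
    end
    return x                  # final position as a function of p
end

# Every step is smooth code, so the whole rollout is differentiable.
ForwardDiff.gradient(rollout, [10.0, 0.5])
```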
-
## Implementing an Automatic Differentiator
This issue tracks the implementation of an automatic differentiator for the expression compiler. The compiler parses expressions into binary trees, and this…
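The issue text is truncated, so as a generic sketch of the approach (not the compiler's actual data structures): differentiating a binary expression tree is a structural recursion applying the sum and product rules:

```julia
# Hypothetical node types for a parsed binary expression tree.
abstract type Node end
struct Const <: Node; value::Float64; end
struct Var   <: Node end                      # the single variable x
struct Add   <: Node; l::Node; r::Node; end
struct Mul   <: Node; l::Node; r::Node; end

# d/dx by recursion over the tree: sum rule and product rule.
deriv(::Const) = Const(0.0)
deriv(::Var)   = Const(1.0)
deriv(n::Add)  = Add(deriv(n.l), deriv(n.r))
deriv(n::Mul)  = Add(Mul(deriv(n.l), n.r), Mul(n.l, deriv(n.r)))

# d/dx (x * x + 3)  ==>  1*x + x*1, i.e. 2x
dtree = deriv(Add(Mul(Var(), Var()), Const(3.0)))
```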
-
### Motivation
Would anyone find an automatic differentiation module interesting? I'm not an expert in it, but I did once create a simple Fortran module that worked by overloading the various arithme…
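A minimal sketch of that operator-overloading approach, written here in Julia rather than Fortran: a dual number carries a value and a derivative, and each overloaded operator propagates both:

```julia
# A dual number: value plus derivative with respect to the input.
struct Dual
    val::Float64
    der::Float64
end

# Each overloaded operation propagates the derivative alongside the value.
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.sin(a::Dual)         = Dual(sin(a.val), cos(a.val) * a.der)

# Seed der = 1 for the variable we differentiate with respect to.
x = Dual(2.0, 1.0)
y = sin(x) * x + x   # y.der == cos(2.0) * 2.0 + sin(2.0) + 1.0
```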
-
### Forward AD
- [x] Non-sparse default AD should be `AutoPolyesterForwardDiff` (if that package is loaded). This will be similar to SimpleNonlinearSolve (see the backend-selection sketch after this list).
### Reverse AD
- For in-place problem…
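As a backend-selection sketch, assuming the current ADTypes/NonlinearSolve convention (the residual function and problem here are hypothetical):

```julia
using NonlinearSolve, ADTypes

# Out-of-place residual: solve u^2 - p = 0.
f(u, p) = u .* u .- p
prob = NonlinearProblem(f, [1.0], 2.0)

# Pick the AD backend explicitly via an ADTypes tag; per the checklist,
# AutoPolyesterForwardDiff() would be the multithreaded forward-mode
# default when PolyesterForwardDiff is loaded.
sol = solve(prob, NewtonRaphson(; autodiff = AutoForwardDiff()))
```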
-
I suggest using automatic differentiation to avoid having to manually specify derivative functions.
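For instance (a sketch with a made-up residual), `ForwardDiff.jacobian` replaces a hand-coded Jacobian:

```julia
using ForwardDiff

# Residual function; with AD there is no need to hand-code its Jacobian.
f(u) = [u[1]^2 + u[2], sin(u[1]) - u[2]^2]

J = ForwardDiff.jacobian(f, [1.0, 0.5])
# Manual alternative this avoids: J = [2u₁ 1; cos(u₁) -2u₂]
```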
-
Has there been any discussion of the use case of automatic differentiation (AD) in relation to generics?
Related issues:
* https://github.com/j3-fortran/fortran_proposals/issues/95
* https://gi…
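One way to see the connection, sketched in Julia (where generic dispatch plays the role generics would in Fortran): code written without concrete number types works unchanged with AD's dual numbers:

```julia
using ForwardDiff

# Generic: no concrete Float64 anywhere in the definition, so the same
# code evaluates with plain floats and with ForwardDiff's dual numbers.
poly(x) = 3x^2 + 2x + 1

poly(1.5)                           # ordinary evaluation
ForwardDiff.derivative(poly, 1.5)   # AD flows through: 6 * 1.5 + 2 == 11.0
```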
-
I think it would be smart to attempt to use the great tools already created if they fit our needs, or can be improved to fit them. Here's a summary of what I hope to see in JuliaML, and my first im…