-
1) For some reason, float values marked as `no_diff` are not treated as `no_diff` in higher-order derivatives (i.e. a `no_diff float` in a forward derivative is also a `no_diff float` in a `bwd_diff(func)(...)` call, …
-
I'm looking to implement custom GPU ops, similar to how TensorFlow allows defining custom JVPs. Is there a similar tutorial/guide on how feasible this will be with JAX?
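For reference, JAX's analogue of this is `jax.custom_jvp` (and `jax.custom_vjp` for reverse mode); the primal function could dispatch to a custom GPU kernel. A minimal sketch of the API, using a numerically-stable `log1pexp` as a stand-in for the custom op:

```python
import jax
import jax.numpy as jnp

# Declare the function and attach a custom JVP rule to it.
# In a real custom-op setting, the primal body would call into
# a hand-written GPU kernel instead of plain jnp code.
@jax.custom_jvp
def log1pexp(x):
    return jnp.log1p(jnp.exp(x))

@log1pexp.defjvp
def log1pexp_jvp(primals, tangents):
    (x,) = primals
    (t,) = tangents
    y = log1pexp(x)
    # Hand-written, numerically stable derivative: sigmoid(x) * t
    return y, t / (1.0 + jnp.exp(-x))

# The custom rule now drives both jvp and grad:
g = jax.grad(log1pexp)(0.0)  # sigmoid(0) = 0.5
```

The same decorator composes with `jit`, `vmap`, etc., so this is the usual starting point before reaching for lower-level primitive registration.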
-
I have existing code written against autodiff v0.5.13, using both forward-mode and reverse-mode auto-differentiation in the same compilation unit, that I am migrating to v0.6.3.
In updating to the newer vers…
-
I am wondering if the 'tangent-space backpropagation' implemented here in `jaxlie.manifold.value_and_grad` is the same as that defined in the paper "Tangent Space Backpropagation for 3D Transformation…
-
When calculating sufficiently large Jacobians with ForwardDiff, the `forward` mapping $x \rightarrow y(x)$ is called over and over again for each chunk that `ForwardDiff.jacobian` computes.
Here is a…
-
Code:
```julia
import ADTypes, DifferentiationInterface, Enzyme
Enzyme.API.runtimeActivity!(true)
broadcast(
    x -> sum(
        DifferentiationInterface.pushforward(
            x -> vca…
-
In the following code
```python
import jax
import jax.numpy as jnp
import numpy as np
import lineax as lx
N = 8
M = 16
K = 128
np.random.seed(42)
A = np.random.randn(M, N)
B = np.rand…
-
In `chirho.robust`, we're making extensive use of `torch.func`, especially the vectorization transform `torch.func.vmap` and forward and reverse-mode autodiff transforms `torch.func.jvp`/`vjp`.
Unf…
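For context, the two transforms in question compose like this; a minimal sketch assuming a recent PyTorch (≥ 2.0) where `torch.func` is available:

```python
import torch

def f(x):
    # simple scalar-valued function to differentiate
    return (x ** 2).sum()

x = torch.arange(3, dtype=torch.float32)  # [0., 1., 2.]
v = torch.ones(3)

# forward mode: jvp returns f(x) and the directional derivative <grad f(x), v>
y, dydv = torch.func.jvp(f, (x,), (v,))

# reverse mode: vjp returns f(x) and a closure mapping cotangents to gradients
y2, vjp_fn = torch.func.vjp(f, x)
(grad,) = vjp_fn(torch.ones(()))  # grad f(x) = 2x

# vmap batches either transform over a leading axis
xs = torch.stack([x, x + 1.0])
grads = torch.func.vmap(torch.func.grad(f))(xs)
```

Here `f`, `x`, and `v` are illustrative placeholders, not names from `chirho.robust`; the point is only the `vmap`/`jvp`/`vjp` composition pattern the question refers to.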
-
``` python
import jax
import jax.numpy as np
import numpy as onp
def E_fn(conf):
    ri = np.expand_dims(conf, 0)
    rj = np.expand_dims(conf, 1)
    dxdydz = np.power(ri - rj, 2)
    dij =…
-
[Differentiation Notation](https://en.wikipedia.org/wiki/Notation_for_differentiation)
- Leibniz's notation: $\frac{d^nf}{dx^n}$, $df=f'(x) \cdot dx$
- Lagrange's notation: $f^{(n)}(x)$
- D-notat…