Closed smao-astro closed 3 years ago
You can reimplement an object-oriented version of `jacfwd` or `jacrev` similarly to how `objax.GradValues` is implemented.
Another option is to use a combination of `objax.Vectorize` and `objax.GradValues`, in other words to vectorize the computation of the gradient. The DPSGD gradient module does this to some extent: https://github.com/google/objax/blob/c4785ff991f35dc1af5a68988e8a545d3304de90/objax/privacy/dpsgd/gradient.py#L76
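The idea behind the second option can be sketched in plain JAX (i.e. outside Objax's state management): vectorize a per-example gradient with `vmap`, which is essentially what the DPSGD gradient code does internally. The model and loss below are made up purely for illustration.

```python
import jax
import jax.numpy as jnp

# Hypothetical per-example squared-error loss for a tiny linear model;
# a pure function of (w, x, y), as JAX transformations require.
def loss(w, x, y):
    return (jnp.dot(w, x) - y) ** 2

w = jnp.array([1.0, -2.0])
xs = jnp.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # batch of 3 examples
ys = jnp.array([0.5, -1.0, 0.0])

# Vectorize the gradient: one gradient row per example.
per_example_grads = jax.vmap(jax.grad(loss), in_axes=(None, 0, 0))(w, xs, ys)
print(per_example_grads.shape)  # (3, 2)
```

In Objax you would reach for `objax.Vectorize` around `objax.GradValues` instead of `jax.vmap` around `jax.grad`, so that module variables are handled correctly.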
You cannot use functional JAX transformations with Objax (like `jax.vmap`, `jax.pmap`, `jax.grad`, etc.), in other words the transformations which take a function and return a new function.
You can use other JAX operations (for example everything from `jax.numpy.*`) safely with Objax.
All JAX primitives are stateless and purely functional (i.e. they don't have and don't assume side effects). Objax provides wrappers for JAX primitives to simplify state management and make it more natural for machine learning applications.
So if you try to mix JAX functional transformations with Objax primitives, it will break the state management, and the code either won't work at all or will work incorrectly.
As I mentioned above, Objax provides wrappers which simplify state management.
For example, `objax.Vectorize` is a wrapper over `jax.vmap` which does the state management and enables the use of stateful Objax primitives with stateless JAX.
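To see what `jax.vmap` itself expects, here is a pure-JAX sketch: a stateless function of explicit inputs, with the batch axis mapped over while the parameters are broadcast. The function and shapes are hypothetical; `objax.Vectorize` adds the variable handling on top of this.

```python
import jax
import jax.numpy as jnp

# A pure (stateless) function of explicit parameters and input,
# which is the form jax.vmap requires.
def forward(w, x):
    return jnp.tanh(w @ x)

w = jnp.ones((2, 3))       # parameters, shared across the batch
xs = jnp.ones((5, 3))      # batch of 5 inputs

# Map over axis 0 of xs, broadcast w unchanged.
batched = jax.vmap(forward, in_axes=(None, 0))
print(batched(w, xs).shape)  # (5, 2)
```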
I see, thank you for your explanation!
- Compute Jacobian.
> You can reimplement an object-oriented version of `jacfwd` or `jacrev` similarly to how `objax.GradValues` is implemented.
It is hard to implement an object-oriented version of `jacfwd` or `jacrev`, because `jacfwd` and `jacrev` do not support returning auxiliary data. Is there a solution? Thanks!
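For what it's worth, more recent JAX releases added a `has_aux` flag to `jax.jacrev` and `jax.jacfwd`; on versions without it, a workaround is to call the function a second time just for the auxiliary output. A minimal sketch, assuming a recent JAX (the function here is made up for illustration):

```python
import jax
import jax.numpy as jnp

# f returns (primal output, auxiliary data).
def f(x):
    y = jnp.sin(x)
    return y, {"mean": jnp.mean(y)}

x = jnp.arange(3.0)

# has_aux=True tells jacrev that f returns a (output, aux) pair;
# the Jacobian is taken with respect to the first element only.
jac, aux = jax.jacrev(f, has_aux=True)(x)
print(jac.shape)  # (3, 3)
```

Since `sin` is applied elementwise, the Jacobian here is just `diag(cos(x))`.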
Hi,
I am new to JAX and Objax, and I would like to compute the "partial derivative" of outputs w.r.t. inputs, below is a piece of code
The docs suggest not mixing JAX and Objax transformations, and my question is: Objax has no `jacfwd` or `jacrev`, so what is the standard way to calculate a Jacobian? A combination of `vmap` and `objax.Vectorize`? Thanks.