Add approximations to `tape.gradient` and `tape.jacobian`. Resolves #206.
To test derivatives of various library functions, I have been comparing autodiff values against a five-point stencil applied to the forward pass. The test utilities `approximate_derivative` and `approximate_derivative_unsummed` were developed for this purpose, but they still required extra boilerplate and did not match the interface of TensorFlow's built-in tape derivative functions.
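For reference, the five-point (central difference) stencil estimates a scalar derivative from four nearby forward evaluations, with O(h^4) truncation error. A minimal sketch, assuming NumPy; the helper name `five_point_stencil` and the step size `h` are illustrative, not the actual `test_util` implementation:

```python
import numpy as np

def five_point_stencil(f, x, h=1e-3):
  # Five-point central difference approximation to f'(x); error is O(h**4).
  return (-f(x + 2 * h) + 8 * f(x + h) - 8 * f(x - h) + f(x - 2 * h)) / (12 * h)

# Example: d/dx sin(x) at x = 1.0 is cos(1.0) ~= 0.5403.
print(five_point_stencil(np.sin, 1.0))
```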
To simplify derivative testing, this change implements two new utility functions: `approximate_gradient` and `approximate_jacobian`. They are written such that, for a function `f` and a possibly nested structure `vars` of `tf.Variable`s participating in the computation of `f()`, if we write
```python
with tf.GradientTape() as tape:
  value = f()
auto_grad = tape.gradient(value, vars)
approx_grad = test_util.approximate_gradient(f, vars)
```

then `approx_grad` is a five-point stencil approximation to `auto_grad`.
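As a concrete usage sketch, a test could assert that the two structures agree entry-wise; the function, variables, and tolerance below are hypothetical, not taken from the library's test suite:

```python
import tensorflow as tf

x = tf.Variable([1.0, 2.0, 3.0])
vars = {'x': x}

def f():
  # Scalar-valued function of the variables; the exact gradient w.r.t. x is 2 * x.
  return tf.reduce_sum(x ** 2)

with tf.GradientTape() as tape:
  value = f()
auto_grad = tape.gradient(value, vars)
approx_grad = test_util.approximate_gradient(f, vars)

# Both results share the structure of `vars`, so compare leaf by leaf.
tf.nest.map_structure(
    lambda a, b: tf.debugging.assert_near(a, b, atol=1e-4),
    auto_grad, approx_grad)
```

Presumably `approximate_jacobian` would be checked against `tape.jacobian(value, vars)` in the same way.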