HIPS / autograd

Efficiently computes derivatives of NumPy code.

Gradient check fails in gmm.py example #377

b4be1 opened this issue 6 years ago (status: Open)

b4be1 commented 6 years ago

When I run the gradient check in examples/gmm.py

from autograd.test_util import check_grads

# flattened_obj and flattened_init_params are defined in examples/gmm.py
check_grads(flattened_obj, modes=['rev'], order=1)(flattened_init_params)

it fails. As a consequence, optimization doesn't work either. If I use the SLSQP solver and let it compute gradients numerically, the optimization proceeds in the right direction, so I suspect something has changed in autograd such that some operations are not recorded for the backward pass.
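
For reference, a minimal sketch of that comparison, assuming flattened_obj and flattened_init_params from examples/gmm.py are in scope; scipy.optimize.approx_fprime stands in for the finite differences SLSQP uses when no gradient is supplied:

import autograd.numpy as np
from autograd import grad
from scipy.optimize import approx_fprime

# Reverse-mode gradient from autograd vs. a finite-difference estimate.
analytic = grad(flattened_obj)(flattened_init_params)
numeric = approx_fprime(flattened_init_params, flattened_obj, 1e-6)

# With the bug present, the gap is far larger than finite-difference noise.
print(np.max(np.abs(analytic - numeric)))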

b4be1 commented 6 years ago

The problem seems to be in the mvn.logpdf function (autograd.scipy.stats.multivariate_normal.logpdf, which the example uses). A simple hand-made implementation

import autograd.numpy as np

def mvn_logpdf(X, mu, cov_sqrt):
    # Gaussian log-density with cov = cov_sqrt.T @ cov_sqrt, one value per row of X
    return -0.5 * np.log(np.linalg.det(2 * np.pi * cov_sqrt.T @ cov_sqrt)) \
           - 0.5 * np.sum(((X - mu) @ np.linalg.inv(cov_sqrt))**2, axis=1)

gives the correct gradient and makes the optimization converge to a local optimum.
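
As a quick sanity check, one can run the same check_grads test directly against this implementation on small random inputs (the shapes here are made up for illustration, not taken from the example):

import autograd.numpy as np
import autograd.numpy.random as npr
from autograd.test_util import check_grads

npr.seed(0)
X = npr.randn(5, 3)          # 5 points in 3 dimensions
mu = npr.randn(3)
cov_sqrt = npr.randn(3, 3)   # cov = cov_sqrt.T @ cov_sqrt is almost surely positive definite

# Reverse-mode gradient check with respect to mu and cov_sqrt in turn.
check_grads(lambda m: mvn_logpdf(X, m, cov_sqrt), modes=['rev'], order=1)(mu)
check_grads(lambda c: mvn_logpdf(X, mu, c), modes=['rev'], order=1)(cov_sqrt)

This passes silently when the gradients agree; check_grads raises an error otherwise.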

kasakh commented 6 years ago

Thank you! It solved my problem too.