rmcgibbo / pyhmc

Hamiltonian Monte Carlo in Python
https://pythonhosted.org/pyhmc

split the log-likelihood and its gradient #9

Open zhou13 opened 8 years ago

zhou13 commented 8 years ago

See issue #8.

zhou13 commented 8 years ago

Not sure why the checks failed; it doesn't seem related to the tests. The branch should pass all the nosetests.

rmcgibbo commented 8 years ago

Okay, I fixed the travis issues in #10, so if you rebase this PR on the current master, the tests should run.

zhou13 commented 8 years ago

I rebased my branch and did a git push --force, but it seems the problem is still there. Could you take a look?

rmcgibbo commented 8 years ago

Is there a way that you can make it backwards compatible with the old API? Maybe using the same pattern as the jac argument to scipy.optimize.minimize, which accepts either a callable or True? The reason is that in some of my applications the gradient and objective share so many terms that once you've computed the objective, the gradient is basically free (and vice versa), so computing them from the same callback is more efficient than having two separate callbacks.
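For reference, here is a minimal sketch of the scipy.optimize.minimize pattern being described, contrasting the combined-callable form (jac=True) with the separate-callable form. The quadratic objective is just an illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Objective and gradient share work, so return both from one callable.
def f_and_grad(x):
    fx = np.sum(x ** 2)   # objective
    grad = 2 * x          # gradient, nearly free once fx is computed
    return fx, grad

# jac=True tells minimize the objective callable also returns the gradient.
res_combined = minimize(f_and_grad, x0=np.ones(3), jac=True, method="BFGS")

# Equivalent separate-callable form: jac is its own function.
res_separate = minimize(lambda x: np.sum(x ** 2), x0=np.ones(3),
                        jac=lambda x: 2 * x, method="BFGS")

print(res_combined.x)
print(res_separate.x)
```

Both forms converge to the same minimizer; the combined form simply avoids recomputing shared terms.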

rmcgibbo commented 8 years ago

> I rebased my branch and did a git push --force but it seems that the problem is still there. Could you have a check?

Hopefully fixed by #11.

zhou13 commented 8 years ago

> Is there a way that you can make it backwards compatible with the old API? Maybe using the same pattern that the jac argument to scipy.optimize.minimize uses, taking a callable or True.

That's a good idea. I will implement that.
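A dispatch along these lines could look like the sketch below. The function name and signature here are hypothetical, not pyhmc's actual API; it only illustrates the scipy-style convention where grad is either True (the log-probability callable also returns its gradient) or a separate callable:

```python
import numpy as np

def eval_logp_and_grad(logp, x, grad=None):
    """Hypothetical dispatch mirroring scipy's `jac` convention.

    If `grad is True`, `logp(x)` is expected to return a
    (log-probability, gradient) pair; if `grad` is a callable, it is
    evaluated separately (the old two-callback style).
    """
    if grad is True:
        return logp(x)
    return logp(x), grad(x)

# Combined callable: gradient comes along with the objective.
logp_both = lambda x: (-0.5 * np.dot(x, x), -x)
# Separate callables: backwards-compatible style.
logp_only = lambda x: -0.5 * np.dot(x, x)
grad_only = lambda x: -x

x = np.array([1.0, 2.0])
lp1, g1 = eval_logp_and_grad(logp_both, x, grad=True)
lp2, g2 = eval_logp_and_grad(logp_only, x, grad=grad_only)
print(lp1, g1)
print(lp2, g2)
```

Both call styles yield identical values, so existing two-callback code keeps working while combined callables avoid duplicated work.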