kernel8liang opened 7 years ago
Related to #155. Are there any fixes for this?
I will try to put together a temporary solution over the weekend. In the long term, this part will be replaced by MXNet's NDArray subsystem, which we are working on now.
We should still fix this. It is quite an important feature.
On Fri, Mar 24, 2017 at 2:40 PM, Larry Tang notifications@github.com wrote:
-- Minjie Wang, New York University
@jermainewang I found the reason. Look here: link. The function pushed into the gradient record is the unwrapped version, which means the tape cannot record the operations performed inside it. I think this change was made out of performance concerns. Should we fix it, or wait for the autograd runtime? (By the way, does the autograd runtime support higher-order derivatives?)
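To make the issue concrete, here is a minimal Python sketch (all names here are hypothetical illustrations, not MinPy's actual code) of why pushing the unwrapped primitive into the gradient record hides operations from the tape, which is exactly what breaks higher-order derivatives:

```python
class Tape:
    """Toy gradient tape: records (op_name, inputs) pairs for later replay."""
    def __init__(self):
        self.records = []

    def record(self, op_name, *inputs):
        self.records.append((op_name, inputs))

tape = Tape()

def raw_mul(x, y):
    # Unwrapped primitive: does the math but never touches the tape.
    return x * y

def mul(x, y):
    # Wrapped version: records the operation before delegating.
    tape.record("mul", x, y)
    return raw_mul(x, y)

# If the gradient machinery internally calls the unwrapped primitive,
# the outer tape sees nothing, so a second differentiation has no
# records to replay:
raw_mul(3.0, 4.0)
assert len(tape.records) == 0  # invisible to the tape

# Calling the wrapped version leaves a record, so a tape wrapped
# around this computation could differentiate it again:
mul(3.0, 4.0)
assert len(tape.records) == 1
```

The sketch only illustrates the recording/visibility distinction; MinPy's real tape and primitive-wrapping logic are of course more involved.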
@lryta Do we have any follow-up on this?
Running the autograd_tutorial example,
https://github.com/dmlc/minpy/blob/master/examples/tutorials/autograd_tutorial.ipynb
gives the error below.
MinPy version: 0.33.