Closed LiableFish closed 1 year ago
I've also added a similar test for the `broadcast_to` method:
```python
def test_broadcast_to_backward_my():
    gradient_check(ndl.broadcast_to, ndl.Tensor(np.random.randn(3, 1, 5)), shape=(2, 3, 4, 5))
```
It combines checks for (1) a different number of dimensions and (2) broadcasting over an existing dimension.
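For context, here is a minimal numpy-only sketch of the reduction this test exercises: sum the output gradient over every broadcast axis, aligning the shapes from the right. The helper name `broadcast_to_backward` and its signature are hypothetical, not needle's actual API:

```python
import numpy as np

def broadcast_to_backward(out_grad, input_shape, output_shape):
    # Align shapes from the RIGHT, per numpy's broadcasting rule:
    # pad input_shape with leading 1s, then sum out_grad over every
    # axis where the padded input dim is 1 but the output dim is not.
    ndiff = len(output_shape) - len(input_shape)
    padded = (1,) * ndiff + tuple(input_shape)
    axes = tuple(
        i for i, (inp, out) in enumerate(zip(padded, output_shape))
        if inp == 1 and out != 1
    )
    return out_grad.sum(axis=axes).reshape(input_shape)

# Broadcasting (3, 1, 5) -> (2, 3, 4, 5) adds axis 0 and expands axis 2,
# so each input element contributes to 2 * 4 = 8 output elements.
g = broadcast_to_backward(np.ones((2, 3, 4, 5)), (3, 1, 5), (2, 3, 4, 5))
```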
> gradient_check(ndl.broadcast_to, ndl.Tensor(np.random.randn(3, 1, 5)), shape=(2, 3, 4, 5))

Added it too, good point.
As the numpy docs state, broadcasting aligns shapes starting from the trailing (rightmost) dimensions. But none of the current tests (either the local ones or the mugrade ones) actually checks that you find the broadcast dims by iterating from the right rather than from the left during the backward pass.
I was able to pass all the tests in HW1, but failed one of the Linear module's tests in HW2 precisely because I was iterating from the left. In particular, I'm using `zip_longest` to match the input dims against the required dims. So for `input_dim = (5,)` and `required_dim = (1, 5)` I end up dealing with `(5, None)` and `(1, 5)` instead of `(None, 5)` and `(1, 5)`. That's why I failed the test, and that's why I suggest adding such a test to HW1 :)
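The pitfall can be shown in a few lines: `zip_longest` pads the shorter shape on the right, which silently left-aligns the shapes. Reversing both shapes before pairing (and reversing the result back) restores the right alignment numpy uses. The variable names here are illustrative:

```python
from itertools import zip_longest

input_dim = (5,)
required_dim = (1, 5)

# Left-aligned pairing (the bug): zip_longest pads the shorter tuple
# on the right, so the input's only dim is matched with the wrong axis.
left = list(zip_longest(input_dim, required_dim))
# left == [(5, 1), (None, 5)]

# Right-aligned pairing (correct): reverse both shapes, pair them,
# then reverse the pairs back into original order.
right = list(zip_longest(reversed(input_dim), reversed(required_dim)))[::-1]
# right == [(None, 1), (5, 5)]
```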