real-zhangzhe closed this issue 4 years ago
In numpy-ml/numpy_ml/neural_nets/wrappers/wrappers.py, line 207, your code is:

```python
def backward(self, dLdy, retain_grads):
    """
    retain_grads: Default is True
    """
```

The signature is missing the default value the docstring promises, which causes an error at numpy-ml/numpy_ml/neural_nets/layers/layers.py, line 332.
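To illustrate the fix, here is a minimal sketch using hypothetical stand-in classes (`ToyWrapper` and `ToyLayer` are not the actual numpy-ml classes): giving `retain_grads` the default value `True` lets call sites that omit the argument keep working.

```python
# Illustrative sketch only: ToyWrapper / ToyLayer are hypothetical
# stand-ins, not the actual numpy-ml wrapper and layer classes.

class ToyLayer:
    def backward(self, dLdy, retain_grads=True):
        # Pretend gradient computation: identity pass-through.
        self.retained = retain_grads
        return dLdy


class ToyWrapper:
    def __init__(self, layer):
        self.layer = layer

    # The fix: give `retain_grads` the default the docstring promises,
    # so callers that omit the argument do not raise a TypeError.
    def backward(self, dLdy, retain_grads=True):
        return self.layer.backward(dLdy, retain_grads)


wrapper = ToyWrapper(ToyLayer())
out = wrapper.backward([1.0, 2.0])  # works without passing retain_grads
```

Without the default, the last call raises `TypeError: backward() missing 1 required positional argument: 'retain_grads'`.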
Also, in numpy-ml/numpy_ml/neural_nets/layers/layers.py, line 2116, your code is:

```python
def backward(self, dLdy):
    """
    retain_grads: Default is True
    """
```

The function is missing the `retain_grads` argument, which causes an error at numpy-ml/numpy_ml/neural_nets/wrappers/wrappers.py, line 227.
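The complementary fix can be sketched like this (hypothetical `MiniLayer`, not the real layers.py code): the layer's `backward` accepts `retain_grads` so wrapper code that passes it through does not fail, and uses it to decide whether gradients are cached.

```python
# Hypothetical sketch; MiniLayer is not the actual numpy-ml layer.
class MiniLayer:
    def __init__(self):
        self.gradients = {}

    # The fix: accept `retain_grads` so wrapper code that calls
    # layer.backward(dLdy, retain_grads) works.
    def backward(self, dLdy, retain_grads=True):
        dW = [[dLdy[0][0]]]  # toy gradient
        if retain_grads:
            self.gradients["W"] = dW
        return dW


layer = MiniLayer()
layer.backward([[3.0]], retain_grads=False)
no_cache = "W" in layer.gradients  # gradient not retained
layer.backward([[3.0]])
cached = "W" in layer.gradients    # default retains gradients
```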
Maybe I should fork and create pull request :smiley:
Also, in numpy-ml/numpy_ml/neural_nets/tests/tests.py, line 510, your code is:

```python
from ..activations import Softmax
```

but `Softmax` is not implemented in `..activations`; it lives in `..layers`. The right code is:

```python
from ..layers import Softmax
y_pred = sm.forward(z)
```
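For context, here is a minimal stand-in showing the layer-style call pattern `sm.forward(z)` (a hypothetical `Softmax` class mirroring that interface, not the numpy-ml source):

```python
import numpy as np


# Hypothetical stand-in for a layer-style Softmax, mirroring the
# sm.forward(z) call pattern; not the numpy-ml implementation.
class Softmax:
    def forward(self, z):
        # Shift by the row max for numerical stability before exponentiating.
        ez = np.exp(z - z.max(axis=-1, keepdims=True))
        return ez / ez.sum(axis=-1, keepdims=True)


sm = Softmax()
z = np.array([[1.0, 2.0, 3.0]])
y_pred = sm.forward(z)
row_sum = float(y_pred.sum())  # each softmax row sums to 1
```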
Also, in numpy-ml/numpy_ml/neural_nets/tests/tests.py, line 771, your code is:

```python
from ..activations import SoftSign
```

but `SoftSign` is not implemented in `..activations` at all. Maybe you should delete the `test_softsign_grad` and `test_softsign_activation` functions.
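Alternatively, instead of deleting the tests, a SoftSign activation could be added; a minimal sketch (hypothetical class, not currently in numpy-ml's `..activations`) using the standard definition f(x) = x / (1 + |x|), with derivative 1 / (1 + |x|)²:

```python
import numpy as np


# Hypothetical SoftSign activation (not present in ..activations):
# f(x) = x / (1 + |x|), f'(x) = 1 / (1 + |x|)^2.
class SoftSign:
    def fn(self, x):
        return x / (1 + np.abs(x))

    def grad(self, x):
        return 1 / (1 + np.abs(x)) ** 2


act = SoftSign()
y = act.fn(np.array([0.0, 1.0, -3.0]))   # -> [0.0, 0.5, -0.75]
g0 = float(act.grad(np.array([0.0]))[0])  # derivative at 0 is exactly 1
```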
@Z-zhe - Wow, thanks so much for all these! I haven't had a chance to take a look yet, but should have some time this weekend. In the meantime if you feel like submitting a PR with fixes I'd be happy to review it, otherwise I can try to address these shortly.
Submitting a PR is complicated; it's easier for you to modify it. :smiley:
One more, in numpy-ml/numpy_ml/neural_nets/layers/layers.py, line 2341, your code is:

```python
dX = dZ @ W.T
```

I don't think it should be `W`; it should be `W_sparse`. So I think the right code should be:

```python
dX = dZ @ W_sparse.T
```

Please reconsider, thanks.
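The reasoning can be checked numerically with a toy masked-linear example (all names and values here are assumptions for illustration, not the numpy-ml code): if the forward pass computes Z = X @ W_sparse, the chain rule gives dX = dZ @ W_sparse.T, and a finite-difference check agrees with that version but not with the dense-`W` one.

```python
import numpy as np

# Toy masked-linear setup (illustrative values, not the numpy-ml code).
X = np.array([[1.0, 2.0, 3.0], [0.5, -1.0, 2.0]])
W = np.array([[0.2, -0.1, 0.4, 0.3],
              [0.5, 0.6, -0.2, 0.1],
              [-0.3, 0.2, 0.1, 0.7]])
mask = np.array([[1.0, 0.0, 1.0, 1.0],
                 [0.0, 1.0, 1.0, 0.0],
                 [1.0, 1.0, 0.0, 1.0]])
W_sparse = W * mask  # the weights actually used in the forward pass


def loss(X):
    # Toy scalar loss: sum of the masked-linear output Z = X @ W_sparse.
    return (X @ W_sparse).sum()


dZ = np.ones((2, 4))      # d(loss)/dZ for the sum loss
dX = dZ @ W_sparse.T      # correct backward pass
dX_wrong = dZ @ W.T       # buggy version using the dense W

# Finite-difference check of dX[0, 0].
eps = 1e-6
Xp = X.copy()
Xp[0, 0] += eps
fd = (loss(Xp) - loss(X)) / eps
```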
Hi, I think there is a little bug at numpy-ml/numpy_ml/neural_nets/activations/activations.py, line 64: your code references `self.fn_x`, but `self.fn_x` is never defined.
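One common pattern that would define `fn_x` before it is used is to cache the forward output during `fn` so the gradient can reuse it. A purely illustrative sketch (hypothetical class and attribute handling; I have not seen the surrounding activations.py code):

```python
import numpy as np


# Hypothetical sketch of the caching pattern (not the actual
# activations.py code): forward stores its output in self.fn_x,
# so the backward pass has a defined value to read.
class CachedActivation:
    def __init__(self):
        self.fn_x = None  # define the attribute up front

    def fn(self, x):
        out = np.tanh(x)
        self.fn_x = out   # cache forward output for the backward pass
        return out

    def grad(self):
        # tanh'(x) = 1 - tanh(x)^2, reusing the cached forward output.
        if self.fn_x is None:
            raise RuntimeError("call fn(x) before grad()")
        return 1 - self.fn_x ** 2


act = CachedActivation()
y = act.fn(np.array([0.0]))  # tanh(0) = 0
g = act.grad()               # tanh'(0) = 1
```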