dlsyscourse / hw2


tests for ReLU do not provide negative inputs #9

navalnica commented 1 year ago

test_nn_relu_forward_1() and test_nn_relu_backward_1() both pass only non-negative inputs to the ReLU layer, so the tests never exercise how the layer behaves on negative inputs.
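To illustrate (a hypothetical sketch, assuming the needle.nn.Module / needle.nn.ReLU API used elsewhere in the homework): a broken ReLU that simply returns its input produces exactly the same forward output as a correct ReLU on such non-negative data, so the test cannot tell them apart.

import numpy as np
import needle as ndl

# Hypothetical buggy "ReLU" that is just the identity function.
class IdentityReLU(ndl.nn.Module):
    def forward(self, x):
        return x

x = ndl.Tensor(np.array([[0.0, 1.5, 4.95]]), dtype="float32")  # non-negative, like get_tensor() output
print(ndl.nn.ReLU()(x).numpy())   # [[0.   1.5  4.95]]
print(IdentityReLU()(x).numpy())  # identical output -- the bug goes undetected

The backward test has the same blind spot: for non-negative inputs the ReLU gradient mask is all ones, which an identity layer reproduces as well.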

Inputs are generated as x = get_tensor(*shape), and the get_tensor() function samples data uniformly from the [0, 5) interval:

def get_tensor(*shape, entropy=1):
    np.random.seed(np.prod(shape) * len(shape) * entropy)
    # np.random.randint(0, 100) / 20 yields values in {0.0, 0.05, ..., 4.95} -- never negative
    return ndl.Tensor(np.random.randint(0, 100, size=shape) / 20, dtype="float32")
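One possible fix (just a sketch, not an official change; the helper name get_signed_tensor is made up) is to sample from a range that straddles zero, so roughly half of the values hit ReLU's negative branch:

import numpy as np
import needle as ndl

def get_signed_tensor(*shape, entropy=1):
    np.random.seed(np.prod(shape) * len(shape) * entropy)
    # randint(-50, 50) / 20 yields values in [-2.5, 2.45], so negatives are covered
    return ndl.Tensor(np.random.randint(-50, 50, size=shape) / 20, dtype="float32")

Using a separate helper (rather than changing get_tensor() itself) would avoid invalidating the reference outputs that the other hw2 tests were generated against.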