aamini / introtodeeplearning

Lab Materials for MIT 6.S191: Introduction to Deep Learning
MIT License
7.26k stars 3.66k forks

Lab 1 Defining a network layer - Assertion error #131

Open GoviKandula opened 1 year ago

GoviKandula commented 1 year ago

I am using Google Colab.

I am getting the following error. Can someone please help me?

```
AssertionError                            Traceback (most recent call last)
<ipython-input> in <cell line: 38>()
     36 # test the output!
     37 print(y.numpy())
---> 38 mdl.lab1.test_custom_dense_layer_output(y)

1 frames
[... skipping hidden 2 frame]

/usr/local/lib/python3.10/dist-packages/numpy/testing/_private/utils.py in assert_array_compare(comparison, x, y, err_msg, verbose, header, precision, equal_nan, equal_inf)
    842                 verbose=verbose, header=header,
    843                 names=('x', 'y'), precision=precision)
--> 844             raise AssertionError(msg)
    845         except ValueError:
    846             import traceback

AssertionError:
Arrays are not almost equal to 7 decimals
[FAIL] output is of incorrect value. expected [[0.36430186 0.8134891 0.15117118]] but got [[0.2697859 0.45750418 0.66536945]]
Mismatched elements: 3 / 3 (100%)
Max absolute difference: 0.5141983
Max relative difference: 0.7781019
 x: array([[0.3643019, 0.8134891, 0.1511712]], dtype=float32)
 y: array([[0.2697859, 0.4575042, 0.6653695]], dtype=float32)
```
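For context, the check that fails here is NumPy's almost-equal array comparison: the test has a hard-coded expected array and compares the layer's output against it element-wise to 7 decimal places. A minimal sketch reproducing the comparison with the two arrays from the traceback (illustrative only, not the lab's actual test code):

```python
import numpy as np
from numpy.testing import assert_array_almost_equal

# Arrays taken from the error message above.
expected = np.array([[0.36430186, 0.8134891, 0.15117118]], dtype=np.float32)
got = np.array([[0.2697859, 0.45750418, 0.66536945]], dtype=np.float32)

# The test asserts element-wise equality to 7 decimals, so any mismatch
# in the layer's output (e.g. from different weight initialization) fails.
try:
    assert_array_almost_equal(got, expected, decimal=7)
except AssertionError:
    print("arrays differ; max absolute difference:",
          np.max(np.abs(expected - got)))
```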

GoviKandula commented 1 year ago

Doesn't setting a seed always give you the same random numbers? Every time I run the block it gives a different value for y, which is not the same as the value expected by mdl.lab1.test_custom_dense_layer_output(y).

Can someone explain what is happening, please?

jboverio commented 1 year ago

Just setting a seed does not guarantee that the weights and biases will be initialized to any specific set of values. It only guarantees that they will be initialized to the same set of values every time you run your code with the same seed.

In your case, it seems that the expected output of mdl.lab1.test_custom_dense_layer_output(y) is based on a specific set of weights and biases that is not being produced by your current weight and bias initialization.

Without knowing how the expected output was calculated, it's difficult to say what you would need to change in your weight and bias initialization to achieve that output. It could be that a different seed was used, or that the weights and biases were initialized using a different method, or even manually set to specific values.
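The distinction above can be demonstrated in a few lines. A minimal NumPy sketch (illustrative, not the lab's TensorFlow code): the same seed reproduces the same draws, but a different seed, or a different sampling method, produces different values, so matching a hard-coded expected array requires the exact seed and initialization the test authors used.

```python
import numpy as np

# Same seed -> identical draws on every run: this is all a seed guarantees.
rng1 = np.random.default_rng(42)
rng2 = np.random.default_rng(42)
a = rng1.uniform(size=3)
b = rng2.uniform(size=3)
assert np.allclose(a, b)

# A different seed yields different values, even with the same method,
# so there is no way to recover a specific expected array without it.
rng3 = np.random.default_rng(7)
c = rng3.uniform(size=3)
assert not np.allclose(a, c)
```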

ajitomatix commented 1 year ago

Please see #116 (a similar thread) and #118 (a solution). Reading these, it seems this won't be fixed just by updating the Colab notebook; the mitdeeplearning package also needs to be updated.