Closed shiinalight closed 1 year ago
Hello, at first glance, in UNQ_C4 there is a difference between the code you wrote and the grad_b1 and grad_b2 computations that use np.sum(...) from here. After that, UNQ_C5 uses the back_prop function.
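For what it's worth, the np.sum(...) pattern for the bias gradients usually sums over the batch axis and averages by the batch size. A minimal sketch of that idea, with toy shapes and variable names that are my own assumptions rather than the graded code:

```python
import numpy as np

# Toy shapes, purely for illustration: hidden size 3, vocab size 5, batch of 4
rng = np.random.default_rng(0)
l1 = rng.standard_normal((3, 4))      # error propagated back to the hidden layer
delta = rng.standard_normal((5, 4))   # yhat - y at the output layer
batch_size = 4

# np.sum over the batch axis with keepdims=True preserves the column-vector
# shape of the biases; dividing by batch_size averages over the batch.
grad_b1 = np.sum(l1, axis=1, keepdims=True) / batch_size
grad_b2 = np.sum(delta, axis=1, keepdims=True) / batch_size

print(grad_b1.shape, grad_b2.shape)  # (3, 1) (5, 1)
```

The keepdims=True part matters because the bias vectors are column vectors, so the gradients must keep that shape for the update step to broadcast correctly.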
Hello, first of all, thanks for taking the time to look at my code; your help is really appreciated. I tried multiple times yesterday to apply what you said, but I got more failed tests or other errors. If you have the time and access to the related Coursera lab, could you correct the file and resend it? I'm so sorry for asking; only if you have the time and feel like it, otherwise it is perfectly okay.
Hello, I am not very good at this, but I will gladly take a look. I lost access to the course a while ago; could you also attach the w4_unittest file? The version I am using does not use unit tests, so I cannot validate the final output.
https://github.com/shiinalight/NLP-labs/blob/main/C2_W4_Assignment.ipynb Does this work? Or, if you need something else, can you tell me where I should get it for you?
If you open the assignment from Coursera, you will have access to the project file structure (somewhere in the menus on the upper right side). Inside, you should be able to locate the unit tests.
Here you go; sorry for the wrong file.
OK, so here is what I have found so far.
in exercise 04:
If you want to allow negative values through at this point, you should not apply the ReLU masking to the l1 values used to compute the gradient of the bias vector b1, meaning that you should delete or comment out
l1[l1 < 0] = 0
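To make that concrete, here is a minimal sketch of how such a back_prop could look with that line commented out; the function signature, variable names, and toy shapes are all assumptions on my part, not the graded code:

```python
import numpy as np

def back_prop(x, yhat, y, h, W1, W2, b1, b2, batch_size):
    # Sketch only: error from the output layer propagated to the hidden layer
    l1 = np.dot(W2.T, yhat - y)
    # l1[l1 < 0] = 0   # <-- the line in question, deleted/commented out
    grad_W1 = np.dot(l1, x.T) / batch_size
    grad_W2 = np.dot(yhat - y, h.T) / batch_size
    grad_b1 = np.sum(l1, axis=1, keepdims=True) / batch_size
    grad_b2 = np.sum(yhat - y, axis=1, keepdims=True) / batch_size
    return grad_W1, grad_W2, grad_b1, grad_b2

# Toy dimensions just to check that the gradients come out the right size
V, N, m = 5, 3, 4  # vocab size, hidden size, batch size (all made up)
rng = np.random.default_rng(1)
x = rng.random((V, m))
y = np.eye(V)[:, :m]
yhat = rng.random((V, m))
h = rng.random((N, m))
W1, W2 = rng.random((N, V)), rng.random((V, N))
b1, b2 = rng.random((N, 1)), rng.random((V, 1))

gW1, gW2, gb1, gb2 = back_prop(x, yhat, y, h, W1, W2, b1, b2, m)
print(gW1.shape, gW2.shape, gb1.shape, gb2.shape)  # (3, 5) (5, 3) (3, 1) (5, 1)
```

The point is only that each gradient must match the shape of the parameter it updates; whether the mask belongs there is what the unit tests are checking.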
In exercise 05, I am missing a file, but I would guess that fixing _backprop() also fixes those tests. I get: FileNotFoundError: [Errno 2] No such file or directory: './support_files/gradient_descent/w1.pkl'
Let me know if this works.
Some of my tests fail in these two problems, and therefore I cannot pass. Could you please take a look? C2_W4_Assignment.pdf