Closed: Twenty1111 closed this issue 1 year ago
This is an auto-generated grading output. Your code failed to run. Please check again.
This is an auto-generated grading output. Checking code of Twenty1111 {'Twenty1111': nan}
Since the range of x_train and y_train is unknown, the derivative of the loss can be an enormous value when updating w and b. As a result, the weights and biases in this code diverge on test cases 2 and 3. You can likely fix this by lowering the learning rate, as sketched below.
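A minimal sketch of the issue, assuming the assignment implements plain gradient descent for linear regression (y = w*x + b) with MSE loss; the data, epoch count, and learning rates here are hypothetical placeholders, not the actual test cases:

```python
import numpy as np

def train(x_train, y_train, lr, epochs=100):
    """Plain gradient descent for y = w*x + b with MSE loss."""
    w, b = 0.0, 0.0
    n = len(x_train)
    for _ in range(epochs):
        y_pred = w * x_train + b
        error = y_pred - y_train
        # Gradients of the mean squared error with respect to w and b.
        dw = (2.0 / n) * np.dot(error, x_train)
        db = (2.0 / n) * np.sum(error)
        w -= lr * dw
        b -= lr * db
    return w, b

# Hypothetical inputs with a wide range: the gradients become very large too.
x_train = np.linspace(0, 1000, 50)
y_train = 3.0 * x_train + 7.0

print(train(x_train, y_train, lr=0.1))    # step too large: w, b blow up to inf/nan
print(train(x_train, y_train, lr=1e-7))   # smaller step: w, b stay finite and improve
```

Standardizing x_train (for example, subtracting the mean and dividing by the standard deviation) is another common way to keep the gradients in a manageable range, but the simplest fix here, as suggested above, is to reduce the learning rate.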
This issue is now closed due to lack of progress.
Problem
Week 2_Problem 1
Source Code
Description
..
Output (Optional)
No response