ARCHIVED: Contains historical course materials/Homework materials for the FREE MOOC course on "Creative Applications of Deep Learning w/ Tensorflow" #CADL
I have fixed the code in the lecture-02 IPython notebook.
1) In the Learning Rate section.
This change casts the init_p variable to an integer so that Python 3.6 accepts it as a valid index into the gradient, x, and cost arrays.
This issue was mentioned in the class forum here.
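
A minimal sketch of the kind of cast involved, assuming the starting position was originally computed as a float (the array names follow the notebook; the surrounding setup here is illustrative only):

```python
import numpy as np

# Toy stand-ins for the notebook's arrays (names match the fix description).
n = 100
x = np.linspace(-1.0, 1.0, n)
cost = x ** 2                      # example cost curve
gradient = np.gradient(cost, x)    # numerical gradient of the cost

# A float-valued position raises a TypeError when used as an index in Python 3.6,
# so the fix wraps it in int() before indexing the arrays.
init_p = int(len(x) * np.random.rand())
print(x[init_p], cost[init_p], gradient[init_p])
```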
2) In the Over vs. Underfitting section.
Fixed the power range in this section to start at 1. This avoids a 0th-power expansion, which adds a constant column and introduces additional bias. This issue was raised in the session 02 forum here.
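
A short sketch of the range change, assuming a polynomial feature expansion as in the lecture (the degree and variable names here are placeholders, not the notebook's exact code):

```python
import numpy as np

n_observations = 100
x = np.linspace(-3.0, 3.0, n_observations)
degree = 5

# Before: range(0, degree + 1) included an x**0 column of ones, duplicating
# the bias term the model already learns. Starting the range at 1 drops it.
X = np.stack([x ** p for p in range(1, degree + 1)], axis=1)
print(X.shape)  # (100, 5): powers x**1 through x**5, no constant column
```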