mikeudacity / ai-term2-beta-feedback

Please use the waffleboard for this Repo to provide feedback. Thank you.

CNNs Overall #100

Open initmaks opened 7 years ago


Alexis' videos, "When Do MLPs (Not) Work Well?":

- Why do we need to rescale/normalize the input image data?
- Why do we need to apply dropout here?
- Slide 14, after "Let's specify this model in Keras": it would be good to explain why ReLU is applied to the hidden layers and what the Sequential model stands for. It would also help to show how the total number of parameters grows from 784 inputs to 669,706 (multiplied by the number of hidden neurons, etc.).
- Slide 15: why was rmsprop chosen as the optimizer? This could be explained below the video.
- Slide 16: a beginner will not know what batch size, number of epochs, and verbose are.
- Slide 22: there is no mention that the network shown is LeNet.
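To make the parameter-count question concrete: assuming the model in the video is the usual two-hidden-layer MNIST MLP (Dense 512 -> Dense 512 -> Dense 10; this architecture is my assumption, not confirmed by the slide), the 669,706 figure can be reproduced with simple arithmetic:

```python
# Parameter count for a fully connected MLP on flattened 28x28 MNIST images.
# Assumed architecture: Dense(512) -> Dense(512) -> Dense(10).
# Each Dense layer has (inputs * units) weights plus `units` biases.
def dense_params(n_inputs, n_units):
    return n_inputs * n_units + n_units  # weights + biases

layers = [(784, 512), (512, 512), (512, 10)]
total = sum(dense_params(i, u) for i, u in layers)
print(total)  # 401920 + 262656 + 5130 = 669706
```

A short breakdown like this below the video would show students exactly where the big number comes from.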

The first half of Local Connectivity (how the layers are connected) would fit better somewhere before the "When Do MLPs (Not) Work Well?" video, before the coding starts.

Before the Convolutional Layers video, Luis' video (https://www.youtube.com/watch?v=2-Ol7ZB0MmU) would be a perfect fit: it is much easier to understand and would make the jump in difficulty much smoother.
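It might also help beginners at this point to see what a convolutional layer actually computes, stripped of any framework. A minimal sketch in plain Python (the tiny image and edge-detector kernel are made up for illustration):

```python
def conv2d_valid(image, kernel):
    """'Valid' (no padding) 2-D cross-correlation: slide the kernel over
    the image and take a dot product at each position."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

# A vertical edge detector applied to a tiny dark-to-bright image:
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
k = [[-1, 1],
     [-1, 1]]
print(conv2d_valid(img, k))  # strongest response right at the edge
```

Seeing the loop spelled out makes the later Keras `Conv2D` calls much less mysterious.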

Strides and Padding: this was explained much better in the DLND videos by Vincent, maybe because of the way it was presented with animations, etc.
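Short of re-using the animations, even stating the output-size formula with a few worked numbers would help. A sketch (the 28x28/5x5 numbers are just illustrative):

```python
def conv_output_size(n, f, stride=1, pad=0):
    """Spatial output size of a convolution: floor((n + 2p - f) / s) + 1."""
    return (n + 2 * pad - f) // stride + 1

# 'valid' 5x5 filter on a 28x28 image shrinks the output:
print(conv_output_size(28, 5))                   # 24
# 'same' padding (p = 2 for f = 5) preserves the size:
print(conv_output_size(28, 5, pad=2))            # 28
# stride 2 roughly halves the size:
print(conv_output_size(28, 5, stride=2, pad=2))  # 14
```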

Pooling Layers: nice. Is the Sliding Average part out of date?
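If the section gets revised, a tiny worked example of max pooling could round it out. A sketch in plain Python (the 4x4 feature map is made up):

```python
def max_pool_2x2(feature_map):
    """2x2 max pooling with stride 2: keep the largest value in each
    non-overlapping 2x2 block, halving both spatial dimensions."""
    h, w = len(feature_map), len(feature_map[0])
    return [[max(feature_map[i][j], feature_map[i][j + 1],
                 feature_map[i + 1][j], feature_map[i + 1][j + 1])
             for j in range(0, w, 2)]
            for i in range(0, h, 2)]

fm = [[1, 3, 2, 0],
      [4, 2, 1, 5],
      [0, 1, 6, 2],
      [3, 2, 1, 4]]
print(max_pool_2x2(fm))  # [[4, 5], [3, 6]]
```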

CNNs in Keras: some introduction to the Keras library is needed. Checkpoints are not explained, and neither are batch size, number of epochs, and verbose, as mentioned above.
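For batch size and epochs in particular, a little arithmetic would demystify the `fit` arguments: batch size is the number of samples per gradient update, an epoch is one full pass over the training set, and verbose only controls logging. A sketch with illustrative numbers (60,000 MNIST images, batch size 128 are assumptions, not from the lesson):

```python
import math

def updates_per_epoch(n_samples, batch_size):
    """Number of weight updates in one epoch: one update per mini-batch."""
    return math.ceil(n_samples / batch_size)

# Hypothetical numbers: 60,000 training images, batch size 128, 10 epochs.
n, batch_size, epochs = 60000, 128, 10
per_epoch = updates_per_epoch(n, batch_size)
print(per_epoch)           # 469 mini-batches per epoch
print(per_epoch * epochs)  # 4690 weight updates over the whole run
```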

Image Augmentation in Keras: very nice, but in the end the accuracy didn't improve :) (maybe introduce batch normalization to the model?)
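It could also help to show beginners what augmentation does to a single image before handing everything to Keras' `ImageDataGenerator`. A minimal sketch of one basic transform, a horizontal flip, in plain Python (the toy image is made up):

```python
def horizontal_flip(image):
    """Mirror an image left-to-right -- one of the basic augmentations
    applied on the fly during training to enlarge the effective dataset."""
    return [row[::-1] for row in image]

img = [[1, 2, 3],
       [4, 5, 6]]
print(horizontal_flip(img))  # [[3, 2, 1], [6, 5, 4]]
```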

Visualizing CNNs: it would be nice to provide some code so students can play with it and see what their network learns.

Transfer Learning - Content from SDC :D, it is good enough already.