thechaos16 opened 7 years ago
Standard neural network
Convolutional neural network
LSTM / recurrent neural network
Advanced deep learning features (e.g. PReLU, Batch normalization)
For Batch normalization in Keras, the `activation` parameter of the `Dense` layer should be `linear`; add a `BatchNormalization` layer afterwards, and then add the actual activation (e.g. `PReLU`) at the end.
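A minimal sketch of that ordering, assuming `tf.keras` (layer sizes and input shape are arbitrary placeholders): the `Dense` layer stays linear, `BatchNormalization` normalizes the pre-activations, and the nonlinearity (`PReLU`) is applied last.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, BatchNormalization, PReLU

model = Sequential([
    # Linear Dense layer: no nonlinearity applied here
    Dense(64, activation='linear', input_shape=(32,)),
    # Normalize the pre-activations before the nonlinearity
    BatchNormalization(),
    # Activation comes last, after normalization
    PReLU(),
    Dense(10, activation='softmax'),
])
model.summary()
```

This keeps the normalization operating on the raw linear outputs rather than on already-activated values, which is the ordering described above.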