junxnone / tio


keras mnist analysis #15

Open junxnone opened 5 years ago

junxnone commented 5 years ago

--> #8

https://github.com/keras-team/keras/tree/master/examples

TO-DO

  • [x] mnist_cnn.py
  • [x] mnist_mlp.py
  • [ ] mnist_hierarchical_rnn.py
  • [ ] mnist_acgan.py
  • [ ] mnist_dataset_api.py
  • [ ] mnist_denoising_autoencoder.py
  • [ ] mnist_irnn.py
  • [ ] mnist_net2net.py
  • [ ] mnist_siamese.py
  • [ ] mnist_sklearn_wrapper.py
  • [ ] mnist_swwae.py
  • [ ] mnist_tfrecord.py
  • [ ] mnist_transfer_cnn.py
  • [ ] tensorboard_embeddings_mnist.py
  • [ ] vae_mlp_mnist.h5

Index

junxnone commented 5 years ago

mnist_cnn.py

Trains a simple convnet on the MNIST dataset.
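A minimal sketch of this kind of convnet, assuming the standard `keras.datasets.mnist` loader; layer sizes and hyperparameters here are illustrative rather than copied verbatim from the script.

```python
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dropout, Dense

num_classes = 10

# Load and normalize MNIST, adding a channel dimension for the convolutions
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype('float32') / 255
x_test = x_test.reshape(-1, 28, 28, 1).astype('float32') / 255
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)

model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    Conv2D(64, (3, 3), activation='relu'),
    MaxPooling2D(pool_size=(2, 2)),
    Dropout(0.25),
    Flatten(),
    Dense(128, activation='relu'),
    Dropout(0.5),
    Dense(num_classes, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='adadelta', metrics=['accuracy'])
model.fit(x_train, y_train, batch_size=128, epochs=12, validation_data=(x_test, y_test))
```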

junxnone commented 5 years ago

mnist_mlp.py

Trains a simple deep multi-layer perceptron on the MNIST dataset.
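A hedged sketch of the MLP variant, same data assumptions as above; the 512-unit/Dropout structure mirrors the common example but is not guaranteed to match the script exactly.

```python
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout

# Flatten each 28x28 image to a 784-dimensional vector
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(-1, 784).astype('float32') / 255
x_test = x_test.reshape(-1, 784).astype('float32') / 255
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

model = Sequential([
    Dense(512, activation='relu', input_shape=(784,)),
    Dropout(0.2),
    Dense(512, activation='relu'),
    Dropout(0.2),
    Dense(10, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
model.fit(x_train, y_train, batch_size=128, epochs=20, validation_data=(x_test, y_test))
```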

junxnone commented 5 years ago

mnist_hierarchical_rnn.py

Trains a Hierarchical RNN (HRNN) to classify MNIST digits.

Layers

Input Layer

Input() is used to instantiate a Keras tensor. A Keras tensor is a tensor object from the underlying backend (Theano, TensorFlow or CNTK), augmented with certain attributes that allow us to build a Keras model just by knowing the inputs and outputs of the model.

input_layer.py

Input(shape=None, batch_shape=None, name=None, dtype=None, sparse=False, tensor=None)
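A minimal usage sketch: the tensor returned by Input() is threaded through layers and wrapped into a Model (shapes and names here are illustrative).

```python
from keras.layers import Input, Dense
from keras.models import Model

# Instantiate a Keras tensor for 784-dimensional inputs (flattened MNIST images)
x = Input(shape=(784,))
y = Dense(10, activation='softmax')(x)
model = Model(inputs=x, outputs=y)
```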

LSTM

Long Short-Term Memory layer - Hochreiter 1997. recurrent.py

keras.layers.LSTM(units, activation='tanh', recurrent_activation='hard_sigmoid', use_bias=True, kernel_initializer='glorot_uniform', recurrent_initializer='orthogonal', bias_initializer='zeros', unit_forget_bias=True, kernel_regularizer=None, recurrent_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, recurrent_constraint=None, bias_constraint=None, dropout=0.0, recurrent_dropout=0.0, implementation=1, return_sequences=False, return_state=False, go_backwards=False, stateful=False, unroll=False)

https://blog.csdn.net/jiangpeng59/article/details/77646186 http://deeplearning.net/tutorial/lstm.html
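A small usage sketch, assuming one MNIST row (28 pixels) per timestep; with the default return_sequences=False only the final hidden state is returned.

```python
from keras.layers import Input, LSTM

# (timesteps, features): e.g. a 28-step sequence of 28-pixel rows
row_input = Input(shape=(28, 28))
# return_sequences=False (default): only the last hidden state comes back
encoded = LSTM(128)(row_input)   # shape: (batch, 128)
```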

TimeDistributed

This wrapper applies a layer to every temporal slice of an input. The input should be at least 3D, and the dimension at index one is treated as the temporal dimension.

wrappers.py

keras.layers.TimeDistributed(layer)

https://blog.csdn.net/oQiCheng1234567/article/details/73051251
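A small usage sketch in the HRNN spirit: the same row-level LSTM is applied to every row of a 28x28x1 image (the axis after the batch axis is treated as time). Hidden size is illustrative.

```python
from keras.layers import Input, TimeDistributed, LSTM

# Treat each 28x28x1 MNIST image as a sequence of 28 rows of 28 pixels
image_input = Input(shape=(28, 28, 1))
# The same row-level LSTM is applied independently to every row
encoded_rows = TimeDistributed(LSTM(64))(image_input)  # shape: (batch, 28, 64)
```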

Dense

Just your regular densely-connected (fully connected) layer. Dense implements the operation: output = activation(dot(input, kernel) + bias), where activation is the element-wise activation function, kernel is a weights matrix created by the layer, and bias is a bias vector created by the layer (only relevant if use_bias is True).

keras.layers.Dense(units, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None)
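Putting the layers above together, a hedged sketch of how the hierarchical RNN assembles them: a TimeDistributed row-level LSTM encodes each row, a column-level LSTM encodes the sequence of row encodings, and a Dense softmax classifies the digit. Hidden sizes are illustrative.

```python
from keras.layers import Input, LSTM, TimeDistributed, Dense
from keras.models import Model

row_hidden, col_hidden, num_classes = 128, 128, 10

# Each image is a (rows, cols, channels) "sequence of rows of pixels"
x = Input(shape=(28, 28, 1))
encoded_rows = TimeDistributed(LSTM(row_hidden))(x)   # encode every row
encoded_cols = LSTM(col_hidden)(encoded_rows)         # encode the sequence of row encodings
prediction = Dense(num_classes, activation='softmax')(encoded_cols)

model = Model(inputs=x, outputs=prediction)
model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
```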

Reference

junxnone commented 5 years ago

mnist_acgan.py

Train an Auxiliary Classifier Generative Adversarial Network (ACGAN) on the MNIST dataset.
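A hedged, much-reduced sketch of the ACGAN structure: the generator conditions noise on a class label via an embedding, and the discriminator has two heads, a real/fake output plus an auxiliary class output. Filter counts, strides and the training loop are illustrative, not the script's exact values.

```python
from keras.layers import (Input, Dense, Reshape, Flatten, Embedding, multiply,
                          Conv2D, Conv2DTranspose, LeakyReLU)
from keras.models import Model

latent_dim, num_classes = 100, 10

# Generator: noise vector conditioned on a class label via an embedding
noise = Input(shape=(latent_dim,))
label = Input(shape=(1,), dtype='int32')
label_embedding = Flatten()(Embedding(num_classes, latent_dim)(label))
h = multiply([noise, label_embedding])
h = Dense(7 * 7 * 128, activation='relu')(h)
h = Reshape((7, 7, 128))(h)
h = Conv2DTranspose(64, 5, strides=2, padding='same', activation='relu')(h)
fake_image = Conv2DTranspose(1, 5, strides=2, padding='same', activation='tanh')(h)
generator = Model([noise, label], fake_image)

# Discriminator: one real/fake head and one auxiliary class head
image = Input(shape=(28, 28, 1))
d = Conv2D(32, 3, strides=2, padding='same')(image)
d = LeakyReLU(0.2)(d)
d = Conv2D(64, 3, strides=2, padding='same')(d)
d = LeakyReLU(0.2)(d)
d = Flatten()(d)
validity = Dense(1, activation='sigmoid', name='validity')(d)
aux_class = Dense(num_classes, activation='softmax', name='aux_class')(d)
discriminator = Model(image, [validity, aux_class])
discriminator.compile(optimizer='adam',
                      loss=['binary_crossentropy', 'sparse_categorical_crossentropy'])
```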

Reference

junxnone commented 5 years ago

mnist_siamese.py

Trains a Siamese MLP on pairs of digits from the MNIST dataset.
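A hedged sketch of the Siamese setup: one shared base MLP embeds both inputs, a Lambda layer computes the Euclidean distance between the embeddings, and a contrastive loss pulls same-digit pairs together while pushing different-digit pairs apart. Pair generation is omitted; sizes are illustrative.

```python
from keras.layers import Input, Dense, Dropout, Lambda
from keras.models import Model, Sequential
import keras.backend as K

def euclidean_distance(vects):
    x, y = vects
    return K.sqrt(K.maximum(K.sum(K.square(x - y), axis=1, keepdims=True), K.epsilon()))

def contrastive_loss(y_true, y_pred, margin=1.0):
    # y_true = 1 for pairs of the same digit, 0 for different digits
    return K.mean(y_true * K.square(y_pred) +
                  (1 - y_true) * K.square(K.maximum(margin - y_pred, 0)))

# Shared base MLP: the same weights embed both inputs
base = Sequential([
    Dense(128, activation='relu', input_shape=(784,)),
    Dropout(0.1),
    Dense(128, activation='relu'),
])

input_a = Input(shape=(784,))
input_b = Input(shape=(784,))
distance = Lambda(euclidean_distance)([base(input_a), base(input_b)])

model = Model([input_a, input_b], distance)
model.compile(loss=contrastive_loss, optimizer='rmsprop')
```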

References

junxnone commented 5 years ago

mnist_dataset_api.py

MNIST classification with TensorFlow's Dataset API.
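A hedged sketch using tf.data with tf.keras (rather than the script's exact iterator plumbing): build a Dataset from the in-memory arrays, batch and repeat it, and pass it to model.fit with steps_per_epoch.

```python
import tensorflow as tf

# Build a tf.data pipeline over the in-memory MNIST arrays
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype('float32') / 255
y_train = y_train.astype('int32')

dataset = (tf.data.Dataset.from_tensor_slices((x_train, y_train))
           .shuffle(10000)
           .batch(128)
           .repeat())

model = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation='relu', input_shape=(784,)),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='rmsprop',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
# steps_per_epoch is required when fitting on an infinitely repeating Dataset
model.fit(dataset, epochs=5, steps_per_epoch=len(x_train) // 128)
```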

Layers

Reference

junxnone commented 5 years ago

mnist_swwae.py

Trains a stacked what-where autoencoder built on residual blocks on the MNIST dataset.

References

junxnone commented 5 years ago

mnist_transfer_cnn.py

Transfer learning toy example. 1 - Train a simple convnet on the MNIST dataset for the first five digits [0..4]. 2 - Freeze the convolutional layers and fine-tune the dense layers for classification of digits [5..9].
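A hedged sketch of the freezing step: the convolutional "feature" layers and dense "classification" layers are kept in separate lists so the former can be frozen before the second training stage. Layer sizes follow the common example but are illustrative.

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dropout, Dense, Activation

num_classes = 5  # five digits per stage: [0..4] first, then [5..9]

# Feature (convolutional) layers, shared between both stages
feature_layers = [
    Conv2D(32, (3, 3), padding='valid', input_shape=(28, 28, 1)),
    Activation('relu'),
    Conv2D(32, (3, 3)),
    Activation('relu'),
    MaxPooling2D(pool_size=(2, 2)),
    Dropout(0.25),
    Flatten(),
]

# Classification (dense) layers, re-trained for the second set of digits
classification_layers = [
    Dense(128),
    Activation('relu'),
    Dropout(0.5),
    Dense(num_classes),
    Activation('softmax'),
]

model = Sequential(feature_layers + classification_layers)

# Stage 1: train the whole model on digits [0..4] (not shown), then
# Stage 2: freeze the convolutional layers and fine-tune the dense layers on [5..9]
for layer in feature_layers:
    layer.trainable = False
model.compile(loss='categorical_crossentropy', optimizer='adadelta', metrics=['accuracy'])
```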

Layers