ssaru opened this issue 5 years ago
Implement Binarized MLP (MNIST); a sketch follows this checklist
3 hidden layers of 1024 Rectified Linear Units (ReLU) (v)
L2-SVM output layer (changed to add one more layer) (v)
Square hinge loss (replaced with F.cross_entropy) (v)
SGD without momentum (v)
Batch Normalization (v)
No data augmentation, preprocessing, or unsupervised pretraining (v)
minibatch of size 200 (v)
exponential learning rate decay (v)
Early stopping (v)
Dropout (optional) (v)
regularization (optional) (v)
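
A minimal PyTorch sketch of the MLP checklist above, assuming a straight-through estimator (STE) for the sign() binarization as in Courbariaux et al.; the names (`BinarizeSTE`, `BinaryLinear`, `BinarizedMLP`) and hyperparameters such as `gamma=0.97` are illustrative assumptions, not code from this repo:

```python
# A minimal sketch, assuming the straight-through estimator (STE) of
# Courbariaux et al.; names and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinarizeSTE(torch.autograd.Function):
    """sign() binarization on the forward pass; straight-through gradient."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass the gradient through where |x| <= 1, cancel it elsewhere.
        return grad_output * (x.abs() <= 1).float()


class BinaryLinear(nn.Linear):
    """Linear layer whose real-valued weights are binarized on the fly."""

    def forward(self, x):
        return F.linear(x, BinarizeSTE.apply(self.weight), self.bias)


class BinarizedMLP(nn.Module):
    """3 hidden layers of 1024 units with BatchNorm and ReLU, 10-way output."""

    def __init__(self, in_features=784, hidden=1024, n_classes=10):
        super().__init__()
        layers = []
        for i in range(3):
            layers += [
                BinaryLinear(in_features if i == 0 else hidden, hidden),
                nn.BatchNorm1d(hidden),
                nn.ReLU(),
            ]
        layers.append(BinaryLinear(hidden, n_classes))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x.flatten(1))


model = BinarizedMLP()
# SGD without momentum; exponential learning-rate decay (gamma is assumed).
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.0)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.97)

# One illustrative training step on a random minibatch of size 200.
x, y = torch.randn(200, 784), torch.randint(0, 10, (200,))
loss = F.cross_entropy(model(x), y)  # stands in for the square hinge loss
optimizer.zero_grad()
loss.backward()
optimizer.step()
scheduler.step()  # in practice, step the scheduler once per epoch
```

Note that the optimizer updates the real-valued weights; only the forward pass sees the binarized copies, which is what the STE trick relies on.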
Implement Binarized CNN (CIFAR-10), (2×128C3)−MP2−(2×256C3)−MP2−(2×512C3)−MP2−(2×1024FC)−10SVM; see the sketch after this list (v)
Add ResNet and a plain CNN for benchmarking
Loss Graph
Weight histogram
Weight or Activation Visualization
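
A matching sketch of the (2×128C3)−MP2−(2×256C3)−MP2−(2×512C3)−MP2−(2×1024FC)−10SVM architecture, reusing `BinarizeSTE` and `BinaryLinear` from the MLP sketch above; `BinaryConv2d` and `conv_pool_block` are assumed helper names, and the 10-way head is trained with F.cross_entropy here instead of the paper's L2-SVM square hinge loss, per the checklist:

```python
# A minimal sketch of the CIFAR-10 architecture; reuses BinarizeSTE and
# BinaryLinear from the MLP sketch above. Names here are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinaryConv2d(nn.Conv2d):
    """Conv layer whose real-valued weights are binarized on the fly."""

    def forward(self, x):
        return F.conv2d(x, BinarizeSTE.apply(self.weight), self.bias,
                        self.stride, self.padding, self.dilation, self.groups)


def conv_pool_block(in_ch, out_ch):
    # One "2 x C3" pair of 3x3 binarized convolutions, then 2x2 max pooling.
    return nn.Sequential(
        BinaryConv2d(in_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch), nn.ReLU(),
        BinaryConv2d(out_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch), nn.ReLU(),
        nn.MaxPool2d(2),
    )


class BinarizedCNN(nn.Module):
    """(2x128C3)-MP2-(2x256C3)-MP2-(2x512C3)-MP2-(2x1024FC)-10 outputs."""

    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            conv_pool_block(3, 128),    # 32x32 -> 16x16
            conv_pool_block(128, 256),  # 16x16 -> 8x8
            conv_pool_block(256, 512),  # 8x8   -> 4x4
        )
        self.classifier = nn.Sequential(
            BinaryLinear(512 * 4 * 4, 1024), nn.BatchNorm1d(1024), nn.ReLU(),
            BinaryLinear(1024, 1024), nn.BatchNorm1d(1024), nn.ReLU(),
            # 10-way head; trained with F.cross_entropy instead of the
            # paper's L2-SVM square hinge loss, per the checklist above.
            BinaryLinear(1024, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


logits = BinarizedCNN()(torch.randn(2, 3, 32, 32))  # shape: (2, 10)
```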