DeepBaksuVision / BinaryConnect


[MH/BinaryConnect] Code integration #39

Open ssaru opened 5 years ago

ssaru commented 4 years ago
  1. Implement Binarized MLP (MNIST)

    • 3 hidden layers of 1024 Rectifier Linear Units (ReLU) (v)

    • L2-SVM output layer (changed to add one more layer) (v)

    • square hinge loss (replaced with F.cross_entropy) (v)

    • SGD without momentum (v)

    • Batch Normalization (v)

    • No data augmentation, preprocessing, or unsupervised pretraining (v)

    • minibatch of size 200 (v)

    • exponential learning rate decay (v)

    • early stopping (v)

    • Dropout (optional) (v)

    • regularization (optional) (v)
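The MLP checklist above can be sketched roughly as follows. This is a minimal sketch, not the repo's actual code: layer sizes, BatchNorm, Dropout, SGD without momentum, exponential LR decay, and the F.cross_entropy head follow the bullets, while the weight-binarization step itself is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MnistMLP(nn.Module):
    """MLP per the checklist: 3 hidden layers of 1024 ReLU units,
    Batch Normalization, optional Dropout, 10-way output."""
    def __init__(self, p_drop: float = 0.5):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 1024), nn.BatchNorm1d(1024), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(1024, 1024), nn.BatchNorm1d(1024), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(1024, 1024), nn.BatchNorm1d(1024), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(1024, 10),  # cross-entropy head (square hinge loss was replaced)
        )

    def forward(self, x):
        return self.layers(x)

model = MnistMLP()
# SGD without momentum, with exponential learning rate decay, per the checklist
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95)

x = torch.randn(200, 1, 28, 28)   # minibatch of size 200
y = torch.randint(0, 10, (200,))
loss = F.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
scheduler.step()
```

Early stopping would wrap this in a validation loop that halts when the validation loss stops improving.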

  2. Implement Binarized CNN (CIFAR-10) - (2×128C3)−MP2−(2×256C3)−MP2−(2×512C3)−MP2−(2×1024FC)−10SVM (v)

    • C3 is a 3 × 3 ReLU convolution layer (v)
    • MP2 is a 2 × 2 max-pooling layer (v)
    • FC is a fully connected layer (v)
    • preprocessing: global contrast normalization and ZCA whitening (replaced with normalize)
    • No data augmentation (v)
    • square hinge loss (replaced with cross-entropy)
    • ADAM (v)
    • regularization (optional) (v)
    • Batch Normalization (v)
    • mini-batch size is 50 (v)
    • Optimizer: ADAM, Nesterov momentum, SGD (optional) (v)
    • learning rate scheduler (optional) (v)
    • Dropout (optional) (v)
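The (2×128C3)−MP2−(2×256C3)−MP2−(2×512C3)−MP2−(2×1024FC)−10 architecture string can be sketched as below. Again a minimal sketch under the checklist's assumptions (BatchNorm, ReLU convs, ADAM, cross-entropy head instead of the SVM output); the binarization of weights is omitted, and the `conv_block` helper is purely illustrative.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # C3: 3x3 ReLU convolution, here with Batch Normalization per the checklist
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(),
    )

class Cifar10CNN(nn.Module):
    """(2x128C3)-MP2-(2x256C3)-MP2-(2x512C3)-MP2-(2x1024FC)-10 per the checklist."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            conv_block(3, 128), conv_block(128, 128), nn.MaxPool2d(2),    # 32x32 -> 16x16
            conv_block(128, 256), conv_block(256, 256), nn.MaxPool2d(2),  # 16x16 -> 8x8
            conv_block(256, 512), conv_block(512, 512), nn.MaxPool2d(2),  # 8x8 -> 4x4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512 * 4 * 4, 1024), nn.ReLU(),
            nn.Linear(1024, 1024), nn.ReLU(),
            nn.Linear(1024, 10),  # cross-entropy head (10SVM output was replaced)
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = Cifar10CNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # ADAM per the checklist
out = model(torch.randn(2, 3, 32, 32))  # CIFAR-10-shaped input; training would use mini-batches of 50
```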
  3. Add ResNet and CNN for benchmarking

    • Dropout (optional)
    • Batch Normalization (optional)
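A benchmark ResNet could be as simple as the residual-block sketch below. This is an assumption about what "benchmark ResNet" means here, not the repo's code; `BasicBlock` and `TinyResNet` are hypothetical names, and a real run might instead use a standard torchvision model.

```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    """Residual block: two 3x3 convs with BatchNorm and a skip connection."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.shortcut = nn.Sequential()  # identity unless shape changes
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + self.shortcut(x))

class TinyResNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1, bias=False),
                                  nn.BatchNorm2d(64), nn.ReLU())
        self.blocks = nn.Sequential(BasicBlock(64, 64),
                                    BasicBlock(64, 128, stride=2),
                                    BasicBlock(128, 256, stride=2))
        self.head = nn.Linear(256, num_classes)

    def forward(self, x):
        x = self.blocks(self.stem(x))
        x = torch.flatten(nn.functional.adaptive_avg_pool2d(x, 1), 1)
        return self.head(x)

net = TinyResNet()
logits = net(torch.randn(2, 3, 32, 32))
```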
  4. Loss Graph

  5. Weight histogram

  6. Weight or Activation Visualization
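Items 4 and 5 could be produced along these lines, assuming matplotlib is available. The model here is a hypothetical stand-in for the trained network, and the loss values are dummy data purely for illustration.

```python
import numpy as np
import torch
import torch.nn as nn
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Hypothetical small model standing in for the trained network
model = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

# 4. Loss graph: losses would normally be collected during training
losses = [2.3 * (0.9 ** i) for i in range(20)]  # dummy curve for illustration
plt.figure()
plt.plot(losses)
plt.xlabel("epoch")
plt.ylabel("loss")
plt.savefig("loss_graph.png")

# 5. Weight histogram: flatten all weight tensors into one array
weights = np.concatenate(
    [p.detach().numpy().ravel() for n, p in model.named_parameters() if "weight" in n]
)
plt.figure()
plt.hist(weights, bins=100)
plt.xlabel("weight value")
plt.savefig("weight_histogram.png")
```

For item 6, the same pattern extends to per-layer weight images or activation maps captured with forward hooks.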