jaentrouble / mouse_chaser_train_1

train mouse detection model
GNU General Public License v3.0

efficient_hrnet #2

Open jaentrouble opened 3 years ago

jaentrouble commented 3 years ago

Train with efficient_hrnet

jaentrouble commented 3 years ago
  1. ehr11211_1

    • lr: lr_step7_2

    • image_size = (256,384)

    • epoch 100 steps 500

    • batch size 32
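
The schedules named `lr_step7_2` (and later `lr_step8_3`, `lr_step9`) are not defined anywhere in this thread. As a rough illustration only, a step schedule drops the learning rate at fixed epoch boundaries; the boundaries and values below are placeholders, not the repo's actual definition:

```python
# Hypothetical piecewise-constant learning-rate schedule, in the spirit of
# the "lr_step*" names used in this thread. Boundaries/values are assumptions.
def lr_step(epoch, boundaries=(30, 60, 90), values=(1e-3, 1e-4, 1e-5, 1e-6)):
    """Return the learning rate for `epoch` under a step schedule."""
    for boundary, value in zip(boundaries, values):
        if epoch < boundary:
            return value
    return values[-1]
```

Such a function can be plugged into a Keras `LearningRateScheduler` callback or called manually per epoch.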

jaentrouble commented 3 years ago
  2. ehr11211_2

    • lr: lr_step7_2

    • changed BN_MOMENTUM to 0.9

    • epoch 100 steps 500

    • batch size 32

------------- Abort at epoch 74
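
For context on the BN_MOMENTUM change above: batch normalization keeps its running mean/variance as an exponential moving average, so lowering the momentum (Keras defaults to 0.99) makes the running statistics track recent batches faster. A minimal sketch of the update rule:

```python
def update_running_stat(running, batch_stat, momentum=0.9):
    # Exponential moving average used by batch norm for its running
    # mean/variance. Smaller momentum -> faster tracking of recent batches,
    # which is the effect of changing BN_MOMENTUM from the default to 0.9.
    return momentum * running + (1.0 - momentum) * batch_stat
```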

jaentrouble commented 3 years ago
  3. ~mobv3_small~ ehr11211_3

    • lr: lr_step7_2

    • Change to MobileNetV3Small

    • batch size 32

    • epoch 70 steps 500

EDIT: IT WAS NOT MOBV3!!

jaentrouble commented 3 years ago
  4. ~mobv3_small_block~ ehr11211_block

    • lr: lr_step7_2

    • Train with mini data to detect a block only

      • 0.pck for train, 1.pck for val (~300 for train, ~180 for val)

      • Need more data anyway, so see how it does with this little data

    • epoch 50 steps 500

EDIT: IT WAS NOT MOBV3!!
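
The `.pck` shards above (`0.pck`, `1.pck`, later `4.pck`) are presumably pickled data files; a minimal loader sketch, with the internal layout of each shard left as an assumption:

```python
import pickle

def load_shard(path):
    """Load one pickled data shard, e.g. '0.pck' for train or '1.pck' for val.
    The structure of the unpickled object is repo-specific and assumed here."""
    with open(path, 'rb') as f:
        return pickle.load(f)
```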

jaentrouble commented 3 years ago
  5. mobv3_small_block_2

    • lr: lr_step7_2

    • More data

      • 4.pck for val (2448 for train, 157 for val)
    • epoch 50 steps 500

    • No augmentation other than flipping for now (will add more next time if it seems essential)
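
Flip augmentation on keypoint data has to mirror the target coordinate along with the image; a minimal NumPy sketch (the `(x, y)` pixel-coordinate convention is an assumption):

```python
import numpy as np

def hflip(image, keypoint):
    """Horizontally flip an HxWxC image and mirror the keypoint's x coordinate.
    `keypoint` is (x, y) in pixel coordinates (assumed convention)."""
    _, w = image.shape[:2]
    flipped = image[:, ::-1].copy()
    x, y = keypoint
    return flipped, (w - 1 - x, y)
```

In a pipeline this would be applied with probability 0.5 per sample.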

jaentrouble commented 3 years ago
  6. mobv3_small_block_3

    • lr: lr_step7_2

    • Add shift/rotate augmentation

    • Add parallel generator

    • batch size 128

    • epoch 100 steps 500
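
For the shift/rotate augmentation, the keypoint must be transformed together with the image. A sketch of the point-side math, rotating a keypoint about the image center (pure geometry; a later entry caps the angle at 10 degrees):

```python
import math

def rotate_point(x, y, cx, cy, degrees):
    """Rotate (x, y) about the center (cx, cy) by `degrees` counter-clockwise.
    The image itself would be rotated by the matching transform."""
    rad = math.radians(degrees)
    dx, dy = x - cx, y - cy
    return (cx + dx * math.cos(rad) - dy * math.sin(rad),
            cy + dx * math.sin(rad) + dy * math.cos(rad))
```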

jaentrouble commented 3 years ago
  7. mobv3_small_head

    • lr: lr_step7_2

    • Same as mobv3_small_block_3, but instead of block, track head

    • To see whether the poor detection came from the block's shape or from the model's size

    • epoch 100 steps 500

jaentrouble commented 3 years ago
  8. mobv3_small_head_2

    • lr: lr_step7_2

    • Mix in data from before (which also has head annotations)

    • Add color augmentation

    • epoch 100, steps 500
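
A minimal sketch of one possible color augmentation step, random brightness jitter; the jitter range is an assumption, and images are assumed to be floats in [0, 1]:

```python
import numpy as np

def random_brightness(image, rng, max_delta=0.2):
    """Add a uniform random brightness offset and clip back into [0, 1].
    `max_delta` is a placeholder, not the repo's actual setting."""
    delta = rng.uniform(-max_delta, max_delta)
    return np.clip(image + delta, 0.0, 1.0)
```

Hue/contrast jitter would follow the same pattern.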

jaentrouble commented 3 years ago
  9. mobv3_small_07_head

    • lr: lr_step7_2

    • same data config as 8

    • phi = -2

      • resolution : (288,224)

      • backbone: mobv3_small_07

    • batch size 196

    • epoch 100, steps 327

jaentrouble commented 3 years ago

Quant-aware model
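
Quantization-aware training simulates int8 quantization in the forward pass ("fake quantization": quantize, clamp, then immediately dequantize) so the weights adapt to the rounding and clipping error before conversion. A scalar sketch of that op:

```python
def fake_quantize(x, scale, zero_point=0, qmin=-128, qmax=127):
    """Simulate int8 quantization of a float value: round to the integer
    grid, clamp to the int8 range, then map back to float. This is the
    forward-pass op inserted during quantization-aware training."""
    q = round(x / scale) + zero_point
    q = max(qmin, min(qmax, q))
    return (q - zero_point) * scale
```

In practice this is handled by a framework (e.g. the TensorFlow Model Optimization toolkit) rather than written by hand.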

jaentrouble commented 3 years ago
  10. mobv3_small_07_head_q
    • lr: lr_step7_2
    • added data (arduino)
    • other things same as 9
    • batch size 512
    • epoch 100, steps 125

Result: Bad. Try a different learning rate.

jaentrouble commented 3 years ago
  11. mobv3_small_07_head_q2
    • lr: lr_step8_3
    • Try lower learning rate
    • epoch 50, steps 125
    • batch size 512
    • Lowered rotate limit to 10 degrees
    • other things unchanged

Abort at epoch 36. Result: Worse than 10. Try a higher learning rate. Maybe the model is too small?

jaentrouble commented 3 years ago
  12. mobv3_small_07_head_q3
    • lr: lr_step9
    • Try higher learning rate
    • epoch 50, steps 125
    • batch size 512
    • Other things unchanged
jaentrouble commented 3 years ago

Notes

jaentrouble commented 3 years ago
  13. mobv3_small_07_head_q3q
    • Quantization-aware fine tuning
    • lr: lr_step7_2
    • epoch 20, steps 200
    • batch size 256
    • Load from mobv3_small_07_head_q3/50

Error: Dynamic-range quantization seems to fail. Try again with full-integer quantization (with sample data)
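
Full-integer quantization needs sample data to calibrate activation ranges: the TFLite converter expects a generator that yields one float32 batch per step, each wrapped in a single-element list. A sketch of such a representative-dataset generator (names and shapes are assumptions):

```python
import numpy as np

def representative_dataset(images, num_samples=100):
    """Yield calibration samples for full-integer TFLite conversion:
    one float32 batch of size 1 per step, as a single-element list."""
    for img in images[:num_samples]:
        yield [np.expand_dims(img.astype(np.float32), axis=0)]
```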

jaentrouble commented 3 years ago

Reshape has problems. Workaround: Use concrete function instead.

  14. mobv3_small_07_head_q3q3 (q2 was aborted)

    • Quantization-aware fine tuning with full integer (sample data)
    • qload from mobv3_small_07_head_q3q/20 (implemented qload: load weights after applying the quantization-aware wrapper)
  15. mobv3_small_07_head_q3q4

    • Quantization-aware fine tuning, quantization without sample data
    • Quantization with sample data returns an error
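
For the concrete-function workaround mentioned above, a hedged sketch of one possible conversion path: trace the Keras model through `tf.function` and convert the resulting concrete function instead of using the direct Keras conversion path. The stand-in model, input shape, and flags below are all assumptions, not the repo's actual code:

```python
import tensorflow as tf

# Tiny stand-in model with a Reshape layer; in the thread this would be
# the trained network. H/W ordering of the (288, 224) resolution is assumed.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(224, 288, 3)),
    tf.keras.layers.Conv2D(4, 3, padding='same'),
    tf.keras.layers.Reshape((224 * 288, 4)),
])

# Trace through tf.function and convert the concrete function, working
# around conversion problems seen with Reshape on the direct path.
concrete_fn = tf.function(lambda x: model(x)).get_concrete_function(
    tf.TensorSpec([1, 224, 288, 3], tf.float32))

converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_fn])
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
```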
jaentrouble commented 3 years ago

---------------------- 4 Chamber ----------------------

jaentrouble commented 3 years ago
  16. mobv3_small_07_head_4cham_0
    • New data (4chamber test + 3led) -> about 8000 frames
    • ~load mobv3_small_07_head_q3/50~ The savefile is from 'before' fixing the error! Retrain from scratch
    • lr: 1e-5 (low_lr)
    • batch 196
    • epoch 100
    • steps 500

Result

jaentrouble commented 3 years ago
  17. mobv3_small_07_head_4cham_0q0