uark-cviu / Micron-BERT

[CVPR 2023] Micron-BERT: BERT-based Facial Micro-Expression Recognition

Can the author give the issues in this repo an answer? #13

Open HZIH opened 11 months ago

HZIH commented 11 months ago

I have the following 5 problems:

  1. Can the results in the paper be reproduced?
  2. In your paper, you mention using an MSE loss to constrain the POI; however, I find that the code uses a DINO loss instead.
  3. The pretrained model you provided is trained on CASME2, whereas your paper says the model is pretrained on CASME3. Moreover, the pretrained model cannot produce a feature map that localizes the micro-expression area, and it does not use the DMA module.
  4. You haven't provided the face preprocessing method, which may result in bad performance.
  5. There are lots of open issues in this repo; can you answer them?

If you are worried about misuse of your pretrained model, then at least share the other parts of the code or answer our questions; don't bury your head in the sand like an ostrich.

HZIH commented 11 months ago

The following are the args saved with the provided pretrained model:

```
Namespace(att_loss=False, aux_cls=False, batch_size_per_gpu=16, casme2_path='data/CASME2-RAW/', dataset='CASME2', decoder_depth=4, decoder_embed_dim=512, decoder_num_heads=4, depth=4, diag_att=False, dist_url='env://', distributed=True, drop_rate=0.0, ema_decay=0.997, embed_dim=512, enable_dino=True, epochs=200, gpu=0, has_decoder=True, img_size=224, local_crops_number=8, local_rank=0, lr=0.00025, min_lr=0.0, model_name='mae_vit_small_patch8', num_heads=4, num_workers=4, out_dim=128, output_dir='logs/mae_vit_small_patch8/CASME2-is224-p8-b16-ep200-0.5-mae-dino-2', patch_size=8, pretrained_checkpoint='', pretrained_encoder='dummy', rank=0, replace_ratio=0.5, resume='', samm_path='data/SAMM/', saveckp_freq=5, scheduler='cosine', seed=216, segmentation=False, teacher_temp=0.04, use_ema=False, use_fp16=False, warmup_teacher_temp=0.04, warmup_teacher_temp_epochs=0, weight_decay=0.0001, world_size=16)
```
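To make the discrepancy concrete, here is a minimal sketch that reconstructs the relevant subset of those flags as an `argparse.Namespace` (values copied verbatim from the dump above; this is an illustrative check, not code from the repo) and verifies the points raised: the checkpoint was trained on CASME2 with the DINO loss enabled and the attention loss disabled.

```python
import argparse

# Subset of the args dump shipped with the provided checkpoint
# (values copied from the Namespace printed above).
args = argparse.Namespace(
    dataset="CASME2",                    # paper says pretrained on CASME3
    casme2_path="data/CASME2-RAW/",
    model_name="mae_vit_small_patch8",
    img_size=224,
    patch_size=8,
    enable_dino=True,                    # DINO loss is switched on ...
    att_loss=False,                      # ... while the MSE/attention loss is off
    diag_att=False,
    has_decoder=True,
    replace_ratio=0.5,
)

# These flags back up the observations in the issue:
assert args.dataset == "CASME2"
assert args.enable_dino and not args.att_loss
```

If the authors release the checkpoint actually pretrained on CASME3, one would expect `dataset` and the loss-related flags in its saved args to differ from the values above.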