facebookresearch / msn

Masked Siamese Networks for Label-Efficient Learning (https://arxiv.org/abs/2204.07141)

Can you help add vit-small-8 and vit-base-4 config files? #5

Open ywdong opened 2 years ago

ywdong commented 2 years ago

Can you help add vit-small-8 and vit-base-4 config files?

MidoAssran commented 2 years ago

Hi @ywdong,

Here is the config for ViT-B/4, which can be run on 48 AWS p4d.24xlarge machines. We don't have a ViT-S/8 run with the current version of MSN, but we will release all the configs shortly and let you know once they're available!

```yaml
criterion:
  ent_weight: 0.0
  final_sharpen: 0.25      # target-sharpening temperature at the end of training
  me_max: true             # mean-entropy-maximization regularizer
  memax_weight: 1.0
  num_proto: 1024          # number of cluster prototypes
  start_sharpen: 0.25
  temperature: 0.1         # temperature for anchor predictions
  batch_size: 3
  use_ent: true
  use_sinkhorn: true       # Sinkhorn normalization of target assignments
data:
  color_jitter_strength: 0.5
  pin_mem: false
  num_workers: 0
  image_folder: imagenet_full_size/061417/
  label_smoothing: 0.0
  patch_drop: 0.7          # fraction of anchor-view patches randomly masked
  rand_size: 224           # resolution of random (global) crops
  focal_size: 96           # resolution of focal (local) crops
  rand_views: 1
  focal_views: 10
  root_path: /datasets/
logging:
  folder: /path_to_save_vitb4_logs/
  write_tag: msn
meta:
  bottleneck: 1
  copy_data: false
  drop_path_rate: 0.0
  hidden_dim: 2048
  load_checkpoint: false
  model_name: deit_base_p4   # ViT-Base with 4x4 patches
  output_dim: 256
  read_checkpoint: null
  use_bn: true
  use_fp16: false
  use_pred_head: false
optimization:
  clip_grad: 3.0
  epochs: 400
  final_lr: 1.0e-06
  final_weight_decay: 0.4
  lr: 0.001
  start_lr: 0.0002
  warmup: 15               # epochs of linear learning-rate warmup
  weight_decay: 0.04       # scheduled up to final_weight_decay over training
```
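
For reference, `patch_drop: 0.7` means 70% of the anchor view's patch tokens are randomly masked before the encoder. Below is a minimal sketch of that masking step (an illustration with made-up names, not the repo's implementation), assuming the image has already been patchified and embedded:

```python
# Minimal sketch of random patch dropping as controlled by `patch_drop: 0.7`:
# keep a random 30% of the anchor view's patch tokens before the ViT encoder.
# (Illustrative only; `drop_patches` is not a function from the msn repo.)
import torch

def drop_patches(patches: torch.Tensor, patch_drop: float = 0.7) -> torch.Tensor:
    """patches: [batch, num_patches, embed_dim] sequence of patch tokens."""
    B, N, D = patches.shape
    keep = max(1, int(N * (1.0 - patch_drop)))
    # Independent random permutation of patch indices for each image,
    # truncated to the first `keep` positions.
    idx = torch.rand(B, N, device=patches.device).argsort(dim=1)[:, :keep]
    return torch.gather(patches, 1, idx.unsqueeze(-1).expand(B, keep, D))

# ViT-B/4 at 224px: (224 / 4)**2 = 3136 patch tokens, embed dim 768.
tokens = torch.randn(8, 3136, 768)
print(drop_patches(tokens).shape)  # torch.Size([8, 940, 768])
```

The small patch size is why this run is so expensive: at 224x224, patch size 4 yields 3136 tokens per view versus 196 for ViT-B/16, so even after dropping 70% the anchor branch still processes ~940 tokens. Assuming `batch_size: 3` is the per-GPU batch (each p4d.24xlarge has 8 A100 GPUs), the global batch size would be 3 x 8 x 48 = 1152.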