Hao840 / OFAKD

PyTorch code and checkpoint release for OFA-KD: https://arxiv.org/abs/2310.19444

Requirements for log in ImageNet #15

Closed: liguopeng0923 closed this issue 6 months ago

liguopeng0923 commented 6 months ago

Hi @Hao840,

Could you share the log file and args.yaml from the distillation run on ImageNet (T: DeiT-T, S: ResNet18)?

Hao840 commented 6 months ago

The log can be found in the released files, and args.yaml is as follows:

aa: null
amp: true
apex_amp: false
aug_repeats: 0
aug_splits: 0
batch_size: 64
bce_loss: false
bce_target_thresh: null
bn_eps: null
bn_momentum: null
channels_last: false
checkpoint_hist: 10
class_map: ''
clip_grad: null
clip_mode: norm
color_jitter: 0.0
cooldown_epochs: 0
crop_pct: null
cutmix: 0.0
cutmix_minmax: null
data_dir: /cache/data/imagenet/
dataset: ''
dataset_download: false
decay_epochs: 30.0
decay_rate: 0.1
disguise: false
dist_bn: reduce
drop: 0.0
drop_block: null
drop_connect: null
drop_path: 0.0
epoch_repeats: 0.0
epochs: 100
eps:
- 1.5
eval_metric: top1
experiment: ''
fuser: ''
gp: null
grad_checkpointing: false
hflip: 0.5
img_size: null
initial_checkpoint: ''
input_size: null
interpolation: ''
jsd_loss: false
kd_criterion: kl
layer_decay: null
local_rank: 0
log_interval: 200
log_wandb: false
lr: 0.1
lr_cycle_decay: 0.5
lr_cycle_limit: 1
lr_cycle_mul: 1.0
lr_k_decay: 1.0
lr_noise: null
lr_noise_pct: 0.67
lr_noise_std: 1.0
mean: null
min_lr: 1.0e-06
mixup: 0.0
mixup_mode: batch
mixup_off_epoch: 0
mixup_prob: 1.0
mixup_switch_prob: 0.5
model: resnet18
model_ema: false
model_ema_decay: 0.9998
model_ema_force_cpu: false
momentum: 0.9
native_amp: false
no_aug: false
no_ddp_bb: false
no_prefetcher: false
no_resume_opt: false
num_classes: 1000
ofa_name: ofaproj
opt: sgd
opt_betas: null
opt_eps: null
order:
- 1
output: /cache/output/haozhiwei/
patience_epochs: 10
pin_mem: false
pretrained: false
ratio:
- 0.75
- 1.3333333333333333
recount: 1
recovery_interval: 0
remode: pixel
reprob: 0.0
resplit: false
resume: ''
save_images: false
scale:
- 0.08
- 1.0
scale_norm: false
sched: step
seed: 42
smoothing: 0.1
split_bn: false
stage:
- 1
- 2
- 3
- 4
start_epoch: null
state_key: none
std: null
sync_bn: false
teacher: deit_tiny_patch16_224
teacher_ckpt: /cache/haozhiwei/ckpt/BigKD/deit_tiny_patch16_224.pth
temperature: 1.0
torchscript: false
train_interpolation: random
train_split: train
tta: 0
use_multi_epochs_loader: false
val_split: validation
validation_batch_size: null
vflip: 0.0
warmup_epochs: 3
warmup_lr: 1.0e-06
weight_decay: 0.0001
weight_feat: 0.1
weight_gt: 1.0
weight_kd: 1.0
weight_ofa: 1.0
worker_seeding: all
workers: 8
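
For anyone who wants to reuse this dump, a minimal sketch for loading it back into an argparse-style namespace (assuming PyYAML is installed; the file name args.yaml and the Namespace conversion are illustrative, not part of the repo's training script):

```python
# Minimal sketch: read the args.yaml dump above into an argparse-style
# namespace so the run configuration can be inspected or reused.
import argparse
import yaml

with open("args.yaml") as f:
    cfg = yaml.safe_load(f)        # plain dict of the key/value pairs above

args = argparse.Namespace(**cfg)   # e.g. args.model == "resnet18"

# Sanity-check the distillation pair and a few key hyperparameters.
print(args.model, args.teacher)               # resnet18 deit_tiny_patch16_224
print(args.epochs, args.lr, args.batch_size)  # 100 0.1 64
```

Note that batch_size here is the per-process value used with the distributed launcher, so the effective global batch size depends on the number of GPUs.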
liguopeng0923 commented 6 months ago

Thanks!