pengzhiliang / MAE-pytorch

Unofficial PyTorch implementation of Masked Autoencoders Are Scalable Vision Learners

typo error local-rank #101

Open Jalilnkh opened 7 months ago

Jalilnkh commented 7 months ago

Running your code, I ran into the problem below. I checked your code and saw that the argument is defined as local_rank (with an underscore), while the launcher passes --local-rank (with a hyphen). Please correct this error.

usage: MAE pre-training script [--batch_size BATCH_SIZE] [--epochs EPOCHS] [--save_ckpt_freq SAVE_CKPT_FREQ]
                               [--model MODEL] [--mask_ratio MASK_RATIO] [--input_size INPUT_SIZE] [--drop_path PCT]
                               [--normlize_target NORMLIZE_TARGET] [--opt OPTIMIZER] [--opt_eps EPSILON]
                               [--opt_betas BETA [BETA ...]] [--clip_grad NORM] [--momentum M]
                               [--weight_decay WEIGHT_DECAY] [--weight_decay_end WEIGHT_DECAY_END] [--lr LR]
                               [--warmup_lr LR] [--min_lr LR] [--warmup_epochs N] [--warmup_steps N]
                               [--color_jitter PCT] [--train_interpolation TRAIN_INTERPOLATION] [--data_path DATA_PATH]
                               [--imagenet_default_mean_and_std] [--output_dir OUTPUT_DIR] [--log_dir LOG_DIR]
                               [--device DEVICE] [--seed SEED] [--resume RESUME] [--auto_resume] [--no_auto_resume]
                               [--start_epoch N] [--num_workers NUM_WORKERS] [--pin_mem] [--no_pin_mem]
                               [--world_size WORLD_SIZE] [--local_rank LOCAL_RANK] [--dist_on_itp]
                               [--dist_url DIST_URL]
MAE pre-training script: error: unrecognized arguments: --local-rank=2
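One hedged way to fix this on the script side (a sketch, not the repo's actual parser code): argparse accepts multiple option strings for one argument, so the script could register both spellings and map them to the same destination.

```python
import argparse

# Minimal sketch: accept both --local_rank (older launchers) and
# --local-rank (newer torch.distributed.launch) for the same option.
parser = argparse.ArgumentParser("MAE pre-training script")
parser.add_argument(
    "--local_rank", "--local-rank",
    type=int, default=-1, dest="local_rank",
)

# Simulate the arguments the newer launcher passes:
args = parser.parse_args(["--local-rank=2"])
print(args.local_rank)  # 2
```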
EliuciM commented 6 months ago

All you need to do is add '--use_env' to the torch.distributed.launch command, before your Python file. Like this:

python ~/miniconda3/envs/mae/lib/python3.11/site-packages/torch/distributed/launch.py --nproc_per_node=1 --use_env ./run_mae_pretraining.py --data_path "" --mask_ratio 0.75 --model pretrain_mae_base_patch16_224 --batch_size 8 --opt adamw --opt_betas 0.9 0.95 --warmup_epochs 40 --epochs 1600 --output_dir ""
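With --use_env, the launcher sets the LOCAL_RANK environment variable instead of appending a --local_rank argument, so the script's argparse definition is never exercised. A minimal sketch of how a script can read the rank in that mode (the variable assignment here only simulates what the launcher would export):

```python
import os

# Simulate what torch.distributed.launch exports when --use_env is given;
# in a real run the launcher sets this before starting the script.
os.environ["LOCAL_RANK"] = "2"

# The training script then reads the rank from the environment,
# falling back to 0 for single-process runs.
local_rank = int(os.environ.get("LOCAL_RANK", 0))
print(local_rank)  # 2
```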