staoxiao / RetroMAE

Codebase for RetroMAE and beyond.
Apache License 2.0

one GPU with zero GPU-util #32

Open Gavinthisisit opened 4 months ago

Gavinthisisit commented 4 months ago

I have only one 4090D. When I try to pretrain on TinyBERT I run:

```shell
torchrun --nproc_per_node 1 \
    -m pretrain.run \
    --output_dir output/pretrain_model \
    --data_dir /root/autodl-tmp/RetroMAE/src/pretrain_data/book_data \
    --do_train True \
    --save_steps 2000 \
    --per_device_train_batch_size 30 \
    --model_name_or_path /root/autodl-tmp/tiny-bert-sst2-distilled \
    --pretrain_method retromae \
    --fp16 True \
    --warmup_ratio 0.1 \
    --learning_rate 1e-4 \
    --num_train_epochs 1 \
    --overwrite_output_dir True \
    --dataloader_num_workers 6 \
    --weight_decay 0.01 \
    --encoder_mlm_probability 0.3 \
    --decoder_mlm_probability 0.5
```

The training process runs, but very slowly.
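A common cause of near-zero GPU utilization during training is a data-loading bottleneck: the GPU sits idle while the dataloader workers prepare batches. Below is a minimal, framework-free sketch of how to measure the split between loading time and compute time in a training loop; the `dummy_loader` and sleep-based "step" are illustrative stand-ins, not part of RetroMAE.

```python
import time

def dummy_loader(num_batches, load_seconds):
    # Stand-in for a DataLoader; the sleep simulates tokenization/IO per batch.
    for _ in range(num_batches):
        time.sleep(load_seconds)
        yield object()

def profile_loop(loader, step_seconds):
    """Return (data_time, step_time) accumulated over one epoch."""
    data_time = step_time = 0.0
    t0 = time.perf_counter()
    for batch in loader:
        t1 = time.perf_counter()
        data_time += t1 - t0          # time spent waiting on the loader
        time.sleep(step_seconds)      # stand-in for the forward/backward pass
        t0 = time.perf_counter()
        step_time += t0 - t1          # time spent in the training step
    return data_time, step_time

# Here loading (0.02 s/batch) is slower than the "step" (0.005 s/batch),
# mimicking a starved GPU.
data_t, step_t = profile_loop(dummy_loader(5, 0.02), 0.005)
print(f"data: {data_t:.3f}s  step: {step_t:.3f}s")
```

If the data time dominates in a measurement like this, raising `--dataloader_num_workers` or pre-tokenizing/caching the data is the usual fix; if the step time dominates, the GPU is actually busy and the bottleneck is elsewhere.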

(screenshot)

GPU info:

(screenshot)

Does anyone know what is wrong?