yuhuixu1993 / qa-lora

Official PyTorch implementation of QA-LoRA
MIT License

How to set batchsize #18

Open StiphyJay opened 9 months ago

StiphyJay commented 9 months ago

Thanks for your great work. In your paper, the batch size is 16 during tuning. How do I set the batch size to 16? Should I change per_device_train_batch_size from its default of 1 to 16?

yuhuixu1993 commented 9 months ago

> Thanks for your great job. In your paper, the batch size is 16 in the tunning, how to set the batchsize as 16, change per_device_train_batch_size value from default 1 as 16?

Hi, please check the hyper-parameter gradient_accumulation_steps.
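
In Hugging Face TrainingArguments, the effective batch size is per_device_train_batch_size × gradient_accumulation_steps × number of GPUs, so the paper's batch size of 16 can be reached without raising the per-device value. A minimal sketch of this arithmetic (the single-GPU setup is an assumption, not stated in the thread):

```python
# Effective batch size = per-device batch x gradient accumulation steps x num GPUs.
def effective_batch_size(per_device_train_batch_size: int,
                         gradient_accumulation_steps: int,
                         num_gpus: int = 1) -> int:
    return per_device_train_batch_size * gradient_accumulation_steps * num_gpus

# With the default per_device_train_batch_size=1, accumulating gradients over
# 16 steps on one GPU gives an effective batch size of 16.
print(effective_batch_size(1, 16, 1))  # → 16
```

Gradient accumulation trades wall-clock time for memory: each optimizer step sums gradients from 16 forward/backward passes, matching the update statistics of a true batch of 16 while only holding one sample's activations at a time.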