rxn4chemistry / rxnmapper

RXNMapper: Unsupervised attention-guided atom-mapping. Code complementing our Science Advances publication on "Extraction of organic chemistry grammar from unsupervised learning of chemical reactions" (https://advances.sciencemag.org/content/7/15/eabe4166).
http://rxnmapper.ai
MIT License

What's the value of per_gpu_train_batch_size when training the model? #3

Closed by autodataming 4 years ago

autodataming commented 4 years ago
from transformers import TrainingArguments

# Training configuration used for pretraining
# (per_gpu_train_batch_size is the older argument name; newer
# transformers versions call it per_device_train_batch_size)
training_args = TrainingArguments(
    output_dir="./",
    overwrite_output_dir=True,
    num_train_epochs=5,
    per_gpu_train_batch_size=8,
    save_steps=10_000,
    save_total_limit=2,
)

What's the value of per_gpu_train_batch_size when training the model?

pschwllr commented 4 years ago

We used 16 as per_gpu_train_batch_size and trained on a single GPU.
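For reference, a minimal sketch of that configuration, reusing the TrainingArguments snippet from the question with the batch size set to 16; output_dir, epochs, and the save settings are simply carried over from the snippet above and are not confirmed values from the paper:

from transformers import TrainingArguments

# Hypothetical reconstruction of the training arguments: same fields as in
# the question, with the reported batch size of 16 on a single GPU.
training_args = TrainingArguments(
    output_dir="./",                 # from the question's snippet, not a confirmed setting
    overwrite_output_dir=True,
    num_train_epochs=5,              # from the question's snippet, not a confirmed setting
    per_gpu_train_batch_size=16,     # value reported above (single GPU); older transformers argument name
    save_steps=10_000,
    save_total_limit=2,
)

Assuming no gradient accumulation, a single GPU with per_gpu_train_batch_size=16 gives an effective batch size of 16 per optimizer step.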