clovaai / bros


Suggestions for implementing pre-training #12

Open WeihongM opened 2 years ago

WeihongM commented 2 years ago

Thanks for your impressive work. Can you share how to implement the pre-training code?

tghong commented 2 years ago

Thank you for your interest in our work! We have no plans to release pre-training code yet. You can implement your own pre-training script by referring to the Hugging Face transformers code: https://github.com/huggingface/transformers/blob/v4.19.2/src/transformers/data/data_collator.py#L748
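
For reference, here is a minimal sketch of how a masked-language-modeling setup with the linked transformers collator could look. This is only an illustration of the general approach, not the authors' pre-training code: the tokenizer choice (`bert-base-uncased`), the toy text samples, and the omission of BROS-specific 2D bounding-box inputs are all assumptions.

```python
import torch
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

# Any BERT-style WordPiece tokenizer works for illustration purposes.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# The collator randomly masks tokens (15% by default) and builds the
# `labels` tensor with -100 at unmasked positions so they are ignored by the loss.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer,
    mlm=True,
    mlm_probability=0.15,
)

# Toy "documents" standing in for OCR-extracted text; a real pre-training
# corpus (and the 2D position inputs BROS needs) would be prepared separately.
texts = [
    "Invoice number 12345 issued on 2021-01-01",
    "Total amount due: $99.50",
]
encodings = [tokenizer(t, truncation=True, max_length=128) for t in texts]

batch = collator(encodings)
print(batch["input_ids"].shape)  # masked token ids
print(batch["labels"].shape)     # MLM targets (-100 where not masked)

# The masked batch can then be fed to any model with an MLM head, e.g.:
# outputs = model(input_ids=batch["input_ids"],
#                 attention_mask=batch["attention_mask"],
#                 labels=batch["labels"])
# loss = outputs.loss
```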