sinovation / ZEN

A BERT-based Chinese Text Encoder Enhanced by N-gram Representations

about pre-training time #3

Closed pluto-junzeng closed 4 years ago

pluto-junzeng commented 4 years ago

We have the same configuration (NVIDIA Tesla V100 GPUs with 16GB memory) and plan to switch to Baidu Baike as the pre-training corpus. Roughly how long does one epoch take?

GuiminChen commented 4 years ago

It depends on the size of your corpus. For reference, one epoch of pre-training takes 3-5 hours on our corpus with mixed-precision training.
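
For readers unfamiliar with mixed-precision training, the sketch below shows the general idea using PyTorch's `torch.cuda.amp`. This is an illustration only, not ZEN's actual pre-training script; the model, dataloader, and loss interface here are placeholder assumptions, and the repository's own fp16 path may rely on a different mechanism (e.g. NVIDIA Apex).

```python
# Minimal sketch of one mixed-precision pre-training epoch with torch.cuda.amp.
# The model is assumed to return its masked-LM loss when called on a batch dict;
# this is a placeholder interface, not ZEN's actual training loop.
import torch
from torch.cuda.amp import autocast, GradScaler

def pretrain_one_epoch(model, dataloader, optimizer, device="cuda"):
    model.train()
    scaler = GradScaler()  # scales the loss to avoid fp16 gradient underflow
    for batch in dataloader:
        batch = {k: v.to(device) for k, v in batch.items()}
        optimizer.zero_grad()
        with autocast():              # run the forward pass in fp16 where safe
            loss = model(**batch)     # assumed to return the pre-training loss
        scaler.scale(loss).backward() # backward pass on the scaled loss
        scaler.step(optimizer)        # unscales gradients, then steps the optimizer
        scaler.update()               # adjusts the loss scale for the next step
```

Mixed precision roughly halves memory use per activation and speeds up matrix multiplies on V100 tensor cores, which is why it helps keep the per-epoch time in the hours range on a 16GB card.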

pluto-junzeng commented 4 years ago

Thanks, have a nice day!