derlem / kanarya

A deep learning model for classification of 'de/da' clitics in Turkish

Run BERT pretraining on TRUBA #18

Closed haozturk closed 3 years ago

haozturk commented 4 years ago

We have successfully run BERT pretraining on our own machines with a small number of training steps. Now it is time to run it on a more powerful cluster, TRUBA, to get a much better model. All the data required for BERT pretraining is ready.
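On TRUBA, jobs are submitted through SLURM. A minimal job-script sketch for the pretraining run might look like the following; the partition name, paths, and hyperparameter values are illustrative assumptions, while the `run_pretraining.py` flags follow the google-research/bert reference implementation:

```shell
#!/bin/bash
#SBATCH --job-name=bert-pretrain
#SBATCH --partition=gpu            # assumed partition name; check TRUBA's current docs
#SBATCH --gres=gpu:1               # request one GPU
#SBATCH --time=3-00:00:00          # illustrative wall-time limit

# Paths below are placeholders, not the project's actual layout.
python run_pretraining.py \
  --input_file="/truba/scratch/$USER/tfrecords/*.tfrecord" \
  --output_dir="/truba/scratch/$USER/bert_output" \
  --do_train=True \
  --bert_config_file=bert_config.json \
  --train_batch_size=32 \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --num_train_steps=1000000 \
  --num_warmup_steps=10000 \
  --learning_rate=1e-4
```

The script would be submitted with `sbatch`, and checkpoints written to `--output_dir` allow the job to resume if it hits the wall-time limit.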

haozturk commented 4 years ago

I have started transferring the tfrecords needed for BERT pretraining from Minerva to TRUBA. The records total about 150 GB, so the transfer will take some time. The status is therefore currently blocked.

onurgu commented 4 years ago

I think we can also close this issue. Did we follow up on it? I haven't heard about it for a while. @haozturk

onurgu commented 3 years ago

This has become unnecessary. Closing.