Open cbalioglu opened 4 months ago
@cbalioglu Any progress on the w2v-BERT pretraining or finetuning recipe?
Hi @cbalioglu, any update on this? We want to do continual training of w2v-BERT on specific low-resource Indic languages using audio-only data. Any suggestions on how we should approach this?
As the second recipe after NLLB, write the w2v-BERT (and wav2vec2) pretraining recipe for users to check out. This will likely branch into several subtasks once we start working on it.