Open StephennFernandes opened 11 months ago
Hey there, I'm trying to reproduce the X-MOD model from scratch, with the goal of better understanding language-specific modular training and then pretraining other models (SpanBERT, BART, LLaMA-style models) with X-MOD-style modules. Where can I find the pretraining code?

Hi Stephenn, I found the pretraining code for X-MOD. I hope this helps. https://github.com/facebookresearch/fairseq/tree/main/fairseq/models/xmod

@razzzeeev thanks for the update. Actually, I was looking for more specific steps on how to pretrain it from scratch: how to initialize the adapter modules during pretraining, and how to set up the dataloader so that samples, along with their language IDs, get routed to the corresponding adapter modules. I was hoping there would be an actual fairseq script to launch the pretraining.
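For anyone landing here with the same question about routing: here is a minimal PyTorch sketch of the idea behind per-language adapter routing. This is not the fairseq implementation, and all class and parameter names here are made up for illustration — the point is just that each batch carries a language ID, and the forward pass selects that language's bottleneck adapter after the shared sublayer.

```python
# Illustrative sketch only (hypothetical names, NOT fairseq's X-MOD code):
# route each batch through a per-language adapter selected by its lang ID.
import torch
import torch.nn as nn

class LangAdapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""
    def __init__(self, dim, bottleneck):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

class XModStyleLayer(nn.Module):
    """A shared sublayer followed by one adapter per language."""
    def __init__(self, dim, bottleneck, langs):
        super().__init__()
        self.shared = nn.Linear(dim, dim)  # stand-in for the shared transformer sublayer
        # one adapter per language; the lang ID picks which one runs
        self.adapters = nn.ModuleDict({l: LangAdapter(dim, bottleneck) for l in langs})

    def forward(self, x, lang_id):
        h = torch.relu(self.shared(x))
        return self.adapters[lang_id](h)

layer = XModStyleLayer(dim=16, bottleneck=4, langs=["en", "de", "hi"])
batch = torch.randn(2, 5, 16)          # (batch, seq_len, dim)
out = layer(batch, lang_id="hi")       # routed through the "hi" adapter
print(out.shape)                       # torch.Size([2, 5, 16])
```

On the dataloader side, the usual trick is to keep each batch monolingual (group samples by language when batching) so a single `lang_id` per batch is enough to pick the adapter, rather than routing token by token.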