Hi there, I just came across facebook/mms-1b-fl102 on Hugging Face,
an MMS-1B model fine-tuned on the FLEURS dataset.
I have reasonably large amounts of data in multiple low-resource languages, which I believe could further lower the WER on those languages after fine-tuning.
However, I can't figure out how to go about fine-tuning on multiple languages.
Could you please share links/resources on how to reproduce this kind of multi-language fine-tuning?