huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Fine-tuning Whisper with adapters using MAML #28238

Open LYPinASR opened 10 months ago

LYPinASR commented 10 months ago

Feature request

MAML is a widely used meta-learning method that learns a parameter initialization able to adapt quickly to new tasks, which makes it effective in low-resource settings. Since Whisper is a pre-trained model whose weights we don't want to reinitialize, bottleneck adapters can instead be inserted into the encoder and decoder layers and meta-trained with MAML while the base model stays frozen. I'm requesting example code for fine-tuning Whisper with adapters using MAML, e.g. meta-training on 6 languages and then final fine-tuning on 4 other languages. A rough sketch of what I mean is below.
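To make the request concrete, here is a minimal, untested sketch of the idea: residual bottleneck adapters wrapped around each Whisper encoder/decoder layer, meta-trained with a first-order MAML variant (Reptile, used here for brevity instead of full second-order MAML). The data helper `get_language_batches` and the language codes are hypothetical placeholders.

```python
import torch
import torch.nn as nn
from transformers import WhisperForConditionalGeneration

class BottleneckAdapter(nn.Module):
    """Residual bottleneck adapter wrapped around an existing layer."""
    def __init__(self, layer, hidden_dim, bottleneck_dim=64):
        super().__init__()
        self.layer = layer
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        nn.init.zeros_(self.up.weight)  # zero-init the up-projection so the
        nn.init.zeros_(self.up.bias)    # wrapped layer starts as an identity

    def forward(self, *args, **kwargs):
        outputs = self.layer(*args, **kwargs)
        hidden = outputs[0]
        hidden = hidden + self.up(torch.relu(self.down(hidden)))
        return (hidden,) + outputs[1:]

model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")
dim = model.config.d_model

# Freeze the pretrained weights; only the adapters will be trained.
for p in model.parameters():
    p.requires_grad = False

# Wrap every encoder and decoder layer with a bottleneck adapter.
for layers in (model.model.encoder.layers, model.model.decoder.layers):
    for i, layer in enumerate(layers):
        layers[i] = BottleneckAdapter(layer, dim)

adapter_params = [p for p in model.parameters() if p.requires_grad]

# First-order meta-training (Reptile) over the meta-train languages.
inner_lr, meta_lr, inner_steps = 1e-4, 1e-2, 4
for meta_step in range(1000):
    for lang in ["fr", "de", "es", "it", "nl", "pl"]:  # placeholder languages
        # Snapshot the meta-initialization before the inner loop.
        snapshot = [p.detach().clone() for p in adapter_params]
        inner_opt = torch.optim.SGD(adapter_params, lr=inner_lr)
        # Hypothetical helper yielding batches of input_features/labels.
        for batch in get_language_batches(lang, n=inner_steps):
            loss = model(**batch).loss
            inner_opt.zero_grad()
            loss.backward()
            inner_opt.step()
        # Reptile meta-update: move the meta-init toward the adapted weights.
        with torch.no_grad():
            for p, s in zip(adapter_params, snapshot):
                p.copy_(s + meta_lr * (p - s))
```

The final stage would then be ordinary fine-tuning of the meta-trained adapters on each of the 4 held-out target languages.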

Motivation

Low-resource ASR.

Your contribution

I'm happy to contribute whatever is needed and within my ability.

amyeroberts commented 9 months ago

cc @sanchit-gandhi @ylacombe

ylacombe commented 5 months ago

Hey @LYPinASR, I'm not sure I understand exactly what the requested feature is. Do you have any code or paper pointers? Thanks!