Feature request
MAML is a widely used meta-learning method that learns a good parameter initialization, which makes it effective in low-resource settings. Since Whisper is a pre-trained model, re-initializing its parameters is undesirable; instead, bottleneck adapters can be inserted into the encoder and decoder layers, and the adapter parameters can then be meta-trained with MAML. I am requesting example code for fine-tuning Whisper with adapters using MAML, e.g., meta-training on 6 languages followed by final fine-tuning on 4 other (held-out) languages.
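A minimal sketch of the requested setup, in case it helps the discussion: a residual bottleneck adapter module plus a first-order MAML (FOMAML) outer loop over per-language tasks. This is not a Whisper integration — the tiny `BottleneckAdapter` on random tensors stands in for adapters inside Whisper's encoder/decoder layers and real per-language speech batches, and all names (`inner_adapt`, `maml_outer_step`, `make_task`) are illustrative, not part of any library API.

```python
# Hedged sketch: bottleneck adapter + first-order MAML (FOMAML) meta-training.
# The toy adapter and synthetic tasks stand in for Whisper layers and real
# per-language data; names and hyperparameters here are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

torch.manual_seed(0)

class BottleneckAdapter(nn.Module):
    """Residual bottleneck: down-project -> nonlinearity -> up-project."""
    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return x + self.up(F.relu(self.down(x)))

def inner_adapt(adapter, x, y, inner_lr=0.05, steps=2):
    """Adapt a functional copy of the adapter weights on one task's support set."""
    fast = dict(adapter.named_parameters())
    for _ in range(steps):
        loss = F.mse_loss(functional_call(adapter, fast, (x,)), y)
        grads = torch.autograd.grad(loss, list(fast.values()))
        # First-order update: inner gradients are treated as constants (FOMAML).
        fast = {n: w - inner_lr * g for (n, w), g in zip(fast.items(), grads)}
    return fast

def maml_outer_step(adapter, meta_opt, tasks, inner_lr=0.05):
    """One meta-update over a batch of (support_x, support_y, query_x, query_y) tasks."""
    meta_opt.zero_grad()
    meta_loss = 0.0
    for xs, ys, xq, yq in tasks:
        fast = inner_adapt(adapter, xs, ys, inner_lr)
        qloss = F.mse_loss(functional_call(adapter, fast, (xq,)), yq)
        qloss.backward()  # gradients flow back to the shared initialization
        meta_loss += qloss.item()
    meta_opt.step()
    return meta_loss / len(tasks)

# Synthetic stand-in for per-language tasks: each "language" is a small
# random linear perturbation the adapter must fit from its support set.
dim = 8
def make_task():
    w = torch.randn(dim, dim) * 0.1
    xs, xq = torch.randn(16, dim), torch.randn(16, dim)
    return xs, xs + xs @ w, xq, xq + xq @ w

adapter = BottleneckAdapter(dim)
meta_opt = torch.optim.Adam(adapter.parameters(), lr=1e-2)
# "6 meta-training languages" -> 6 tasks per outer step.
losses = [maml_outer_step(adapter, meta_opt, [make_task() for _ in range(6)])
          for _ in range(20)]
print(f"meta-loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

For the real use case, the adapter modules would be inserted into Whisper's encoder/decoder layers with the base model frozen, and each task's support/query batches would come from one of the 6 meta-training languages; the final fine-tuning on the 4 held-out languages would then start from the meta-learned adapter weights.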
Motivation
Improving ASR for low-resource languages.
Your contribution
I am happy to contribute whatever is needed.