kkoutini / PaSST

Efficient Training of Audio Transformers with Patchout
Apache License 2.0

Fixing weights for fine-tuning? #44

Closed by Antoine101 3 months ago

Antoine101 commented 4 months ago

Hi Khaled,

Do you freeze the weights of the embeddings and attention blocks after loading the pretrained checkpoint for fine-tuning, or do they just serve as an initialization and get updated further during fine-tuning? I can't find the answer in your code.

Many thanks.

kkoutini commented 3 months ago

Hi Antoine, sorry for the late reply. For fine-tuning, I do not freeze any parameters; I load the pretrained model and fine-tune it for very few epochs.
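
For readers landing here, the distinction looks like the plain PyTorch sketch below. This is not code from the PaSST repo: the model is a hypothetical stand-in for a loaded pretrained checkpoint, and the module names and learning rates are illustrative only.

```python
from collections import OrderedDict

import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained PaSST; in practice you would load
# the checkpoint with this repo's utilities. The module names ("patch_embed",
# "blocks", "head") mirror a ViT-style layout but are illustrative only.
model = nn.Sequential(OrderedDict([
    ("patch_embed", nn.Linear(128, 768)),
    ("blocks", nn.Sequential(nn.Linear(768, 768), nn.GELU())),
    ("head", nn.Linear(768, 527)),  # e.g. 527 AudioSet classes
]))

# What is described above: nothing is frozen. All parameters (embeddings
# and attention blocks included) keep requires_grad=True and are updated
# during the short fine-tuning run. The learning rate is illustrative.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

# For contrast, freezing the backbone (NOT what is done here) would look
# like this: only the classification head stays trainable.
for name, p in model.named_parameters():
    if not name.startswith("head"):
        p.requires_grad_(False)
trainable = (p for p in model.parameters() if p.requires_grad)
head_only_optimizer = torch.optim.AdamW(trainable, lr=1e-4)
```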

Antoine101 commented 3 months ago

No worries! Ok great, thanks for the clarification!