huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Add support for HuBERT batch norm instead of weight norm in pos_conv_emb #34229

Status: Open. Opened by gallilmaimon 1 week ago

gallilmaimon commented 1 week ago

Feature request

Motivation

Your contribution

I can create a PR to implement this, but I would love some guidance. @ylacombe

avishaiElmakies commented 6 days ago

Hi, would love this feature as well @ylacombe

ylacombe commented 3 days ago

Hey @avishaiElmakies and @gallilmaimon , this would indeed be a great addition.

Would you like to open a PR to add this? You'd have to add the possibility to use batch norm in configuration_hubert.py, propagate the change to the modeling file and the conversion file, and finally add an integration test. How does that sound?

Thanks
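The change ylacombe outlines might be sketched roughly as below: a config flag that switches the positional convolution embedding between weight norm (the current behaviour) and batch norm. This is only an illustration, not the final implementation; the flag name `conv_pos_batch_norm` and the module name are assumptions, and the real module lives in modeling_hubert.py with the config values coming from `HubertConfig`.

```python
# Hedged sketch: HuBERT-style positional conv embedding with an optional
# batch-norm path instead of weight norm. `conv_pos_batch_norm` is a
# hypothetical config flag, not a confirmed transformers API.
import torch
import torch.nn as nn


class HubertPositionalConvEmbedding(nn.Module):
    def __init__(
        self,
        hidden_size=768,
        num_conv_pos_embeddings=128,
        num_conv_pos_embedding_groups=16,
        conv_pos_batch_norm=False,  # the new, hypothetical flag
    ):
        super().__init__()
        self.conv = nn.Conv1d(
            hidden_size,
            hidden_size,
            kernel_size=num_conv_pos_embeddings,
            padding=num_conv_pos_embeddings // 2,
            groups=num_conv_pos_embedding_groups,
        )
        if conv_pos_batch_norm:
            # batch norm over the channel dimension, applied before the conv
            self.batch_norm = nn.BatchNorm1d(hidden_size)
        else:
            self.batch_norm = None
            # existing behaviour: weight norm on the conv kernel
            self.conv = nn.utils.weight_norm(self.conv, name="weight", dim=2)
        self.activation = nn.GELU()

    def forward(self, hidden_states):
        # (batch, seq, hidden) -> (batch, hidden, seq) for Conv1d
        hidden_states = hidden_states.transpose(1, 2)
        if self.batch_norm is not None:
            hidden_states = self.batch_norm(hidden_states)
        hidden_states = self.conv(hidden_states)
        # an even kernel with symmetric padding yields one extra timestep
        if self.conv.kernel_size[0] % 2 == 0:
            hidden_states = hidden_states[:, :, :-1]
        hidden_states = self.activation(hidden_states)
        return hidden_states.transpose(1, 2)
```

Beyond the modeling change, the conversion script would need to map the fairseq batch-norm parameters to the new submodule, and an integration test should check output equivalence against a reference checkpoint.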

gallilmaimon commented 2 days ago

@ylacombe Sounds good. I will work on something and let you know when the PR is ready