abcdchop opened this issue 4 years ago
Hi guys! I've found your code to be really fantastic.

I'm interested in training a MobileBERT model using your fine-tuning paradigm. I know you don't offer MobileBERT at present, but I was wondering whether it's possible to port over a model from regular transformers and use the same training scheme, or whether there are specific modifications I should look at to make this viable. Thanks!

Hi, MobileBERT is implemented in Hugging Face transformers, so you should be able to just load it and use it for training.

See: https://www.sbert.net/docs/training/overview.html#creating-networks-from-scratch

Best,
Nils Reimers
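For reference, a minimal sketch of what that could look like, following the "creating networks from scratch" pattern in the linked docs. The `google/mobilebert-uncased` checkpoint name and the toy training pairs below are assumptions for illustration, not something confirmed in this thread:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses, models

# Load MobileBERT from Hugging Face transformers as the word-embedding layer
# (checkpoint name is an assumption; swap in whichever MobileBERT variant you use)
word_embedding_model = models.Transformer("google/mobilebert-uncased", max_seq_length=128)

# Mean pooling over token embeddings to get a fixed-size sentence vector
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension())

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

# Tiny toy dataset; replace with your own sentence pairs and similarity labels
train_examples = [
    InputExample(texts=["A cat sits on the mat", "A feline rests on the rug"], label=0.9),
    InputExample(texts=["A cat sits on the mat", "The stock market fell today"], label=0.1),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)
train_loss = losses.CosineSimilarityLoss(model)

# Standard sentence-transformers fine-tuning loop
model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1, warmup_steps=10)
```

The same pattern should work for any encoder available in Hugging Face transformers, since `models.Transformer` just wraps the underlying model and tokenizer.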