keras-team / keras-nlp

Modular Natural Language Processing workflows with Keras
Apache License 2.0

Integrate ELECTRA Pretrained Model into Keras_NLP Ecosystem #1281

Open pranavvp16 opened 8 months ago

pranavvp16 commented 8 months ago

I would like to add the ELECTRA pretrained model to the keras_nlp ecosystem. I have gone through CONTRIBUTING_MODELS.md, and opening an issue is the first task for adding a model. I have a pretty good understanding of transformers and have gone through the ELECTRA architecture mentioned here. So if there is no problem regarding the integration, I can start working on this!

mattdangerw commented 8 months ago

ELECTRA would be a welcome addition! Thanks! As mentioned in the model contribution guide, the first thing to do would be to write a backbone and a checkpoint conversion script from a known source that shows we can match outputs.

Hugging Face might be the easiest checkpoint source. The original ELECTRA repo would be totally fine as well, but it looks like the implementation is in TF1, which would be a little painful to work with.
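For reference, here's a minimal sketch of what that output-matching check could look like, assuming the Hugging Face `google/electra-small-discriminator` checkpoint as the source and a placeholder `keras_electra` model that already holds the converted weights (the names and input format below are illustrative, not a final API):

```python
import numpy as np
import torch
from transformers import AutoTokenizer, ElectraModel

preset = "google/electra-small-discriminator"
hf_model = ElectraModel.from_pretrained(preset)
tokenizer = AutoTokenizer.from_pretrained(preset)

# Run the reference model on a small batch of text.
inputs = tokenizer(["The quick brown fox."], return_tensors="pt")
with torch.no_grad():
    hf_output = hf_model(**inputs).last_hidden_state.numpy()

# `keras_electra` is a placeholder for the new backbone with converted
# weights; it is assumed to take BERT-style packed inputs.
keras_output = keras_electra(
    {
        "token_ids": inputs["input_ids"].numpy(),
        "segment_ids": inputs["token_type_ids"].numpy(),
        "padding_mask": inputs["attention_mask"].numpy(),
    }
)["sequence_output"]

# Outputs should agree to within float32 tolerance.
print(np.allclose(hf_output, np.asarray(keras_output), atol=1e-4))
```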

pranavvp16 commented 8 months ago

Thanks for assigning the issue, I'll start working on it

shivance commented 8 months ago

@mattdangerw I had a Colab floating around that implements ELECTRA in KerasNLP; I'll share it if I find it. It would be helpful for @pranavvp16.

shivance commented 8 months ago

#794

pranavvp16 commented 8 months ago

@shivance thanks for the notebook, but I think it covers the pre-training approach of ELECTRA, while I'm trying to implement the backbone of ELECTRA, which is similar to BERT with some changes. Please let me know if I'm wrong here, as I'm pretty new to this. I have implemented the backbone successfully in Keras according to the ELECTRA architecture, and I'm working on weight conversion from the Hugging Face checkpoint.
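For context, here's a rough sketch of what such a BERT-style backbone could look like when assembled from keras_nlp's reusable layers; the structure, the ELECTRA-small hyperparameters, and the input/output names below are placeholders rather than a final API:

```python
import keras
import keras_nlp

vocab_size, num_layers, num_heads = 30522, 12, 4
embedding_dim, hidden_dim, intermediate_dim = 128, 256, 1024
# Unlike BERT, ELECTRA can use an embedding size smaller than the hidden
# size, so the embeddings are projected up before the transformer stack.

token_ids = keras.Input(shape=(None,), dtype="int32", name="token_ids")
segment_ids = keras.Input(shape=(None,), dtype="int32", name="segment_ids")
padding_mask = keras.Input(shape=(None,), dtype="int32", name="padding_mask")

# Token + position + segment embeddings, just like BERT.
token_embedding = keras.layers.Embedding(vocab_size, embedding_dim)(token_ids)
position_embedding = keras_nlp.layers.PositionEmbedding(
    sequence_length=512
)(token_embedding)
segment_embedding = keras.layers.Embedding(2, embedding_dim)(segment_ids)

x = token_embedding + position_embedding + segment_embedding
x = keras.layers.LayerNormalization(epsilon=1e-12)(x)
x = keras.layers.Dropout(0.1)(x)
# Project from embedding_dim up to the transformer's hidden_dim.
x = keras.layers.Dense(hidden_dim)(x)

for _ in range(num_layers):
    x = keras_nlp.layers.TransformerEncoder(
        num_heads=num_heads,
        intermediate_dim=intermediate_dim,
        activation="gelu",
        dropout=0.1,
        layer_norm_epsilon=1e-12,
    )(x, padding_mask=padding_mask)

backbone = keras.Model(
    inputs={
        "token_ids": token_ids,
        "segment_ids": segment_ids,
        "padding_mask": padding_mask,
    },
    outputs={"sequence_output": x},
    name="electra_backbone",
)
backbone.summary()
```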

mattdangerw commented 8 months ago

> I'm trying to implement the backbone of ELECTRA, which is similar to BERT with some changes. Please let me know if I'm wrong here, as I'm pretty new to this. I have implemented the backbone successfully in Keras according to the ELECTRA architecture, and I'm working on weight conversion from the Hugging Face checkpoint.

Yes that's definitely a good approach! Let's start with the backbone, not the pretraining approach.

The pretraining approach would make for an excellent keras.io example (which was the subject of https://github.com/keras-team/keras-nlp/issues/794), but beginning with a backbone is the right first step toward adding this model to our API.
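For anyone curious, here's a purely illustrative sketch of the replaced-token-detection objective such an example would cover; random arrays stand in for real generator and discriminator outputs:

```python
import numpy as np
import keras

batch, seq_len, vocab = 2, 8, 100

# 1. A small generator (a standard MLM head) predicts tokens at masked
#    positions; random logits and a greedy argmax stand in here for the
#    generator's output and for sampling from its softmax.
generator_logits = np.random.randn(batch, seq_len, vocab)
sampled_ids = generator_logits.argmax(-1)
original_ids = np.random.randint(0, vocab, size=(batch, seq_len))
mask_positions = np.random.rand(batch, seq_len) < 0.15

# 2. Build the corrupted sequence the discriminator actually sees:
#    generator picks at masked positions, original tokens elsewhere.
corrupted_ids = np.where(mask_positions, sampled_ids, original_ids)

# 3. The discriminator classifies every token as replaced (1) or
#    original (0), trained with a binary cross-entropy loss.
labels = (corrupted_ids != original_ids).astype("float32")
disc_logits = np.random.randn(batch, seq_len)  # discriminator head output
loss = keras.losses.binary_crossentropy(
    labels[..., None], disc_logits[..., None], from_logits=True
)
print(loss.shape)  # (batch, seq_len): a per-token loss over the full input
```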

Will take a look at the issues on the PR shortly!