Closed Rocketknight1 closed 1 year ago
I would love to work on PyTorch Albert
🚀
Hi, I would like to work on PyTorch ImageGPT
Hi, I would like to work on CamemBERT for PT & TF.
I will take a look at LayoutLMv2 after the first one :smiley:
Edit: Because CamemBERT depends on RoBERTa, I will take PyTorch RoBERTa :+1:
Hello! I'd like to take Hubert & Wav2Vec2 for PyTorch. Cheers!
I'll try PyTorch BERT to start!
@johnryan465 I just did it as an example, I'm sorry! I'm marking off the completed models now.
@Rocketknight1 no worries, I'll try DistilBERT instead
I'd like to work on GPT2 (TF).
@Rocketknight1 I'm switching to RoBERTa (PyTorch) because CamemBERT depends on the RoBERTa modeling code
Awesome! Hey @Rocketknight1 – I'd like to work on Longformer for both PyTorch & TF!
I'd like to work on BigBird
I would like to work on Clip for pytorch.
Also, I will work on BEiT, DeiT and ViT (PyTorch)
I can work on ImageGPT.
I can work on Swin (Pytorch)
I'd like to work on XLM (Tensorflow)
I'll take T5 (Tensorflow)!
I'd like to claim GPT-2 (PyTorch).
Hi @Rocketknight1,
I would like to work on BART for both TF and PyTorch
ELECTRA TF - https://github.com/huggingface/transformers/pull/16104 ELECTRA PT - https://github.com/huggingface/transformers/pull/16103 DeBERTA PT - https://github.com/huggingface/transformers/pull/16105
XLMRobertaXL (PyTorch)
SegFormer (PyTorch)
I'll take OpenAIGPT!
Can you please confirm with an emoji whether I can take these or not? @Rocketknight1
I will work on XLM (PyTorch)
@robotjellyzone You can! Please note that we accepted a PR yesterday to add the TF decorator to BART, so make sure you're working on the most recent version of the library before you start your PR!
I'll take Distilbert (TensorFlow)
Happy to take T5 (PyTorch)
@Rocketknight1 isn't the list missing ConvNext? If so, I'm happy to take care of that one too :ok_hand:
I'll work on GPTJ
OK sure! I will keep this in mind 😊👍...
I'll take Splinter and Segformer for torch.
Edit: @p-mishra1 has Segformer. Taking RemBERT instead.
Looks like ImageGPT was done. I can take Luke in PyTorch.
I'd like to take PoolFormer
I'm going for FlauBERT now!
I'll work on FNet for PyTorch.
Hello, I will work on SqueezeBERT for Pytorch!
I will also work on GPTNeo for Pytorch
I'd like to work on Perceiver for torch
I will also work on Pegasus for pytorch
I will work on XLMRoberta for TF
I will work on YOSO for PT
Hi, I'll take Marian (Pytorch)
Hi, I will work on RAG (PyTorch).
Never mind, XLMRoberta relies entirely on Roberta (for TF). I will work on Reformer instead!
Hey, I would like to work on the BigBirdPegasus model of Pytorch.
Hey, I am looking into the mBART model for TF and PyTorch implementations. If anyone is interested, do let me know.
I will work on XLNet for TF and PT
Happy to take CTRL and MPNet for Tensorflow
I'm working on MobileBert for both TensorFlow & PyTorch.
Hi, I'd like to take OpenAIGPT (PyTorch).
This issue is part of our Great Code Cleanup 2022. If you're interested in helping out, take a look at this thread, or come join us on Discord and talk with other contributors!
🚀 Add missing type hints
Type hints are used inconsistently in the `transformers` repo across both TF and PT models, and it'd be nice to make them a complete, consistent thing for the core models, especially because we want to develop features that depend on them!

Guide to contributing:

- Run `make fixup` at the end to do a code quality check before your final commit!

Tips for making your PR:

- The model files are in `src/transformers/models/[model_name]/`. For TF, you want the `modeling_tf_[model_name].py` file; for PyTorch, you want the `modeling_[model_name].py` file.
- Add type hints to the `call` (for TF) or `forward` (for PT) method for user-facing classes like `TFRobertaForMaskedLM` or `RobertaForSequenceClassification`. It's not necessary to add type hints to layers or base classes like `RobertaModel` or `TFRobertaPreTrainedModel` - these are trickier to write, and generally people do not use those classes as standalone models.
- Most inputs are `Optional[Union[np.ndarray, tf.Tensor]]` for TF models and `Optional[torch.Tensor]` for PyTorch models, and boolean inputs are `Optional[bool]`. Pay attention to the first input of TF models, though, which is usually `TFModelInputType` - this is because Keras handles that first input in a special way! Other inputs to pay attention to are `past_key_values`, which can vary between models, and the model output type. For the base model classes like `RobertaModel`, you may have to look at the corresponding `MainLayer` to figure out the right output type! Also, note that the output type may be a tuple if `return_dict` is False, in which case you should specify `Union[Tuple, ...]`. Finally, note that in TF models, `training` is never `None`, so it should be `training: bool` and not `training: Optional[bool]`.
- If you see a comment like `# Copied from transformers.models.bert...`, it means the code is copied from that source, and our scripts will automatically keep it in sync. You should not edit the copied method! Instead, edit the original method it's copied from, and run `make fixup` to synchronize the change across all the copies. Be sure you installed the development dependencies with `pip install -e ".[dev]"`, as described in the contributor guidelines above, to ensure that the code quality tools in `make fixup` can run.

How can I find models that need type hints?
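The conventions above can be sketched as a pair of hypothetical, simplified signatures. This is not actual `transformers` source: the class names are made up, and the tensor types are written as string forward references only so the sketch runs without PyTorch or TensorFlow installed — in a real model file you would import `torch`/`tf` and use the types directly.

```python
from typing import Optional, Tuple, Union

# Hypothetical PyTorch user-facing class (illustrative, not transformers source).
class MyModelForSequenceClassification:
    def forward(
        self,
        input_ids: Optional["torch.Tensor"] = None,
        attention_mask: Optional["torch.Tensor"] = None,
        past_key_values: Optional[Tuple] = None,  # exact shape varies between models
        output_attentions: Optional[bool] = None,
        return_dict: Optional[bool] = None,
    ) -> Union[Tuple, "SequenceClassifierOutput"]:
        # Returns a ModelOutput dataclass, or a plain tuple when
        # return_dict=False - hence the Union return annotation.
        raise NotImplementedError

# Hypothetical TF counterpart, showing the TF-specific rules above.
class TFMyModelForSequenceClassification:
    def call(
        self,
        input_ids: "TFModelInputType" = None,  # Keras treats the first input specially
        attention_mask: Optional[Union["np.ndarray", "tf.Tensor"]] = None,
        return_dict: Optional[bool] = None,
        training: bool = False,  # never None in TF models, so plain bool
    ) -> Union[Tuple, "TFSequenceClassifierOutput"]:
        raise NotImplementedError
```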
I used to maintain a list here, but it got out of date, I'm sorry. Instead, you can use this Colab notebook. If you run this, it will show you models in PyTorch or TF that are still missing type hints. Unlike my manually curated lists, it's guaranteed to be up to date - but do double-check that someone else in the thread hasn't claimed a model before you start, because the Colab code will only register type hints after the PR containing them is merged!
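For a quick local check (a rough sketch, not the Colab notebook mentioned above), Python's `inspect` module can list which parameters of a method still lack annotations:

```python
import inspect

def missing_annotations(fn):
    """Return the parameter names of fn that have no type annotation."""
    sig = inspect.signature(fn)
    return [
        name
        for name, param in sig.parameters.items()
        if name != "self" and param.annotation is inspect.Parameter.empty
    ]

# Toy example: input_ids is unannotated, attention_mask is annotated.
def example(input_ids, attention_mask: bool = None):
    ...
```

Running `missing_annotations(example)` on the toy function flags only `input_ids`; pointing it at a model's `forward` or `call` gives the remaining work for that model.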