@ydshieh that's because there was "Fixes https://github.com/huggingface/transformers/issues/19487" in the description of the PR :)
"Fixes", like "close" or "fix" will close the issue when the PR is merged.
I would take `gpt_neo`, `gpt_neox_japanese` and `gpt_neox`
I'll take on
I would like to work on `openai` and `opt`
I will take `mbart` and `mctct`
I will work on `layoutlm`, `layoutlmv2` and `layoutlmv3`
I will work on ELECTRA
I will work on PoolFormer
I will work on PLBART
I will work on Nezha
I'll take maskformer
Hi, can I have LayoutLMv2 and BERT?
Hi @rushic24, those have already been done. You can check this file and find other config files to work on 🤗
I'll take fsmt next
While browsing the list of model configurations, I noticed that the `DebertaConfig` class does not have an example docstring section. I'm not sure whether that is intentional, but just in case it's not, I will open a PR to add the example docstring, and maybe I can get some feedback there.
I'll work on dpt
> `DebertaConfig`

That would be very nice, @Saad135! Thank you
I will take DeBERTa-v2 next
I can take camembert next
I can take DPR next
I can take DeformableDetrConfig next
Can I take timesformer next?
Sure! For context, we decided not to use the tiny random model checkpoints anymore. If some downstream models lack a checkpoint, we simply don't provide the expected values.
Hello, I would like to take on gptj, longformer, and hubert
@ydshieh, may I share a list of models that are yet to be worked on?
@elabongaatuo GPT-J is large, and our CI won't be able to run the doctests with its checkpoints.
I think gptj, longformer, and hubert are all covered in `utils/documentation_tests.txt` already.
Feel free to check the modeling files that are not in the above file 🤗 if you want to work on them ❤️. Thank you!
@ydshieh, thank you. `m2m_100`, `llama` and `mvp` don't have their modeling files listed there. May I go ahead and work on them?
`llama` has no publicly available checkpoints on the Hub, so there is no need to work on it. For the other 2 files, you can run the doctests against them. If they pass, you can simply add them to `documentation_tests.txt`. Otherwise, we can discuss how to deal with the errors :-).
Hi @ydshieh, I am new to open source, so I just wanted to confirm whether I can take Falcon. The Falcon config file is not mentioned in the documentation_tests.txt file.
Hello @ydshieh, I am new to open source and want to take `barthez`. If my contributions are successful, I'm eager to extend my involvement to other models as well. Looking forward to a productive and enduring journey of contributions!

Edit: I couldn't find configuration files for `barthez`. Any help is appreciated!
I'll take roformer #26530
This sprint is similar to #16292, but for model configuration files, i.e. `configuration_[model_name].py`. For example, `src/transformers/models/bert/configuration_bert.py`.
The expected changes
The changes we expect can be found in #19485:

1. Add an example docstring section to the configuration file, showing how to initialize a model from the configuration (an example sketch follows this list).
2. Add `(with random weights)` in the comment before the model initialization line.
3. Add `configuration_[model_name].py` to `utils/documentation_tests.txt` (respecting the order).

Please do step 3 only after running the doctest and making sure all tests pass (see below) 🙏
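For reference, the example section for BERT looks roughly like the sketch below (following the pattern added in #19485; the exact comments and checkpoint names vary per model):

```python
>>> from transformers import BertConfig, BertModel

>>> # Initializing a BERT bert-base-uncased style configuration
>>> configuration = BertConfig()

>>> # Initializing a model (with random weights) from the bert-base-uncased style configuration
>>> model = BertModel(configuration)

>>> # Accessing the model configuration
>>> configuration = model.config
```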
How to run doctests
Suppose you are working on `src/transformers/models/bert/configuration_bert.py`. The steps to run the test are as follows (a command sketch follows the list):

1. Stage your changes.
2. Prepare the files to be tested, or, if you prefer to be more specific, prepare only the file you are working on. This will change some files (doc-testing needs to add additional lines that we don't include in the doc source files).
3. Run the doctest.
4. Clean up the changes made by the preparation step; the work you staged in step 1 is preserved.
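As a sketch, the commands for the four steps could look like this (assuming the `utils/prepare_for_doc_test.py` helper script from the repository at the time; exact paths and flags are illustrative):

```bash
# 1. Stage your changes so the cleanup in step 4 does not discard them
git add src/transformers/models/bert/configuration_bert.py

# 2. Prepare the files to be tested
python utils/prepare_for_doc_test.py src docs
# or, to be more specific:
python utils/prepare_for_doc_test.py src/transformers/models/bert/configuration_bert.py

# 3. Run the doctest
python -m pytest --doctest-modules src/transformers/models/bert/configuration_bert.py -sv --doctest-continue-on-failure

# 4. Revert the unstaged changes made by the preparation script
git checkout -- .
```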
Ready (or not)?
If all tests pass, you can commit, push and open a PR 🔥 🚀, otherwise iterate the above steps 💯!