TomWagg / software-citation-station

A website for making citing software used in your research quick and easy

[NEW SUBMISSION] transformers #19

Closed cakiki closed 3 weeks ago

cakiki commented 3 weeks ago

Citation information

"transformers": {
    "tags": [
        "wolf-etal-2020-transformers"
    ],
    "logo": "img/transformers.png",
    "language": "Python",
    "category": "Machine Learning",
    "keywords": [
        "python",
        "nlp",
        "machine-learning",
        "natural-language-processing",
        "deep-learning",
        "tensorflow",
        "pytorch",
        "transformer",
        "speech-recognition",
        "seq2seq",
        "flax",
        "pretrained-models",
        "language-models",
        "nlp-library",
        "language-model",
        "hacktoberfest",
        "bert",
        "jax",
        "pytorch-transformers",
        "model-hub"
    ],
    "description": "Transformers provides thousands of pretrained models to perform tasks on different modalities such as text, vision, and audio.",
    "link": "https://huggingface.co/docs/transformers/index",
    "attribution_link": "https://github.com/huggingface/transformers?tab=readme-ov-file#citation",
    "zenodo_doi": "",
    "custom_citation": "",
    "dependencies": []
}
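Before submitting, an entry like the one above can be sanity-checked with a short script. This is only a sketch: the required-field list is assumed from the example itself, and the site's actual schema may differ.

```python
import json

# The submitted entry, keyed by software name (abbreviated; field names
# are taken from the example above, not from an official schema).
entry_json = """
{
    "transformers": {
        "tags": ["wolf-etal-2020-transformers"],
        "logo": "img/transformers.png",
        "language": "Python",
        "category": "Machine Learning",
        "keywords": ["python", "nlp", "machine-learning"],
        "description": "Transformers provides thousands of pretrained models.",
        "link": "https://huggingface.co/docs/transformers/index",
        "attribution_link": "https://github.com/huggingface/transformers?tab=readme-ov-file#citation",
        "zenodo_doi": "",
        "custom_citation": "",
        "dependencies": []
    }
}
"""

# Assumed required fields, inferred from the example entry.
REQUIRED_FIELDS = {"tags", "logo", "language", "category", "keywords",
                   "description", "link", "attribution_link",
                   "zenodo_doi", "custom_citation", "dependencies"}

def check_entry(raw: str) -> list[str]:
    """Return a list of problems found in a citation entry; empty if it looks OK."""
    data = json.loads(raw)
    problems = []
    for name, fields in data.items():
        missing = REQUIRED_FIELDS - fields.keys()
        if missing:
            problems.append(f"{name}: missing fields {sorted(missing)}")
        if not fields.get("tags"):
            problems.append(f"{name}: needs at least one BibTeX tag")
    return problems

print(check_entry(entry_json))  # -> [] for a complete entry
```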

BibTeX

@inproceedings{wolf-etal-2020-transformers,
    title = "Transformers: State-of-the-Art Natural Language Processing",
    author = "Thomas Wolf and Lysandre Debut and Victor Sanh and Julien Chaumond and Clement Delangue and Anthony Moi and Pierric Cistac and Tim Rault and Rémi Louf and Morgan Funtowicz and Joe Davison and Sam Shleifer and Patrick von Platen and Clara Ma and Yacine Jernite and Julien Plu and Canwen Xu and Teven Le Scao and Sylvain Gugger and Mariama Drame and Quentin Lhoest and Alexander M. Rush",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = oct,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.emnlp-demos.6",
    pages = "38--45"
}
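The entry's "tags" list is what links the JSON above to this BibTeX entry, so the two should agree. A quick stdlib-only way to cross-check them (a sketch for illustration, not how the site itself does it):

```python
import re

# Abbreviated copy of the BibTeX entry above.
bibtex = """
@inproceedings{wolf-etal-2020-transformers,
    title = "Transformers: State-of-the-Art Natural Language Processing",
    year = "2020",
}
"""

def bibtex_keys(text: str) -> set[str]:
    """Extract citation keys, i.e. the identifier after each @type{ opener."""
    return set(re.findall(r"@\w+\s*\{\s*([^,\s]+)\s*,", text))

# The "tags" field from the JSON entry.
tags = {"wolf-etal-2020-transformers"}
assert bibtex_keys(bibtex) == tags
print("tags match BibTeX keys")
```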

Logo

Brand assets found here: https://huggingface.co/brand

Not sure which would be the best logo for the tool. The repo has a transformers-specific logo, but it isn't square. I'd also like to add the rest of the Hugging Face libraries, and I'm not sure how best to differentiate those visually.

TomWagg commented 3 weeks ago

Thanks @cakiki, this looks good! I've added one of the brand logos. For the rest of the libraries, different logos would definitely be preferable, but I think it's okay to use the same logo with a different title (we already do that with e.g. astropy and astroquery).