PiotrNawrot / nanoT5

Fast & Simple repository for pre-training and fine-tuning T5-style models
Apache License 2.0
971 stars · 74 forks

Citing Repo #1

Closed dhairyadalal closed 1 year ago

dhairyadalal commented 1 year ago

Thanks for sharing your code and optimizations for training T5. What's the best way to cite this repo?

PiotrNawrot commented 1 year ago

It's great to hear you like it :).

I plan to write a demo paper about this repo once I have some spare time, and I will update this thread once it's ready.

Until then, you can cite my latest paper (also on efficient LLMs!):

@misc{nawrot2022dynamic,
      title={Efficient Transformers with Dynamic Token Pooling},
      author={Piotr Nawrot and Jan Chorowski and Adrian Łańcucki and Edoardo M. Ponti},
      year={2022},
      eprint={2211.09761},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
PiotrNawrot commented 1 year ago

I used Zenodo to enable citation. You should see a "Cite this repository" button in the sidebar on the right when viewing the main branch.
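For context, GitHub's "Cite this repository" button is driven by a `CITATION.cff` file in the repository root. A minimal sketch of what such a file might look like for this repo is below; the field values are illustrative assumptions, not the actual file — use the metadata GitHub/Zenodo shows for the real citation.

```yaml
# Hypothetical CITATION.cff sketch (Citation File Format 1.2.0).
# The actual file and DOI in the repo may differ.
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "nanoT5"
authors:
  - family-names: "Nawrot"
    given-names: "Piotr"
repository-code: "https://github.com/PiotrNawrot/nanoT5"
license: Apache-2.0
```

With this file present on the default branch, GitHub renders the "Cite this repository" button and offers APA and BibTeX exports generated from these fields.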

dhairyadalal commented 1 year ago

Thanks for the clarification!

PiotrNawrot commented 1 year ago

I released a demo paper on arXiv and submitted it to the EMNLP conference. If you found the repo useful, you could update your citation :) Thanks!