Closed dhairyadalal closed 1 year ago
It's great to hear you like it :).
I plan to work on a demo paper about this repo once I have some spare time and will update it once it's ready.
Until then, you can cite my latest paper (also on efficient LLMs!):
```
@misc{nawrot2022dynamic,
  title={Efficient Transformers with Dynamic Token Pooling},
  author={Piotr Nawrot and Jan Chorowski and Adrian Łańcucki and Edoardo M. Ponti},
  year={2022},
  eprint={2211.09761},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
I used Zenodo to make the repository citable. You should see a "Cite this repository" button on the right-hand side of the repo page, on the main branch.
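For anyone wiring this up in their own repo: GitHub's "Cite this repository" button is driven by a `CITATION.cff` file in the repository root. The sketch below is a hypothetical example, not the actual file from this repo; the DOI and author fields are placeholders you would replace with your own Zenodo record.

```yaml
# CITATION.cff — hypothetical sketch; replace fields with your own metadata
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "nanoT5"
authors:
  - family-names: "Nawrot"
    given-names: "Piotr"
# doi: placeholder — use the DOI minted by your Zenodo-GitHub integration
doi: 10.5281/zenodo.0000000
repository-code: "https://github.com/PiotrNawrot/nanoT5"
license: Apache-2.0
```

Once the file is committed to the default branch, GitHub renders the citation widget automatically and offers both APA and BibTeX formats.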
Thanks for the clarification!
I released a demo paper on arXiv and submitted it to the EMNLP conference. If you found the repo useful, you could update your citation : ) Thanks!
Thanks for sharing your code and optimizations for training T5. What's the best way to cite this repo?