mit-han-lab/streaming-llm
[ICLR 2024] Efficient Streaming Language Models with Attention Sinks
https://arxiv.org/abs/2309.17453
MIT License
Issue #4 (Open): Added requirements.txt with pinned package versions
Opened by KarimJedda 1 year ago
KarimJedda commented 1 year ago:
Works under Python 3.10 as well.
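The issue title refers to a `requirements.txt` with pinned versions, whose actual contents are not shown in this thread. As a hypothetical sketch only (package names and version numbers below are illustrative assumptions, not the ones from this PR), a pinned requirements file looks like:

```
# requirements.txt — hypothetical example of exact version pinning;
# these packages and versions are illustrative, not from PR #4
torch==2.0.1
transformers==4.33.0
accelerate==0.22.0
sentencepiece==0.1.99
```

A common workflow is to generate such a file from a working environment with `pip freeze > requirements.txt` and reproduce the environment elsewhere with `pip install -r requirements.txt`. Exact pins (`==`) maximize reproducibility at the cost of missing compatible upgrades.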
kustomzone commented 1 year ago:
Typo in the README: setyp.py → setup.py. Works under Python 3.10 as well.