RulinShao / LightSeq

Official repository for LightSeq: Sequence Level Parallelism for Distributed Training of Long Context Transformers

pip requirements #11

Open kuangdao opened 1 month ago

kuangdao commented 1 month ago

Could you provide pip requirements for this project?

AshkanFeyzollahi commented 1 month ago

I have generated a requirements.txt file for this project. I hope the pinned versions below cover everything you need to run it:

einops==0.8.0
flash_attn==2.5.9.post1
numpy==1.25.0
pytest==8.2.2
torch==2.1.0
tqdm==4.65.2
transformers==4.34.0
triton==2.3.1
xformers==0.0.26.post1
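
Assuming the list above is saved as a requirements.txt file in the repository root, it can be installed in the usual way (note that flash_attn generally expects torch to be installed first, so installing in two steps may be safer):

```shell
# Install torch first, since flash_attn builds against it
pip install torch==2.1.0
# Then install the remaining pinned dependencies
pip install -r requirements.txt
```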

I am also going to open a pull request for this.