InternLM / lmdeploy

LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
https://lmdeploy.readthedocs.io/en/latest/
Apache License 2.0

feat: support llama2 and internlm2 on 910B #1889

Open yao-fengchen opened 2 days ago

yao-fengchen commented 2 days ago

support llama2 and internlm2 on 910B

yao-fengchen commented 2 days ago

> I noticed that deeplink_ext is used in the kernels. Should we add it to the requirements.txt file? I can't find it on https://pypi.org/search/?q=deeplink_ext.

We have not yet released the torch_dipu and deeplink_ext packages on PyPI. I will provide instructions for setting up an environment on the 910B later, and we will also open an issue for publishing the torch_dipu and deeplink_ext packages on PyPI in the next version.
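Until deeplink_ext is published on PyPI, one common pattern for a dependency like this is to treat it as optional at import time and fail with a clear message only when the 910B kernels are actually used. A minimal sketch (the guard function name and error text are illustrative, not LMDeploy's actual API):

```python
# Hypothetical sketch: guard an optional dependency that is not yet on PyPI.
try:
    import deeplink_ext  # not published on PyPI; installed from source on 910B
    HAS_DEEPLINK_EXT = True
except ImportError:
    deeplink_ext = None
    HAS_DEEPLINK_EXT = False


def require_deeplink_ext():
    """Raise a descriptive error if deeplink_ext is missing."""
    if not HAS_DEEPLINK_EXT:
        raise RuntimeError(
            "deeplink_ext is required for the Ascend 910B kernels. "
            "It is not on PyPI yet; please install it from source."
        )
```

This keeps `pip install lmdeploy` working on machines without the Ascend toolchain, while giving 910B users an actionable error instead of a bare `ImportError`.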