yao-fengchen opened 2 days ago
I noticed that deeplink_ext is used in the kernels. Should we add it to the
requirements.txt
file? However, I can't find it on PyPI: https://pypi.org/search/?q=deeplink_ext
We have not yet released the torch_dipu and deeplink_ext packages on PyPI. I will provide instructions for setting up an environment on the 910B later, and we will also open an issue to publish torch_dipu and deeplink_ext on PyPI in the next version.
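Until the packages are published on PyPI, one interim option is to reference the source repositories directly in requirements.txt using pip's VCS/direct-reference syntax. A minimal sketch follows; the repository URLs are placeholders, not confirmed locations for these projects:

```text
# Hypothetical requirements.txt entries -- the URLs below are assumptions;
# replace them with the actual torch_dipu and deeplink_ext repositories.
torch_dipu @ git+https://github.com/DeepLink-org/deeplink.framework.git
deeplink_ext @ git+https://github.com/DeepLink-org/DeepLinkExt.git
```

Note that building torch_dipu from source typically also requires the matching device toolchain on the 910B, so a plain `pip install -r requirements.txt` may not be sufficient on its own.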
support llama2 and internlm2 on 910B