intel / llm-on-ray

Pretrain, fine-tune, and serve LLMs on Intel platforms with Ray
Apache License 2.0

[Inference] Install intel-oneapi-ccl-devel-2021.12 and intel-oneapi-compiler-dpcpp-cpp-runtime-2024.1 via apt-get to fix a DeepSpeed inference error. #263
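The fix described in the title can be sketched as the following apt-get commands. This assumes the Intel oneAPI apt repository has already been added to the system (per Intel's oneAPI installation guide); the package names and versions are taken directly from the PR title.

```shell
# Assumption: the Intel oneAPI apt repository is already configured
# on this machine. Refresh package metadata, then install the pinned
# CCL development package and the DPC++/C++ compiler runtime that
# resolve the DeepSpeed inference error.
sudo apt-get update
sudo apt-get install -y \
    intel-oneapi-ccl-devel-2021.12 \
    intel-oneapi-compiler-dpcpp-cpp-runtime-2024.1
```

Pinning both packages to specific versions keeps the CCL headers and the DPC++ runtime in step, which matters because DeepSpeed's CPU/XPU inference path links against both at matching ABI versions.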

Closed: minmingzhu closed this 4 months ago