Open yangjq713 opened 1 month ago
When I set up a LongLLM training environment on an NPU device, the flash-attention dependency cannot be installed. Can the LongLLM training scripts be used on NPU?
Currently no. You can try using easy_context together with our training data.