FlagOpen / FlagEmbedding

Retrieval and Retrieval-augmented LLMs
MIT License

How to implement LongLLM in NPU device #957

Open yangjq713 opened 1 month ago

yangjq713 commented 1 month ago

When I set up a LongLLM training environment on an NPU device, the flash-attention dependency cannot be installed. Can the LongLLM training scripts be used on NPU?
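As a generic workaround sketch (not an official LongLLM feature), one common pattern when flash-attn cannot be built on a device is to detect at runtime whether the package is importable and fall back to a standard attention implementation; the helper name `pick_attn_implementation` below is hypothetical:

```python
# Hedged sketch: select an attention implementation depending on whether
# the flash-attn package is installed (it typically cannot be built on NPUs).
import importlib.util


def pick_attn_implementation() -> str:
    """Return "flash_attention_2" if flash-attn is importable, else "eager"."""
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "eager"


attn_impl = pick_attn_implementation()
print(attn_impl)
```

The returned string could then be passed as the `attn_implementation` argument when loading a Hugging Face Transformers model, so the same script runs both on GPUs with flash-attn and on devices without it.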

namespace-Pt commented 1 month ago

Currently no. You can try easy_context together with our training data.