Traceback (most recent call last):
  File "e:\llm\TinyLlama\pretrain\tinyllama.py", line 17, in <module>
    from lit_gpt.model import GPT, Block, Config, CausalSelfAttention
  File "E:\llm\TinyLlama\lit_gpt\__init__.py", line 1, in <module>
    from lit_gpt.model import GPT
  File "E:\llm\TinyLlama\lit_gpt\model.py", line 13, in <module>
    from flash_attn import flash_attn_func
ModuleNotFoundError: No module named 'flash_attn'
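The failure itself is a missing dependency: `lit_gpt/model.py` unconditionally imports `flash_attn`, which is not installed in this environment. The usual remedy is installing the PyPI package `flash-attn`, though its prebuilt CUDA kernels have limited Windows support. As an alternative, assuming the surrounding model code can fall back to standard PyTorch attention (an assumption, not something the traceback confirms), the hard import could be guarded, as in this sketch:

```python
# Sketch only, not TinyLlama's actual code: make the flash_attn import
# optional so environments without the package can still load the module.
try:
    from flash_attn import flash_attn_func  # fused attention kernel (CUDA)
except ModuleNotFoundError:
    # Package is missing; callers must detect this and dispatch to a
    # plain-PyTorch attention fallback instead.
    flash_attn_func = None
```

Any call site would then need to check `flash_attn_func is not None` and route to e.g. `torch.nn.functional.scaled_dot_product_attention` when the fused kernel is unavailable.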