jzhang38 / TinyLlama

The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Apache License 2.0

Help me pls #168

Open · aritralegndery opened this issue 3 months ago

aritralegndery commented 3 months ago

Traceback (most recent call last):
  File "e:\llm\TinyLlama\pretrain\tinyllama.py", line 17, in <module>
    from lit_gpt.model import GPT, Block, Config, CausalSelfAttention
  File "E:\llm\TinyLlama\lit_gpt\__init__.py", line 1, in <module>
    from lit_gpt.model import GPT
  File "E:\llm\TinyLlama\lit_gpt\model.py", line 13, in <module>
    from flash_attn import flash_attn_func
ModuleNotFoundError: No module named 'flash_attn'

jzhang38 commented 3 months ago

Follow the instructions at https://github.com/Dao-AILab/flash-attention to install flash attn.
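For reference, a typical install on a machine that already has a CUDA-enabled PyTorch looks roughly like the sketch below; the exact prerequisites and supported platforms are listed in the flash-attention README, and since the traceback above shows a Windows path, note that the package is primarily targeted at Linux, so WSL or a source build may be needed there.

```
# Sketch of a typical flash-attn install, assuming a CUDA-enabled PyTorch is already present
# (see the flash-attention README for the exact requirements for your GPU / CUDA version).
pip install packaging ninja
pip install flash-attn --no-build-isolation

# Quick smoke test that the module now resolves:
python -c "from flash_attn import flash_attn_func"
```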

aritralegndery commented 3 months ago

> Follow the instructions at https://github.com/Dao-AILab/flash-attention to install flash attn.

Thank You Sir.