Hi @wenhui-huang, the issue is caused by the patch script not having been pushed to the repo. You can pull the latest changes and try again.
Note: with the recent updates to the transformers package, flash attention 2 support is now built in. I will be rolling out an upgrade in the coming days.
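For reference, here is a minimal sketch of how the built-in flash attention 2 support can be enabled in recent transformers releases; the model id and dtype are illustrative assumptions, not this repo's defaults:

```python
# Minimal sketch: enabling built-in flash attention 2 in recent transformers
# releases (requires the flash-attn package and a supported GPU).
# The model id and dtype below are assumptions for illustration only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # assumed model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,               # flash attention 2 needs fp16/bf16
    attn_implementation="flash_attention_2",  # built-in flag in recent transformers
)
```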
I used 'run_pt.sh' to pretrain llama2-7b. Below is the error message I encountered. Could it be because the required dependencies are not installed?
pip list: