hpcaitech / ColossalAI

Making large AI models cheaper, faster and more accessible
https://www.colossalai.org
Apache License 2.0

[BUG]: No module named 'dropout_layer_norm' #5726

Open apachemycat opened 4 months ago

apachemycat commented 4 months ago

Is there an existing issue for this bug?

🐛 Describe the bug

ModuleNotFoundError: No module named 'dropout_layer_norm'
[2024-05-17 03:23:11,932] torch.distributed.elastic.multiprocessing.api: [ERROR] failed (exitcode: 1) local_rank: 0 (pid: 615) of binary: /usr/bin/python

dropout_layer_norm has been deprecated by flash_attn, so is there any other choice?
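A minimal sketch of one common workaround (not an official ColossalAI fix; the `HAS_FUSED_LN` flag is illustrative) is to guard the optional import and fall back to plain PyTorch ops when the fused kernel is absent:

```python
# Hypothetical import guard: dropout_layer_norm is an optional CUDA
# extension shipped with flash-attn; fall back when it is not installed.
try:
    import dropout_layer_norm  # noqa: F401
    HAS_FUSED_LN = True
except ImportError:
    dropout_layer_norm = None
    HAS_FUSED_LN = False  # callers should take the unfused PyTorch path
```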

Environment

No response

duanjunwen commented 3 months ago

Hi @apachemycat, would you mind sharing the version of flash-attn in your environment? I am using flash-attn==2.5.7 and everything looks fine. Also, you can replace dropout_layer_norm with torch.nn.functional.layer_norm and dropout, although kernel acceleration may not be supported in that case; see the sketch below.
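A minimal sketch of that replacement, assuming the fused op computes dropout(x) + residual followed by layer norm (the function name and signature below only loosely mirror flash-attn's fused API and are assumptions):

```python
import torch
import torch.nn.functional as F

def dropout_add_layer_norm(x, residual, weight, bias, dropout_p, epsilon,
                           training=True):
    """Unfused stand-in for flash-attn's fused dropout + add + layer norm."""
    out = F.dropout(x, p=dropout_p, training=training)
    if residual is not None:
        out = out + residual  # residual add before normalization
    # Normalize over the last (hidden) dimension, as layer norm does.
    return F.layer_norm(out, (out.shape[-1],), weight, bias, epsilon)
```

Being pure torch.nn.functional calls, this runs on any backend, but it launches several kernels where the fused extension launched one, so some throughput loss is expected.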

zhurunhua commented 2 months ago

watching...