Open Luke20000429 opened 2 weeks ago
The `LlamaFlashAttention2` class has no `__init__()`. As a result, running the system with flash-attn crashes. https://github.com/opendilab/LMDrive/blob/ae0643d071b08b38cd326314bcbfb44aa4e60232/LAVIS/lavis/models/blip2_models/modeling_llama.py#L415
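A minimal sketch of the kind of fix this likely needs (not the actual LMDrive code): a subclass without its own `__init__()` inherits the parent's, so any flash-attn-specific attribute the forward pass reads will be missing unless the subclass defines one. The class names mirror the file above, but the attribute name and stand-in parent here are illustrative assumptions.

```python
class LlamaAttention:
    """Stand-in for the real parent attention class."""
    def __init__(self, config=None):
        self.config = config


class LlamaFlashAttention2(LlamaAttention):
    # Hypothetical fix: forward construction to the parent, then set the
    # flash-attn-specific state the forward pass expects. Without this
    # __init__, accessing such an attribute raises AttributeError at runtime.
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Attribute name is illustrative; recent transformers versions set a
        # similar flag to handle flash-attn mask-convention differences.
        self._flash_attn_uses_top_left_mask = False


attn = LlamaFlashAttention2(config={"hidden_size": 4096})
print(attn.config)                            # parent state preserved
print(attn._flash_attn_uses_top_left_mask)    # subclass state now exists
```

The same pattern (`super().__init__(*args, **kwargs)` plus subclass-specific attributes) is what the upstream transformers implementation of this class does.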