WZH0120 / SAM2-UNet

SAM2-UNet: Segment Anything 2 Makes Strong Encoder for Natural and Medical Image Segmentation

On the question of Flash Attention #8

Closed: alan-pro closed this issue 2 months ago

alan-pro commented 2 months ago

> UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at C:\actions-runner_work\pytorch\pytorch\builder\windows\pytorch\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:263.)
>   x = F.scaled_dot_product_attention(

This seems to be a problem with Flash Attention. I am running on a Windows system with an RTX 4070 GPU. Do you know how to solve it?

xiongxyowo commented 2 months ago

Flash Attention is enabled in the Transformer decoder of Segment Anything 2 to speed up attention computation. However, since SAM2-UNet removes SAM2's decoder, you can safely ignore this warning. Our code also disables Flash Attention by default.
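For anyone who still wants to make the warning go away on a build without Flash Attention, here is a minimal sketch using PyTorch's public SDPA backend toggles. This is generic PyTorch usage, not code from this repository, and the tensor shapes are made up purely for illustration:

```python
# Sketch: disable the flash kernel so PyTorch's scaled_dot_product_attention
# dispatches to the memory-efficient or math kernels instead. These fallbacks
# compute the same result, just without the flash-attention speedup.
import torch
import torch.nn.functional as F

torch.backends.cuda.enable_flash_sdp(False)        # skip the flash kernel
torch.backends.cuda.enable_mem_efficient_sdp(True) # keep fallbacks enabled
torch.backends.cuda.enable_math_sdp(True)

# Toy tensors just to exercise the attention call: (batch, heads, seq, dim).
# Requires a CUDA GPU (e.g. the RTX 4070 mentioned above).
q = torch.randn(1, 8, 64, 32, device="cuda")
k = torch.randn(1, 8, 64, 32, device="cuda")
v = torch.randn(1, 8, 64, 32, device="cuda")

# With the flash kernel disabled, the "not compiled with flash attention"
# warning should no longer be triggered.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 64, 32])
```

Alternatively, since the message is only a `UserWarning`, it can also be suppressed with Python's standard `warnings.filterwarnings` mechanism without touching the attention backends.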

alan-pro commented 2 months ago

Ok, thanks