buaacyw / MeshAnything

From anything to mesh like human artists. Official impl. of "MeshAnything: Artist-Created Mesh Generation with Autoregressive Transformers"
https://buaacyw.github.io/mesh-anything/

Are there any plans to support flash-attn 1? #25

mateiradu88 opened this issue 4 months ago

mateiradu88 commented 4 months ago

Are there any plans to support flash-attn 1? This is a serious limitation at the moment for anyone still running Turing architecture, as flash-attn 2 does not support Turing.

mateiradu88 commented 4 months ago

To anyone struggling with this issue on Turing: simply disabling the flash-attn 2 checks in the code seems to work just fine! You will then be able to run on flash-attn 1 without errors, and the results look similar, if not identical, to the Hugging Face demo.
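Before patching anything, it can help to confirm which flash-attn major version is actually installed. A generic sketch (the helper names here are illustrative, not part of the MeshAnything repo):

```python
# Check the installed flash-attn major version without importing the
# package itself (importing flash_attn can fail on unsupported GPUs).
from importlib.metadata import version, PackageNotFoundError

def parse_major(ver: str) -> int:
    """Extract the major version from a version string like '2.5.8'."""
    return int(ver.split(".")[0])

def flash_attn_major():
    """Return the installed flash-attn major version, or None if absent."""
    try:
        return parse_major(version("flash-attn"))
    except PackageNotFoundError:
        return None

major = flash_attn_major()
if major is None:
    print("flash-attn is not installed")
else:
    print(f"flash-attn major version: {major}")
```

If this prints `1`, you are in the situation discussed in this thread.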

thomashay commented 3 months ago

@mateiradu88 How exactly did you do this? I tried `config=self.config, use_flash_attention_2=False` in line 115 of meshanything.py, but that results in errors later from shape_opt.py.

mateiradu88 commented 3 months ago

I have since deleted the files, but I found an if condition checking for flash-attention version 2; deleting the condition and running that branch unconditionally worked like a charm. As far as I remember, no other modifications were needed.

thomashay commented 3 months ago

Also worked for me, thanks! For anyone else: I just removed the if/else in line 347 of shape_opt.py (only the if and else statements themselves, not the body of the branch, so just lines 347, 356, and 357) and could then run the point cloud command line inference test.
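The edit described above amounts to deleting a version gate so its guarded branch always runs. A minimal sketch of the pattern, with hypothetical names (`build_attn_impl`, `supports_flash_attn_2` are illustrative, not the actual identifiers in shape_opt.py):

```python
def build_attn_impl(supports_flash_attn_2: bool) -> str:
    # Original-style gate: refuses the flash-attention path unless
    # flash-attn 2 is available, which blocks Turing GPUs entirely.
    if supports_flash_attn_2:
        return "flash_attention"
    else:
        raise RuntimeError("flash-attn 2 required")

def build_attn_impl_patched(supports_flash_attn_2: bool) -> str:
    # Patched version: the if and else lines are deleted and the body
    # of the flash-attention branch runs unconditionally, so a
    # flash-attn 1 install on Turing also takes this path.
    return "flash_attention"
```

Note this only removes the *check*; whether the branch body actually works under flash-attn 1 depends on the kernels it calls, which per the comments above happened to behave the same here.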