HazyResearch / fly


Triton blocksparse matmul #4

Open justheuristic opened 2 years ago

justheuristic commented 2 years ago

Hi! First of all, thanks for an awesome paper :)

Can you please help me with an implementation question?

In blocksparse_linear.py, there is an import statement: https://github.com/HazyResearch/pixelfly/blob/7c3f233cd3b1b165ba66942e38eee3702aafea8d/src/models/modules/layers/blocksparse_linear.py#L17-L21

However, the file it attempts to import appears to be missing from the models.modules.attention package: https://github.com/HazyResearch/pixelfly/tree/7c3f233cd3b1b165ba66942e38eee3702aafea8d/src/models/modules/attention

As a result, I'm getting "triton not supported" even though I have triton installed. What is the best way to work around this?
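
For reference, here is a minimal sketch of the guarded-import pattern I assume blocksparse_linear.py uses (the module and flag names below are illustrative, not copied from the repo). If this is roughly the structure, the except clause catches the ImportError raised by the missing local module, so the Triton code path gets disabled even when triton itself imports fine:

```python
# Minimal sketch of a guarded import (illustrative names, not the repo's actual code).
try:
    # If this local helper module is missing, the import fails here even
    # though the `triton` package itself is installed and importable.
    from src.models.modules.attention.blocksparse_matmul import blocksparse_matmul
    HAS_TRITON_BLOCKSPARSE = True
except ImportError:
    # The error is swallowed, so the layer later reports that Triton is
    # not supported instead of pointing at the missing file.
    HAS_TRITON_BLOCKSPARSE = False
```

If that is indeed what is happening, installing triton alone would not flip the flag; the missing file in models.modules.attention would have to be restored (or the import redirected) for the blocksparse path to activate.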