idiap / fast-transformers

PyTorch library for fast transformer implementations

How to step into linear_attention.py/forward function? #57

Closed Yogurt928 closed 3 years ago

Yogurt928 commented 3 years ago

Thanks to the author for sharing. After installation, I can run the quickstart sample code successfully. To better understand the framework, I want to watch the data change in debug mode. But in debug mode, the program does not stop at the breakpoint I set in linear_attention.py/forward:

```
(pytorch36) xxx:~/Code/fast-transformers/tests$ python -m pdb quickstart.py
> /Code/fast-transformers/tests/quickstart.py(1)<module>()
-> import torch
(Pdb) b fast_transformers/attention/linear_attention.py:56
Breakpoint 1 at /.local/lib/python3.6/site-packages/fast_transformers/attention/linear_attention.py:56
(Pdb) c
Softmax: 456.4417724609375 ms
Linear: 259.7819519042969 ms
The program finished and will be restarted
```

I also tried VS Code on my local Windows machine for remote debugging, but it cannot step into linear_attention.py/forward() either.
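(For reference, one path-independent way to break inside an installed library is to wrap the method programmatically instead of setting a file:line breakpoint. The helper below is my own sketch, not part of fast-transformers; the commented import assumes the package layout of the installed library.)

```python
import functools

def trace_calls(fn):
    """Wrap a function so every call announces itself (and can drop into pdb)."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        print(f"entering {fn.__qualname__}")
        # import pdb; pdb.set_trace()  # uncomment to break here on every call
        return fn(*args, **kwargs)
    return wrapper

# Hypothetical usage -- adjust the import to your installed package layout:
# from fast_transformers.attention.linear_attention import LinearAttention
# LinearAttention.forward = trace_calls(LinearAttention.forward)
```

If the "entering ..." line never prints, the wrapped forward is simply never called, which also explains why a breakpoint there never fires.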

Any suggestion about it?

Thanks!

angeloskath commented 3 years ago

Hi,

The quickstart code does not use linear attention; it builds the model with attention_type="full". Change that to attention_type="linear" and your breakpoint in linear_attention.py will be hit.
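For context on what that forward computes once it does run: linear attention replaces the softmax with a positive feature map, so attention can be evaluated in time linear in the sequence length. Below is a rough NumPy sketch of the idea (my own simplified, non-causal, single-head version, not the library's code; `elu_feature_map` mirrors the elu(x)+1 default from the "Transformers are RNNs" paper):

```python
import numpy as np

def elu_feature_map(x):
    # elu(x) + 1: a positive feature map phi(x)
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V, eps=1e-6):
    # Softmax-free attention: phi(Q) @ (phi(K).T @ V), normalized by
    # phi(Q) @ sum_i phi(K_i).  Cost is O(L * D * Dv), linear in L.
    Qp = elu_feature_map(Q)            # (L, D)
    Kp = elu_feature_map(K)            # (L, D)
    KV = Kp.T @ V                      # (D, Dv), computed once
    Z = Qp @ Kp.sum(axis=0) + eps      # (L,) normalizer
    return (Qp @ KV) / Z[:, None]
```

This is algebraically the same as forming the full attention matrix A = phi(Q) phi(K)^T, row-normalizing it, and multiplying by V, just without ever materializing the L-by-L matrix.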

I am closing the issue but feel free to reopen it if the above does not solve your problem.

Happy holidays, Angelos