idiap / fast-transformers

Pytorch library for fast transformer implementations
1.65k stars 179 forks

Error about `causal_product_cpu.cpython-38-darwin.so` on Mac #124

Open XiaoqZhang opened 1 year ago

XiaoqZhang commented 1 year ago

Hi, I am installing the package on a Mac with an M1 chip. I am using python=3.8.10 and torch=2.0.1. I tried installing pytorch-fast-transformers with pip install --user pytorch-fast-transformers and also by building from source. However, whenever I run from fast_transformers.attention import AttentionLayer, it fails with this error:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/xiaoqi/Documents/projects/proj_mol2mof/MoLFORMER/fast-transformers/fast_transformers/attention/__init__.py", line 13, in <module>
    from .causal_linear_attention import CausalLinearAttention
  File "/Users/xiaoqi/Documents/projects/proj_mol2mof/MoLFORMER/fast-transformers/fast_transformers/attention/causal_linear_attention.py", line 15, in <module>
    from ..causal_product import causal_dot_product
  File "/Users/xiaoqi/Documents/projects/proj_mol2mof/MoLFORMER/fast-transformers/fast_transformers/causal_product/__init__.py", line 9, in <module>
    from .causal_product_cpu import causal_dot_product as causal_dot_product_cpu, \
ImportError: dlopen(/Users/xiaoqi/Documents/projects/proj_mol2mof/MoLFORMER/fast-transformers/fast_transformers/causal_product/causal_product_cpu.cpython-38-darwin.so, 0x0002): symbol not found in flat namespace '___kmpc_for_static_fini'

Could you please help me look into what the problem could be? Thank you in advance!
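For context, the missing `___kmpc_*` symbols belong to the LLVM OpenMP runtime (libomp), so the extension was compiled with OpenMP support but the runtime library is not found when the `.so` is loaded. While the compiled kernel is unavailable, the computation it implements can be sketched in plain NumPy as a slow reference (this is an illustrative reimplementation, not the library's optimized kernel; the function names here are made up):

```python
import numpy as np

def causal_dot_product_ref(Q, K, V):
    """Naive O(L^2) reference of the causal product:
    out[i] = sum over j <= i of (Q[i] . K[j]) * V[j]."""
    L, M = Q.shape[0], V.shape[1]
    out = np.zeros((L, M))
    for i in range(L):
        for j in range(i + 1):
            out[i] += (Q[i] @ K[j]) * V[j]
    return out

def causal_dot_product_linear(Q, K, V):
    """O(L) formulation used by linear attention: maintain the
    running sum S = sum over j <= i of outer(K[j], V[j]), then
    out[i] = Q[i] @ S."""
    D, M = K.shape[1], V.shape[1]
    S = np.zeros((D, M))
    out = np.empty((Q.shape[0], M))
    for i in range(Q.shape[0]):
        S += np.outer(K[i], V[i])
        out[i] = Q[i] @ S
    return out
```

Both functions compute the same quantity; the second one shows why causal linear attention runs in time linear in the sequence length.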

rezacopol commented 1 year ago

Same issue here.

indra-ipd commented 7 months ago

Any update on whether this issue was resolved? I get the same error when I try from fast_transformers.attention import AttentionLayer.