lucidrains / sinkhorn-transformer

Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention
MIT License

No module named 'local_attention' #26

Closed: asigalov61 closed this issue 3 years ago

asigalov61 commented 3 years ago

Hey @lucidrains

I was trying to run the simple enwik8 example in Google Colab, and it failed with the traceback below:

Do I have to import something extra? Can you help, please?

Thanks.


ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-...> in <module>()
----> 1 from sinkhorn_transformer import SinkhornTransformerLM
      2
      3 from sinkhorn_transformer.autoregressive_wrapper import AutoregressiveWrapper
      4
      5 import random

1 frames
/usr/local/lib/python3.7/dist-packages/sinkhorn_transformer/__init__.py in <module>()
----> 1 from sinkhorn_transformer.sinkhorn_transformer import SinkhornTransformer, SinkhornTransformerLM, SinkhornSelfAttention
      2 from sinkhorn_transformer.autoregressive_wrapper import AutoregressiveWrapper
      3 from sinkhorn_transformer.autopadder import Autopadder

/usr/local/lib/python3.7/dist-packages/sinkhorn_transformer/sinkhorn_transformer.py in <module>()
      8 from functools import partial, wraps, reduce
      9
---> 10 from local_attention import LocalAttention
     11 from axial_positional_embedding import AxialPositionalEmbedding
     12 from product_key_memory import PKM

ModuleNotFoundError: No module named 'local_attention'
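For anyone hitting the same error before a fixed release is out, one stop-gap (a sketch, assuming the missing module corresponds to the standalone local-attention package on PyPI, as the neighbouring axial_positional_embedding and product_key_memory imports suggest) is to install the dependency manually in a Colab cell and retry the failing import:

```python
# Stop-gap sketch: install the missing dependency directly (PyPI name assumed),
# then retry the import that failed in the traceback above.
!pip install local-attention

from sinkhorn_transformer import SinkhornTransformerLM  # should now resolve
```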
lucidrains commented 3 years ago

@asigalov61 oh crap! fixed in 0.11.3 🙏 https://github.com/lucidrains/sinkhorn-transformer/commit/76de0f4a48f804058efbb80d8f4f8a8630a4fa0b
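For reference, picking up the fix should just be a package upgrade (a sketch, assuming 0.11.3 is published to PyPI as the commit above implies), followed by re-running the example's imports:

```python
# Upgrade to the release that declares local_attention as a dependency,
# then re-run the imports from the example to confirm they resolve.
!pip install -U sinkhorn_transformer

from sinkhorn_transformer import SinkhornTransformerLM
from sinkhorn_transformer.autoregressive_wrapper import AutoregressiveWrapper
```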

asigalov61 commented 3 years ago

@lucidrains Thanks a lot, bro! :) I will test it and let you know if I run into any issues. Thanks.