The-AI-Summer / self-attention-cv

Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
https://theaisummer.com/
MIT License

AxialAttentionBlock : Doesn't work on gpu #3

Closed: sahilrider closed this issue 3 years ago

sahilrider commented 3 years ago

The code currently only supports CPU. I tried running it on GPU, but it gives the following error in the relative_pos_enc_qkv.py file. I tried making some changes to move the inputs to the right device, but it's still not working.

/usr/local/lib/python3.7/dist-packages/self_attention_cv/pos_embeddings/relative_pos_enc_qkv.py in forward(self)
     36 
     37     def forward(self):
---> 38         all_embeddings = torch.index_select(self.relative, 1, self.relative_index_2d)  # [head_planes , (dim*dim)]
     39 
     40         all_embeddings = rearrange(all_embeddings, ' c (x y)  -> c x y', x=self.dim)

RuntimeError: Input, output and indices must be on the current device
black0017 commented 3 years ago

Hello!

here is the update: https://github.com/The-AI-Summer/self-attention-cv/blob/main/self_attention_cv/pos_embeddings/relative_pos_enc_qkv.py#L38

i added this line, which solves the problem:

rel_indx = self.relative_index_2d.to(self.relative.device)
all_embeddings = torch.index_select(self.relative, 1, rel_indx)
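For anyone hitting the same error elsewhere: the root cause is that an index tensor stored as a plain attribute does not follow the module when you call `.to(device)` or `.cuda()`, so `torch.index_select` ends up mixing CPU and GPU tensors. Here is a minimal standalone sketch of the pattern (the class and shapes are hypothetical, not the library's actual module):

```python
import torch

class RelPosEncSketch(torch.nn.Module):
    """Toy module illustrating the device mismatch and its fix."""

    def __init__(self, dim=4, heads=2):
        super().__init__()
        # Parameters DO move with .to(device) / .cuda()
        self.relative = torch.nn.Parameter(torch.randn(heads, 2 * dim - 1))
        # A plain tensor attribute does NOT move with the module,
        # which is what triggers the RuntimeError on GPU
        self.relative_index_2d = torch.randint(0, 2 * dim - 1, (dim * dim,))

    def forward(self):
        # The fix: move the index tensor to wherever the parameters live
        rel_indx = self.relative_index_2d.to(self.relative.device)
        return torch.index_select(self.relative, 1, rel_indx)

enc = RelPosEncSketch()
out = enc()  # shape [heads, dim*dim]; also works after enc.cuda()
```

An alternative design is to register the index tensor with `self.register_buffer("relative_index_2d", ...)`, since buffers are moved automatically by `.to(device)` and saved in the state dict.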

feel free to reopen the issue if you find anything suspicious.