Closed DanieleMenchetti closed 2 years ago
Hi,
It is caused by the multi-head attention mechanism: 512 / 6 = 85.33, while 512 // 6 = 85, so the heads cannot evenly split the embedding. You need to make sure that dim is divisible by num_heads.
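A minimal sketch of the constraint, assuming the usual head-splitting logic in multi-head attention (the function name `check_heads` is illustrative, not from the repository):

```python
def check_heads(dim, num_heads):
    """Return the per-head dimension, or raise if dim cannot be split evenly."""
    head_dim = dim // num_heads
    if head_dim * num_heads != dim:
        # e.g. dim=512, num_heads=6: 512 // 6 = 85, but 85 * 6 = 510 != 512,
        # which is what triggers the shape mismatch inside attention.
        raise ValueError(
            f"dim={dim} is not divisible by num_heads={num_heads}"
        )
    return head_dim

print(check_heads(512, 8))  # 64: each of the 8 heads gets 64 dimensions
```

So with dim=512 you could use 2, 4, 8, 16, ... heads; with 6 heads you would need a dim such as 510 or 516 instead.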
Thank you so much!
Hi, thank you for your work on point clouds. Could I ask how to change the embedding module's output size? I set encoder_dims and transf_dim to 512, but I'm getting a shape error in the attention class (image below). Is there anything else I should edit?
Looking for your reply, Daniele