Closed bbussolino closed 4 years ago
Hi @beatricebussolino. I am really sorry for the late reply. You are right, the axis was incorrect in the code, and thank you very much for noticing it. I have patched the code (line 352 in capslayers.py). Hope this fixes the issue.
Hi @brjathu, thanks for your great work and for providing the code. I am writing about a doubt regarding the dimension along which the squash function is applied.
From capslayers.py lines 579/585, I understand that the norm of the tensor is computed along the last dimension (axis = -1).
In capslayers.py line 351, is it correct that the activations have shape [batch_size, # of output capsules, # of neurons per output capsule, height, width]? If so, squashing along axis=-1 would compute the norm along the width dimension rather than along the neurons of each capsule.
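For reference, here is a minimal NumPy sketch (not the repository's code) of how the choice of axis changes what gets normalized, assuming the [batch_size, capsules, neurons, height, width] layout described above:

```python
import numpy as np

def squash(x, axis=-1, eps=1e-8):
    """Squash non-linearity: rescales vectors along `axis` so their
    norm lies in [0, 1), preserving direction."""
    sq_norm = np.sum(x ** 2, axis=axis, keepdims=True)
    norm = np.sqrt(sq_norm + eps)  # eps avoids division by zero
    return (sq_norm / (1.0 + sq_norm)) * (x / norm)

# Hypothetical activations: 2 samples, 8 capsules of 16 neurons, 6x6 grid.
x = np.random.randn(2, 8, 16, 6, 6)

# axis=-1 normalizes over the width (size 6), mixing spatial positions.
out_width = squash(x, axis=-1)

# axis=2 normalizes over the 16 capsule neurons, which is what the
# squash is meant to do under this layout.
out_neurons = squash(x, axis=2)

# After squashing along the neuron axis, every capsule's length is < 1.
lengths = np.linalg.norm(out_neurons, axis=2)
assert np.all(lengths < 1.0)
```

Under this layout, axis=2 (the neuron dimension) is the one that gives each capsule vector a length in [0, 1); axis=-1 silently normalizes across spatial width instead.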
Could you please explain whether I am missing something in the code? Thank you very much.