I tried to create a somewhat unconventional SPDnet, where the size of SPD matrix first diminishes, and then increases back to original, in the manner of U-nets used for image segmentation. Here's my net that does 52x52->36x36->52x52:
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.trans1 = SPDTransform(52, 36)  # first reduce the dimension
        self.trans2 = SPDTransform(36, 52)  # then expand back to the original size
        self.relu1 = nn.ReLU()
        self.relu2 = nn.ReLU()

    def forward(self, x):
        x = self.trans1(x)
        x = self.relu1(x)
        x = self.trans2(x)
        x = self.relu2(x)
        return x
When doing a forward pass with a batch of size 8x52x52, I get this error:
File "/home/anaconda3/lib/python3.6/site-packages/spdnet/spd.py", line 49, in forward
    output = torch.baddbmm(eye, torch.bmm(input, eye.transpose(1,2)), add)
RuntimeError: The expanded size of the tensor (52) must match the existing size (36) at non-singleton dimension 2. Target sizes: [8, 36, 52]. Tensor sizes: [8, 52, 36]
This happens in the forward part of SPDIncreaseDim.
Could you please point out what I'm doing wrong, or how I should change my code (or your source code) to get the correct behaviour? Thanks!
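For context, here is a minimal sketch (my own, not the spdnet source) of the shape arithmetic I'd expect for the expansion step. A common way to embed an n×n SPD matrix into an m×m one (m > n) is Y = E X Eᵀ with a map E of shape (m, n); the names E, X, Y are mine, and the error message suggests the library's internal matrices end up with the opposite orientation, (n, m) instead of (m, n), for this layer:

```python
import numpy as np

n, m, batch = 36, 52, 8

# Build a batch of SPD matrices X of shape (8, 36, 36)
rng = np.random.default_rng(0)
A = rng.standard_normal((batch, n, n))
X = A @ A.transpose(0, 2, 1) + n * np.eye(n)

# An (m, n) embedding map; shapes: (52, 36) @ (8, 36, 36) @ (36, 52) -> (8, 52, 52)
E = np.eye(m, n)
Y = E @ X @ E.T

print(Y.shape)  # (8, 52, 52)

# With the orientation flipped, E.T @ X would need (36, 52) @ (8, 36, 36),
# whose inner dimensions (52 vs 36) do not align -- the same kind of
# [8, 36, 52] vs [8, 52, 36] mismatch as in the traceback above.
```

Each output Y is symmetric positive semi-definite of rank n, so in practice such layers add a regularizing term (as the `baddbmm(eye, ...)` call in the traceback appears to do) to stay on the SPD manifold.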