When I add SPDRectified() to my model, the model's output is identical for every sample. After I remove that layer, the outputs are normal again (they differ per sample).
My code is as follows:
import scipy.io
import torch
import torch.nn as nn
from spdnet import SPDTransform, SPDRectified, SPDTangentSpace


class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.trans = SPDTransform(16, 10)      # 16x16 SPD input -> 10x10 SPD output
        self.rect = SPDRectified()             # the layer in question
        self.tangent = SPDTangentSpace(10)     # 10x10 SPD -> 55-dim tangent vector
        self.linear = nn.Linear(55, 2, bias=False)

    def forward(self, x):
        a = self.trans(x)
        b = self.rect(a)
        c = self.tangent(b)
        d = self.linear(c)
        return d


if __name__ == '__main__':
    data = scipy.io.loadmat('./db.mat')['data']
    data = torch.from_numpy(data).to(torch.float)
    model = Net()
    pred = model(data)
    print(pred)
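For context, my understanding is that SPDRectified applies ReEig-style rectification: it eigendecomposes each SPD matrix and clamps the eigenvalues at some threshold epsilon. The sketch below (plain NumPy; the function name `re_eig` and the `eps` parameter are my own, not the library's API) shows how this can make all outputs collapse to the same matrix: if the threshold is larger than the eigenvalues of the inputs, every sample gets rectified to roughly `eps * I`.

```python
import numpy as np

def re_eig(spd, eps=1e-4):
    """ReEig-style rectification: clamp the eigenvalues of an SPD
    matrix at eps, then reconstruct the matrix.
    (A sketch of what I believe SPDRectified does; the eps value
    and function name are assumptions, not the library's API.)"""
    w, v = np.linalg.eigh(spd)
    w = np.maximum(w, eps)
    return (v * w) @ v.T

# Two different SPD matrices whose eigenvalues all sit BELOW eps:
a = np.diag([1e-6, 2e-6])
b = np.diag([3e-6, 5e-6])

# After rectification both collapse toward eps * I, so they become
# indistinguishable -- matching the "identical output" symptom.
ra, rb = re_eig(a, eps=1e-4), re_eig(b, eps=1e-4)
print(np.allclose(ra, rb))  # True
```

If this is what is happening, it would point to the input covariance matrices being badly scaled (eigenvalues far below the rectification threshold) rather than to a bug in the layer itself.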
When I add SPDRectified(), the output is:
When I delete SPDRectified(), the output is:
Why does this happen? Is it a problem with my data?