arthurdouillard / incremental_learning.pytorch

A collection of incremental learning paper implementations including PODNet (ECCV20) and Ghost (CVPR-W21).
MIT License
388 stars 60 forks

result on cifar100(5 step) using ucir_resnet #25

Closed sega-hsj closed 4 years ago

sega-hsj commented 4 years ago

I think the ResNet used in UCIR's code is more standard, so I ran the experiment on CIFAR-100 (5 steps) using your ucir_resnet.py, but the result is much worse than before. Do I need to modify some other configurations to get a better result?

This is what I modified in your ucir_resnet.py:

def forward(self, x, **kwargs):
    x = self.conv1(x)
    x = self.bn1(x)
    x = self.relu(x)

    # Outputs of the three residual stages, kept as "attention" maps
    fea1 = self.layer1(x)
    fea2 = self.layer2(fea1)
    fea3 = self.layer3(fea2)

    attentions = [fea1, fea2, fea3]
    # Pooled features before and after the final ReLU
    raw_features = self.end_features(fea3)
    features = self.end_features(F.relu(fea3, inplace=False))

    return {"raw_features": raw_features, "features": features, "attention": attentions}

def end_features(self, x):
    # Global average pooling followed by flattening to (B, C)
    x = self.avgpool(x)
    x = x.view(x.size(0), -1)

    return x
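For context, the `attention` list returned above is what PODNet's spatial distillation (POD) loss consumes. A minimal sketch of that loss, simplified from the paper rather than copied from this repo's exact implementation (the function name and aggregation details here are my own):

```python
import torch
import torch.nn.functional as F

def pod_spatial_loss(old_attentions, new_attentions):
    """Sketch of a POD-spatial distillation loss.

    Each entry is a feature map of shape (B, C, H, W), e.g. one element of
    the "attention" list from forward(). Maps are squared, pooled along
    height and along width, the two pooled views are concatenated,
    L2-normalized, and compared with a Euclidean distance averaged over
    layers. This is an illustrative simplification, not the repo's code.
    """
    loss = torch.tensor(0.0)
    for a, b in zip(old_attentions, new_attentions):
        a, b = a.pow(2), b.pow(2)
        # (B, C*H) pooled over width concatenated with (B, C*W) pooled over height
        a = torch.cat([a.sum(dim=3).flatten(1), a.sum(dim=2).flatten(1)], dim=-1)
        b = torch.cat([b.sum(dim=3).flatten(1), b.sum(dim=2).flatten(1)], dim=-1)
        diff = F.normalize(a, dim=-1) - F.normalize(b, dim=-1)
        loss = loss + diff.pow(2).sum(dim=-1).sqrt().mean()
    return loss / len(old_attentions)
```

Identical old and new maps give a loss of zero, so the term only penalizes drift of the intermediate representations between incremental steps.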
sega-hsj commented 4 years ago

my results: 0.772, 0.65, 0.606, 0.553, 0.506, 0.503

[train.py]: Individual results avg: [59.83]
[train.py]: Individual results last: [50.3]
[train.py]: Individual results forget: [18.32]
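For reference, the reported `avg` appears to be the mean of the six per-step accuracies above (an assumption about how train.py aggregates, but the numbers line up):

```python
# Per-step top-1 accuracies reported above (CIFAR-100, 5-step run)
accs = [0.772, 0.65, 0.606, 0.553, 0.506, 0.503]

avg = sum(accs) / len(accs) * 100   # average incremental accuracy
last = accs[-1] * 100               # accuracy after the final step

print(round(avg, 2), round(last, 1))  # 59.83 50.3
```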