Seul, thanks for your contributions.
I'm a beginner in explainable DNNs. It seems that your DNN pruning project is currently unfinished and cannot reproduce the paper's results.
Here are my findings:
Missing dataset handling for Scene 15, Event 8, ...
Missing a switch to save the transferred model (see the sketch below)
Argparse fails for args.relevance, args.resume, ... (see the sketch below)
A runtime error is raised because self.lrp() is missing
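For reference, here is a minimal sketch of how the missing flags and the transferred-model save switch could look; the flag names, the placeholder model, and the save logic are my own assumptions, not the repo's actual CLI:

# Hypothetical sketch: flag names and save logic are assumptions, not the repo's actual CLI
import argparse
import torch
import torch.nn as nn

parser = argparse.ArgumentParser()
parser.add_argument('--relevance', action='store_true',
                    help='use LRP relevance as the pruning criterion')
parser.add_argument('--resume', type=str, default='',
                    help='path of a checkpoint to resume from')
parser.add_argument('--save-transfer', type=str, default='',
                    help='if given, save the transferred (fine-tuned) model to this path')
args = parser.parse_args()

model = nn.Linear(10, 2)  # placeholder for the real network

if args.resume:
    model.load_state_dict(torch.load(args.resume))

# ... transfer learning / fine-tuning would run here ...

if args.save_transfer:
    torch.save(model.state_dict(), args.save_transfer)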
Mismatched tensor shapes in the ResNet-18 definition; here is the code:
out = F.relu(self.bn1(self.conv1(x)))
out = self.layer1(out)
out = self.layer2(out)
out = self.layer3(out)
out = self.layer4(out)
out = F.avg_pool2d(out, 4)
out = out.view(out.size(0), -1)
out = self.linear(out)
Compared to the standard ResNet-18, this forward pass lacks the max-pool that should be applied to the output of the first conv/bn/relu block. As a result, the tensor before the flatten layer has 25088 features instead of the required 512; the average pooling should reduce the feature map to 1x1.
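For comparison, here is a minimal sketch of how the standard (torchvision-style) ResNet-18 forward pass handles this, with the max-pool after the stem and an adaptive average pool down to 1x1 so that the flattened size is 512. This is my assumption of the intended fix, not the repo's code:

import torch.nn.functional as F

def forward(self, x):
    out = F.relu(self.bn1(self.conv1(x)))
    # standard ResNet-18 applies a 3x3 max-pool (stride 2) right after the stem
    out = F.max_pool2d(out, kernel_size=3, stride=2, padding=1)
    out = self.layer1(out)
    out = self.layer2(out)
    out = self.layer3(out)
    out = self.layer4(out)
    # pool adaptively to 1x1 so the flattened size is 512 for any input resolution
    out = F.adaptive_avg_pool2d(out, (1, 1))
    out = out.view(out.size(0), -1)  # (batch, 512) instead of (batch, 25088)
    out = self.linear(out)
    return out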
Missing plotting code for result analysis (accuracy, ...)
Missing pruning with only a limited number of samples and without fine-tuning; the original code prunes the model using the entire training set (see the sketch below).
Missing dense (fully connected) layer pruning; the original code only prunes the conv layers.
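On the limited-sample point, this is a sketch of what I mean: draw a small fixed subset of the training set and run the pruning pass on it only, with no fine-tuning afterwards. compute_relevance_and_prune is a hypothetical stand-in for the repo's pruning routine; only the Subset/DataLoader part is standard PyTorch:

import torch
from torch.utils.data import DataLoader, Subset

def make_small_loader(train_set, num_samples=1000, batch_size=32, seed=0):
    # draw a fixed random subset of the training set for the pruning pass
    g = torch.Generator().manual_seed(seed)
    idx = torch.randperm(len(train_set), generator=g)[:num_samples].tolist()
    return DataLoader(Subset(train_set, idx), batch_size=batch_size, shuffle=False)

# usage (hypothetical):
# small_loader = make_small_loader(train_set, num_samples=1000)
# compute_relevance_and_prune(model, small_loader)  # prune once, no fine-tuning afterwards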
I have fixed some of the basic problems; I hope this page gives some guidance to others.
It would help a lot if you could provide the follow-up work.