dwofk / fast-depth

ICRA 2019 "FastDepth: Fast Monocular Depth Estimation on Embedded Systems"
MIT License

The pruned model cannot be reimplemented #25

Open songya opened 4 years ago

songya commented 4 years ago

I downloaded your pruned model and verified that it achieves delta1 >= 0.77.
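For reference, delta1 here is the usual depth accuracy metric: the fraction of pixels where max(pred/gt, gt/pred) < 1.25. A minimal sketch of how I compute it (assuming PyTorch tensors; only pixels with valid ground truth are counted):

```python
import torch

def delta1(pred, target, eps=1e-6):
    # Fraction of pixels whose ratio max(pred/gt, gt/pred) is below 1.25;
    # only pixels with valid (positive) ground-truth depth are counted.
    valid = target > eps
    ratio = torch.max(pred[valid] / target[valid], target[valid] / pred[valid])
    return (ratio < 1.25).float().mean().item()
```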

I reimplemented the pruned model and trained it starting from the ImageNet-pretrained weights, but its accuracy is only around 0.6. I used lr = 0.01, weight decay = 0.0001, and SGD with momentum 0.9, i.e., I did not change these parameters at all.
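For reference, a minimal sketch of the optimizer setup I described (assuming PyTorch; the model below is just a placeholder for my reimplementation of the pruned network):

```python
import torch
import torch.nn as nn

# Placeholder module standing in for the reimplemented pruned FastDepth network.
model = nn.Conv2d(3, 1, kernel_size=3, padding=1)

# The hyperparameters mentioned above, unchanged.
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.01,
    momentum=0.9,
    weight_decay=1e-4,
)
```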

Can these parameters be changed when training with NetAdapt?

dwofk commented 4 years ago

Hi @songya

In our work, we first trained our (unpruned) model and then applied network pruning to it, aiming to maintain accuracy. We fine-tuned the pruned model but did not train it from scratch. If you are training the pruned model from scratch, the achieved accuracy may very well differ from our reported metrics as we had not tried that.
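As a rough illustration of that flow (a sketch only; the file name and checkpoint layout below are assumptions, not an exact excerpt from our scripts), fine-tuning starts from the pruned checkpoint rather than from randomly initialized weights, typically with a reduced learning rate:

```python
import torch

# Hypothetical file name; the released checkpoints may store the full model
# object rather than just a state_dict.
checkpoint = torch.load("fastdepth_pruned.pth.tar", map_location="cpu")
model = checkpoint["model"] if isinstance(checkpoint, dict) else checkpoint
model.train()

# Fine-tune with a smaller learning rate instead of retraining from scratch.
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9, weight_decay=1e-4)
```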

Yes, it is possible to change parameters when training or fine-tuning using NetAdapt. For more detail, please refer to the NetAdapt code repo at https://github.com/denru01/netadapt, especially the customization section.