dividiti / ck-caffe

Collective Knowledge workflow for Caffe to automate installation across diverse platforms and to collaboratively evaluate and optimize Caffe-based workloads across diverse hardware, software and data sets (compilers, libraries, tools, models, inputs):
http://cKnowledge.org
BSD 3-Clause "New" or "Revised" License

Add support for ResNet #87

psyhtest closed this issue 7 years ago

psyhtest commented 7 years ago

Microsoft Research Asia designed several Deep Residual Networks.

Add CK-Caffe support for:

Note that the model topologies are from the official ResNet GitHub page, while the weights are from a mirror. (The official ResNet weights from OneDrive may not be easy to download using a script.)

Note also the use of a special ResNet_mean.binaryproto.

Rhymmor commented 7 years ago

Did I understand correctly that ResNet_mean.binaryproto should be used in the Caffe classification programs instead of imagenet_mean.binaryproto, which is currently hardcoded into these programs (for the default mode)?

psyhtest commented 7 years ago

That's right. Instead of keeping three copies of ResNet_mean.binaryproto, we should ideally create a separate package for it and set up the three ResNet packages to depend on it.
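For context, the per-channel mean subtraction these classification programs perform before inference can be sketched in a few lines of numpy; the channel means below are illustrative values, not the actual contents of ResNet_mean.binaryproto:

```python
import numpy as np

# Minimal sketch of the mean subtraction Caffe classification programs apply
# to an input image before inference. The mean values are illustrative
# BGR channel means, not the real contents of any .binaryproto file.
def subtract_mean(image, mean):
    """image: HxWx3 float array; mean: per-channel (3,) array (broadcast)."""
    return image.astype(np.float32) - mean

image = np.full((2, 2, 3), 120.0, dtype=np.float32)
mean = np.array([104.0, 117.0, 123.0], dtype=np.float32)
out = subtract_mean(image, mean)
print(out[0, 0])  # one pixel after mean subtraction
```

A separate mean package would simply ship the .binaryproto file that supplies `mean` here, so each ResNet package can point at it instead of bundling its own copy.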

psyhtest commented 7 years ago

@Rhymmor, thanks for your contribution!

While you are at it :), could you please also add ResNet10 (weights and all topologies), which doesn't require subtracting the mean? Please also include a link to the licensing terms.

Rhymmor commented 7 years ago

@psyhtest, sure!

Hmm, I've tried it in CK and it gives the same poor predictions for every image.

      ---------- Prediction for /home/anatoly/CK/ctuning-datasets-min/dataset/image-jpeg-dnn-cat/cat.jpg ----------
      0.0010 - "n01491361 tiger shark, Galeocerdo cuvieri"
      0.0010 - "n01494475 hammerhead, hammerhead shark"
      0.0010 - "n01443537 goldfish, Carassius auratus"
      0.0010 - "n01440764 tench, Tinca tinca"
      0.0010 - "n01484850 great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias"
psyhtest commented 7 years ago

@Rhymmor Which program are you using? I've tried program:caffe-classification with resnet50 and got a sensible result:

      ---------- Prediction for /home/anton/CK_REPOS/ctuning-datasets-min/dataset/image-jpeg-dnn-computer-mouse/computer_mouse.jpg ----------
      0.9935 - "n03793489 mouse, computer mouse"
      0.0059 - "n04254680 soccer ball"
      0.0001 - "n03637318 lampshade, lamp shade"
      0.0001 - "n09229709 bubble"
      0.0000 - "n03127747 crash helmet"
Rhymmor commented 7 years ago

@psyhtest, sorry, my comment wasn't clear. I've implemented and tried resnet10 in CK and got those results. BTW, the resnet10 model is 21 MB; is that normal?

psyhtest commented 7 years ago

@Rhymmor Are you surprised it's only 21 MB, not 240 MB like AlexNet? :) Well, that's the whole point: it's a more compact model with good accuracy. (Although not as compact as SqueezeNet, which is under 5 MB.)
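As a rough sanity check (not from the thread): an fp32 .caffemodel stores about 4 bytes per parameter plus a small protobuf overhead, so a network with roughly 5.5M parameters lands near 21 MB. The parameter counts below are approximate, order-of-magnitude figures:

```python
# Rough sanity check: an fp32 .caffemodel stores ~4 bytes per parameter
# (plus small protobuf overhead). Parameter counts are approximate.
def approx_model_size_mb(num_params, bytes_per_param=4):
    return num_params * bytes_per_param / (1024 ** 2)

print(round(approx_model_size_mb(61_000_000)))  # AlexNet-scale: ~233 MB
print(round(approx_model_size_mb(5_500_000)))   # ResNet-10-scale: ~21 MB
print(round(approx_model_size_mb(1_250_000)))   # SqueezeNet-scale: ~5 MB
```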

The results are very strange though. Something's not right. I see that the default command of e.g. program:caffe-classification-cuda is:

0) default ($#BIN_FILE#$ $<<CK_CAFFE_MODEL_FILE>>$ $<<CK_CAFFE_MODEL_WEIGHTS>>$ $#up_dir#$imagenet_mean.binaryproto $#up_dir#$synset_words.txt $#dataset_path#$$#dataset_filename#$)

The use of $#up_dir#$imagenet_mean.binaryproto is particularly suspicious here, as this model doesn't require subtracting the mean...
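The effect of that suspicious default can be illustrated with a small numpy sketch (the mean values are illustrative): for a model trained without mean subtraction, an all-zero mean leaves the input unchanged, while subtracting ImageNet-style channel means shifts every pixel away from the distribution the model expects.

```python
import numpy as np

# Sketch: for a model trained WITHOUT mean subtraction (as ResNet-10 here),
# an all-zero mean is equivalent to skipping the step entirely, while a
# non-zero ImageNet-style mean wrongly shifts every pixel. Values illustrative.
image = np.full((2, 2, 3), 100.0, dtype=np.float32)
zero_mean = np.zeros(3, dtype=np.float32)
imagenet_mean = np.array([104.0, 117.0, 123.0], dtype=np.float32)

unchanged = image - zero_mean      # identical to the input
shifted = image - imagenet_mean    # shifted input the model never saw
print(np.array_equal(unchanged, image))  # True
print(np.array_equal(shifted, image))    # False
```

This would explain near-uniform scores across classes: the network is fed inputs far from its training distribution.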

psyhtest commented 7 years ago

Unfortunately, the direct link no longer works:

$ wget https://upload.uni-jena.de/data/58493041de6f79.63214979/resnet10_cvgj_iter_320000.caffemodel
--2017-04-07 20:40:07--  https://upload.uni-jena.de/data/58493041de6f79.63214979/resnet10_cvgj_iter_320000.caffemodel
Resolving upload.uni-jena.de (upload.uni-jena.de)... 141.35.105.30, 2001:638:1558:2369:1:5ee:bad:c0de
Connecting to upload.uni-jena.de (upload.uni-jena.de)|141.35.105.30|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2017-04-07 20:40:07 ERROR 404: Not Found.

And it's not possible to wget from Google Drive or Baidu Pan :(. I will close this ticket for now, as we have other priorities.