Cysu / dgd_person_reid

Domain Guided Dropout for Person Re-identification
http://arxiv.org/abs/1604.07528

ERROR: Duplicate blobs produced by multiple sources #6

Closed by BeSlower 8 years ago

BeSlower commented 8 years ago

Hi,

When I use your trained model (jstl_dgd_deploy.prototxt) to extract features on my datasets, the command line shows a "duplicate blobs produced by multiple sources" error. I guess there is probably something wrong with my code, but I have no idea what the correct way to make it work is. I have to ask for your help. Could you give me some cues on extracting features on other datasets? Thanks.

Best

Cysu commented 8 years ago

It seems that there are two layers producing the same top blob. Could you please provide the error log, or a code snippet if available? For me, the following works:

net = caffe.Net('jstl_dgd_deploy.prototxt', caffe.TEST)
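
For context, Caffe raises this error during net initialization when two layers that are not in-place both declare the same `top` blob. A hypothetical prototxt fragment that would trigger it (layer names and types here are made up for illustration; in-place layers, where `top` equals `bottom`, are allowed):

```protobuf
# Both layers write to blob "fc7": a second source for the same blob,
# which Caffe rejects at load time.
layer {
  name: "fc7_a"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
}
layer {
  name: "fc7_b"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"   # duplicate top -> "duplicate blobs produced by multiple sources"
}
```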
BeSlower commented 8 years ago

This problem was my fault. I have already fixed it. ^_^ Could you give me some cues on extracting features based on your extract_features.sh template? How should "weights" be configured in your code? Thanks a lot.

# Parse positional arguments: task, dataset, weights, and an optional
# blob name (defaults to fc7_bn).
parse_args() {
  task=$1
  dataset=$2
  weights=$3
  if [[ $# -eq 4 ]]; then
    blob=$4
  else
    blob=fc7_bn
  fi
}

# Compose the output directory from the parsed arguments.
get_output_dir() {
  parse_args "$@"
  weights_name=$(basename "$weights")
  weights_name="${weights_name%%.*}"  # strip extension(s) from the first dot
  output_dir=${RESULTS_DIR}/${task}/${dataset}_${weights_name}_${blob}
  echo "${output_dir}"
}
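
For reference, the `%%.*` expansion strips everything from the first dot of the file name, so the weights file determines the output directory component like this (a minimal sketch; the path is hypothetical):

```shell
# Hypothetical weights path; only the base name without extension is kept.
weights=/path/to/jstl_iter_55000.caffemodel
weights_name=$(basename "$weights")   # jstl_iter_55000.caffemodel
weights_name="${weights_name%%.*}"    # remove everything from the first dot
echo "$weights_name"
# prints jstl_iter_55000
```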
Cysu commented 8 years ago

Thanks very much for reminding me about this. Actually, the original extract_features.sh is legacy. I have updated it just now. You may git pull and try the following command:

# scripts/extract_features.sh <subfolder> <dataset> <weights> [blob=fc7_bn]
scripts/extract_features.sh test viper /path/to/trained.caffemodel

Note that before extracting features on your own dataset, the data needs to be converted to LMDB first and placed under external/exp/db/.
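
One common way to build such an LMDB is Caffe's standard `convert_imageset` tool, which takes an image root folder and a list file of `relative/path label` lines. A minimal sketch (the dataset name, folder layout, and labels here are hypothetical):

```shell
# Build a "path label" list file for a hypothetical dataset.
mkdir -p /tmp/mydataset
printf 'img_0001.jpg 0\nimg_0002.jpg 1\n' > /tmp/mydataset/list.txt

# Then convert to LMDB with Caffe's tool and put it where the scripts
# expect it (requires a built Caffe; not run here):
#   build/tools/convert_imageset --backend=lmdb /tmp/mydataset/ \
#     /tmp/mydataset/list.txt external/exp/db/mydataset
cat /tmp/mydataset/list.txt
```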

BeSlower commented 8 years ago

Thanks. It works for me. I really appreciate it.