gkioxari / ActionTubes

source code for Finding Action Tubes, CVPR 2015

matlab crash #1

Closed carrierlxk closed 8 years ago

carrierlxk commented 8 years ago

Hi, when I run the code up to f = caffe('forward', batches(j));, my MATLAB crashes. I tried R-CNN and it shows the same issue. I guess the reason may be the versions of Caffe, MATLAB, and Ubuntu. Could you tell me which Caffe and MATLAB versions you used?

gkioxari commented 8 years ago

What is the message it spits out when it crashes? That will tell you what's up. I think I use a 2012b Matlab license, but I doubt that is the problem. Make sure you successfully 'make' Caffe and matcaffe.
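For reference, rebuilding both targets from a standard BVLC Caffe checkout usually looks like the sketch below (directory names and the -j value are assumptions, and Makefile.config must already have MATLAB_DIR pointing at your MATLAB install before matcaffe will build):

```shell
# Sketch: rebuild Caffe and its MATLAB wrapper from a caffe/ checkout.
cd caffe
make clean
make all -j4        # builds libcaffe and the command-line tools
make matcaffe       # builds the MEX wrapper (matlab/caffe/caffe.mexa64)

# If MATLAB still segfaults on the caffe() call, a common culprit is a
# shared-library mismatch between MATLAB's bundled libraries and the
# system ones the MEX file was linked against; check for unresolved links:
ldd matlab/caffe/caffe.mexa64 | grep "not found"
```

Starting MATLAB from a shell where the Caffe dependencies are on LD_LIBRARY_PATH can also make a difference; treat the ldd check above as a diagnostic hint, not a guaranteed fix.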

carrierlxk commented 8 years ago

Hi, I fixed this problem by installing MATLAB 2012. However, when I run the feature-extraction code with your pre-trained model 'UCFsports_motion_iter_2K' and extract_fc7.prototxt, I get the error below: Error using caffe Expected 2 arguments, got 2. After I change the input arguments to 3, the error message becomes: Check failed: ReadProtoFromBinaryFile(param_file, param) Failed to parse NetParameter file: UCFsports_motion_iter_2K

Here is the detailed error message:

E1102 21:54:35.484531 27409 upgrade_proto.cpp:618] Attempting to upgrade input file specified using deprecated V1LayerParameter: extract_fc7.prototxt
I1102 21:54:35.484799 27409 upgrade_proto.cpp:626] Successfully upgraded file specified using deprecated V1LayerParameter
I1102 21:54:35.485255 27409 net.cpp:42] Initializing net from parameters:
input: "data"
input_dim: 256
input_dim: 3
input_dim: 227
input_dim: 227
state { phase: TEST }
layer { name: "conv1" type: "Convolution" bottom: "data" top: "conv1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 96 kernel_size: 7 stride: 2 } }
layer { name: "relu1" type: "ReLU" bottom: "conv1" top: "conv1" }
layer { name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm1" type: "LRN" bottom: "pool1" top: "norm1" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv2" type: "Convolution" bottom: "norm1" top: "conv2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 384 kernel_size: 5 group: 2 stride: 2 } }
layer { name: "relu2" type: "ReLU" bottom: "conv2" top: "conv2" }
layer { name: "pool2" type: "Pooling" bottom: "conv2" top: "pool2" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm2" type: "LRN" bottom: "pool2" top: "norm2" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv3" type: "Convolution" bottom: "norm2" top: "conv3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 } }
layer { name: "relu3" type: "ReLU" bottom: "conv3" top: "conv3" }
layer { name: "conv4" type: "Convolution" bottom: "conv3" top: "conv4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 group: 2 } }
layer { name: "relu4" type: "ReLU" bottom: "conv4" top: "conv4" }
layer { name: "conv5" type: "Convolution" bottom: "conv4" top: "conv5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 2 } }
layer { name: "relu5" type: "ReLU" bottom: "conv5" top: "conv5" }
layer { name: "pool5" type: "Pooling" bottom: "conv5" top: "pool5" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "fc6" type: "InnerProduct" bottom: "pool5" top: "fc6" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } inner_product_param { num_output: 4096 } }
layer { name: "relu6" type: "ReLU" bottom: "fc6" top: "fc6" }
layer { name: "drop6" type: "Dropout" bottom: "fc6" top: "fc6" dropout_param { dropout_ratio: 0.5 } }
layer { name: "fc7" type: "InnerProduct" bottom: "fc6" top: "fc7" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } inner_product_param { num_output: 4096 } }
layer { name: "relu7" type: "ReLU" bottom: "fc7" top: "fc7" }
layer { name: "drop7" type: "Dropout" bottom: "fc7" top: "fc7" dropout_param { dropout_ratio: 0.5 } }
I1102 21:54:35.488030 27409 net.cpp:336] Input 0 -> data
I1102 21:54:35.488076 27409 layer_factory.hpp:74] Creating layer conv1
I1102 21:54:35.488092 27409 net.cpp:76] Creating Layer conv1
I1102 21:54:35.488104 27409 net.cpp:372] conv1 <- data
I1102 21:54:35.488114 27409 net.cpp:334] conv1 -> conv1
I1102 21:54:35.488128 27409 net.cpp:105] Setting up conv1
I1102 21:54:35.488183 27409 net.cpp:112] Top shape: 256 96 111 111 (302800896)
I1102 21:54:35.488212 27409 layer_factory.hpp:74] Creating layer relu1
I1102 21:54:35.488225 27409 net.cpp:76] Creating Layer relu1
I1102 21:54:35.488234 27409 net.cpp:372] relu1 <- conv1
I1102 21:54:35.488242 27409 net.cpp:323] relu1 -> conv1 (in-place)
I1102 21:54:35.488251 27409 net.cpp:105] Setting up relu1
I1102 21:54:35.488261 27409 net.cpp:112] Top shape: 256 96 111 111 (302800896)
I1102 21:54:35.488270 27409 layer_factory.hpp:74] Creating layer pool1
I1102 21:54:35.488279 27409 net.cpp:76] Creating Layer pool1
I1102 21:54:35.488286 27409 net.cpp:372] pool1 <- conv1
I1102 21:54:35.488296 27409 net.cpp:334] pool1 -> pool1
I1102 21:54:35.488306 27409 net.cpp:105] Setting up pool1
I1102 21:54:35.488317 27409 net.cpp:112] Top shape: 256 96 55 55 (74342400)
I1102 21:54:35.488325 27409 layer_factory.hpp:74] Creating layer norm1
I1102 21:54:35.488337 27409 net.cpp:76] Creating Layer norm1
I1102 21:54:35.488344 27409 net.cpp:372] norm1 <- pool1
I1102 21:54:35.488353 27409 net.cpp:334] norm1 -> norm1
I1102 21:54:35.488363 27409 net.cpp:105] Setting up norm1
I1102 21:54:35.488373 27409 net.cpp:112] Top shape: 256 96 55 55 (74342400)
I1102 21:54:35.488380 27409 layer_factory.hpp:74] Creating layer conv2
I1102 21:54:35.488389 27409 net.cpp:76] Creating Layer conv2
I1102 21:54:35.488397 27409 net.cpp:372] conv2 <- norm1
I1102 21:54:35.488406 27409 net.cpp:334] conv2 -> conv2
I1102 21:54:35.488416 27409 net.cpp:105] Setting up conv2
I1102 21:54:35.489537 27409 net.cpp:112] Top shape: 256 384 26 26 (66453504)
I1102 21:54:35.489627 27409 layer_factory.hpp:74] Creating layer relu2
I1102 21:54:35.489652 27409 net.cpp:76] Creating Layer relu2
I1102 21:54:35.489665 27409 net.cpp:372] relu2 <- conv2
I1102 21:54:35.489678 27409 net.cpp:323] relu2 -> conv2 (in-place)
I1102 21:54:35.489692 27409 net.cpp:105] Setting up relu2
I1102 21:54:35.489701 27409 net.cpp:112] Top shape: 256 384 26 26 (66453504)
I1102 21:54:35.489708 27409 layer_factory.hpp:74] Creating layer pool2
I1102 21:54:35.489724 27409 net.cpp:76] Creating Layer pool2
I1102 21:54:35.489733 27409 net.cpp:372] pool2 <- conv2
I1102 21:54:35.489743 27409 net.cpp:334] pool2 -> pool2
I1102 21:54:35.489755 27409 net.cpp:105] Setting up pool2
I1102 21:54:35.489768 27409 net.cpp:112] Top shape: 256 384 13 13 (16613376)
I1102 21:54:35.489778 27409 layer_factory.hpp:74] Creating layer norm2
I1102 21:54:35.489789 27409 net.cpp:76] Creating Layer norm2
I1102 21:54:35.489797 27409 net.cpp:372] norm2 <- pool2
I1102 21:54:35.489806 27409 net.cpp:334] norm2 -> norm2
I1102 21:54:35.489817 27409 net.cpp:105] Setting up norm2
I1102 21:54:35.489827 27409 net.cpp:112] Top shape: 256 384 13 13 (16613376)
I1102 21:54:35.489836 27409 layer_factory.hpp:74] Creating layer conv3
I1102 21:54:35.489852 27409 net.cpp:76] Creating Layer conv3
I1102 21:54:35.489861 27409 net.cpp:372] conv3 <- norm2
I1102 21:54:35.489872 27409 net.cpp:334] conv3 -> conv3
I1102 21:54:35.489884 27409 net.cpp:105] Setting up conv3
I1102 21:54:35.497197 27409 net.cpp:112] Top shape: 256 512 13 13 (22151168)
I1102 21:54:35.497334 27409 layer_factory.hpp:74] Creating layer relu3
I1102 21:54:35.497365 27409 net.cpp:76] Creating Layer relu3
I1102 21:54:35.497380 27409 net.cpp:372] relu3 <- conv3
I1102 21:54:35.497400 27409 net.cpp:323] relu3 -> conv3 (in-place)
I1102 21:54:35.497421 27409 net.cpp:105] Setting up relu3
I1102 21:54:35.497436 27409 net.cpp:112] Top shape: 256 512 13 13 (22151168)
I1102 21:54:35.497447 27409 layer_factory.hpp:74] Creating layer conv4
I1102 21:54:35.497468 27409 net.cpp:76] Creating Layer conv4
I1102 21:54:35.497481 27409 net.cpp:372] conv4 <- conv3
I1102 21:54:35.497495 27409 net.cpp:334] conv4 -> conv4
I1102 21:54:35.497514 27409 net.cpp:105] Setting up conv4
I1102 21:54:35.501693 27409 net.cpp:112] Top shape: 256 512 13 13 (22151168)
I1102 21:54:35.501786 27409 layer_factory.hpp:74] Creating layer relu4
I1102 21:54:35.501806 27409 net.cpp:76] Creating Layer relu4
I1102 21:54:35.501816 27409 net.cpp:372] relu4 <- conv4
I1102 21:54:35.501828 27409 net.cpp:323] relu4 -> conv4 (in-place)
I1102 21:54:35.501840 27409 net.cpp:105] Setting up relu4
I1102 21:54:35.501849 27409 net.cpp:112] Top shape: 256 512 13 13 (22151168)
I1102 21:54:35.501858 27409 layer_factory.hpp:74] Creating layer conv5
I1102 21:54:35.501870 27409 net.cpp:76] Creating Layer conv5
I1102 21:54:35.501878 27409 net.cpp:372] conv5 <- conv4
I1102 21:54:35.501888 27409 net.cpp:334] conv5 -> conv5
I1102 21:54:35.501899 27409 net.cpp:105] Setting up conv5
I1102 21:54:35.506049 27409 net.cpp:112] Top shape: 256 384 13 13 (16613376)
I1102 21:54:35.506217 27409 layer_factory.hpp:74] Creating layer relu5
I1102 21:54:35.506248 27409 net.cpp:76] Creating Layer relu5
I1102 21:54:35.506263 27409 net.cpp:372] relu5 <- conv5
I1102 21:54:35.506278 27409 net.cpp:323] relu5 -> conv5 (in-place)
I1102 21:54:35.506294 27409 net.cpp:105] Setting up relu5
I1102 21:54:35.506304 27409 net.cpp:112] Top shape: 256 384 13 13 (16613376)
I1102 21:54:35.506314 27409 layer_factory.hpp:74] Creating layer pool5
I1102 21:54:35.506332 27409 net.cpp:76] Creating Layer pool5
I1102 21:54:35.506340 27409 net.cpp:372] pool5 <- conv5
I1102 21:54:35.506359 27409 net.cpp:334] pool5 -> pool5
I1102 21:54:35.506386 27409 net.cpp:105] Setting up pool5
I1102 21:54:35.506409 27409 net.cpp:112] Top shape: 256 384 6 6 (3538944)
I1102 21:54:35.506425 27409 layer_factory.hpp:74] Creating layer fc6
I1102 21:54:35.506466 27409 net.cpp:76] Creating Layer fc6
I1102 21:54:35.506491 27409 net.cpp:372] fc6 <- pool5
I1102 21:54:35.506510 27409 net.cpp:334] fc6 -> fc6
I1102 21:54:35.506531 27409 net.cpp:105] Setting up fc6
I1102 21:54:35.702857 27409 net.cpp:112] Top shape: 256 4096 1 1 (1048576)
I1102 21:54:35.703116 27409 layer_factory.hpp:74] Creating layer relu6
I1102 21:54:35.703188 27409 net.cpp:76] Creating Layer relu6
I1102 21:54:35.703217 27409 net.cpp:372] relu6 <- fc6
I1102 21:54:35.703236 27409 net.cpp:323] relu6 -> fc6 (in-place)
I1102 21:54:35.703253 27409 net.cpp:105] Setting up relu6
I1102 21:54:35.703263 27409 net.cpp:112] Top shape: 256 4096 1 1 (1048576)
I1102 21:54:35.703272 27409 layer_factory.hpp:74] Creating layer drop6
I1102 21:54:35.703294 27409 net.cpp:76] Creating Layer drop6
I1102 21:54:35.703302 27409 net.cpp:372] drop6 <- fc6
I1102 21:54:35.703312 27409 net.cpp:323] drop6 -> fc6 (in-place)
I1102 21:54:35.703322 27409 net.cpp:105] Setting up drop6
I1102 21:54:35.703331 27409 net.cpp:112] Top shape: 256 4096 1 1 (1048576)
I1102 21:54:35.703341 27409 layer_factory.hpp:74] Creating layer fc7
I1102 21:54:35.703361 27409 net.cpp:76] Creating Layer fc7
I1102 21:54:35.703382 27409 net.cpp:372] fc7 <- fc6
I1102 21:54:35.703402 27409 net.cpp:334] fc7 -> fc7
I1102 21:54:35.703428 27409 net.cpp:105] Setting up fc7
I1102 21:54:35.767624 27409 net.cpp:112] Top shape: 256 4096 1 1 (1048576)
I1102 21:54:35.767773 27409 layer_factory.hpp:74] Creating layer relu7
I1102 21:54:35.767812 27409 net.cpp:76] Creating Layer relu7
I1102 21:54:35.767834 27409 net.cpp:372] relu7 <- fc7
I1102 21:54:35.767856 27409 net.cpp:323] relu7 -> fc7 (in-place)
I1102 21:54:35.767880 27409 net.cpp:105] Setting up relu7
I1102 21:54:35.767902 27409 net.cpp:112] Top shape: 256 4096 1 1 (1048576)
I1102 21:54:35.767917 27409 layer_factory.hpp:74] Creating layer drop7
I1102 21:54:35.767940 27409 net.cpp:76] Creating Layer drop7
I1102 21:54:35.767954 27409 net.cpp:372] drop7 <- fc7
I1102 21:54:35.767971 27409 net.cpp:323] drop7 -> fc7 (in-place)
I1102 21:54:35.767988 27409 net.cpp:105] Setting up drop7
I1102 21:54:35.768004 27409 net.cpp:112] Top shape: 256 4096 1 1 (1048576)
I1102 21:54:35.768017 27409 net.cpp:165] drop7 does not need backward computation.
I1102 21:54:35.768029 27409 net.cpp:165] relu7 does not need backward computation.
I1102 21:54:35.768041 27409 net.cpp:165] fc7 does not need backward computation.
I1102 21:54:35.768054 27409 net.cpp:165] drop6 does not need backward computation.
I1102 21:54:35.768064 27409 net.cpp:165] relu6 does not need backward computation.
I1102 21:54:35.768076 27409 net.cpp:165] fc6 does not need backward computation.
I1102 21:54:35.768090 27409 net.cpp:165] pool5 does not need backward computation.
I1102 21:54:35.768103 27409 net.cpp:165] relu5 does not need backward computation.
I1102 21:54:35.768118 27409 net.cpp:165] conv5 does not need backward computation.
I1102 21:54:35.768132 27409 net.cpp:165] relu4 does not need backward computation.
I1102 21:54:35.768146 27409 net.cpp:165] conv4 does not need backward computation.
I1102 21:54:35.768163 27409 net.cpp:165] relu3 does not need backward computation.
I1102 21:54:35.768175 27409 net.cpp:165] conv3 does not need backward computation.
I1102 21:54:35.768189 27409 net.cpp:165] norm2 does not need backward computation.
I1102 21:54:35.768215 27409 net.cpp:165] pool2 does not need backward computation.
I1102 21:54:35.768229 27409 net.cpp:165] relu2 does not need backward computation.
I1102 21:54:35.768246 27409 net.cpp:165] conv2 does not need backward computation.
I1102 21:54:35.768261 27409 net.cpp:165] norm1 does not need backward computation.
I1102 21:54:35.768273 27409 net.cpp:165] pool1 does not need backward computation.
I1102 21:54:35.768285 27409 net.cpp:165] relu1 does not need backward computation.
I1102 21:54:35.768301 27409 net.cpp:165] conv1 does not need backward computation.
I1102 21:54:35.768313 27409 net.cpp:201] This network produces output fc7
I1102 21:54:35.768348 27409 net.cpp:446] Collecting Learning Rate and Weight Decay.
I1102 21:54:35.768399 27409 net.cpp:213] Network initialization done.
I1102 21:54:35.768415 27409 net.cpp:214] Memory required for data: 4208328704
F1102 21:54:36.863123 27409 upgrade_proto.cpp:935] Check failed: ReadProtoFromBinaryFile(param_file, param) Failed to parse NetParameter file: UCFsports_motion_iter_2K
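The log above shows the prototxt parses fine and the net is built; the fatal error only fires when Caffe tries to read the binary weights. "Failed to parse NetParameter file" very often means the snapshot on disk is not a valid caffemodel at all, e.g. a truncated download or an HTML error page saved under the model's name. A quick sanity check before digging further (the filename comes from the log above; the HTML heuristic is my assumption, not part of this repo):

```shell
# Sketch: sanity-check the downloaded snapshot before blaming the code.
MODEL=UCFsports_motion_iter_2K
if [ ! -s "$MODEL" ]; then
    echo "missing or empty: $MODEL"
elif head -c 512 "$MODEL" | grep -qi '<html'; then
    # Some hosts serve an HTML error page instead of the binary file.
    echo "looks like an HTML page, not a caffemodel: re-download it"
else
    # A full snapshot of a net this size should be hundreds of MB.
    echo "size: $(wc -c < "$MODEL") bytes"
fi
```

If the file checks out, the remaining suspect is a format mismatch between the Caffe version that wrote the snapshot and the one reading it, which is worth comparing against the version the authors used.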