ShaoqingRen / faster_rcnn

Faster R-CNN

Matlab system error - when running experiments/script_faster_rcnn_demo.m #23

Open euwern opened 9 years ago

euwern commented 9 years ago

I tried to run "experiments/script_faster_rcnn_demo.m" as specified in the instructions, and MATLAB crashed with the following crash report.

I ran the code on Ubuntu 14.04. I hope someone can shed some light on what is going on.


          abort() detected at Sat Oct 17 10:04:52 2015

Configuration: Crash Decoding : Disabled Current Visual : 0x20 (class 4, depth 24) Default Encoding : UTF-8 GNU C Library : 2.19 stable MATLAB Architecture: glnxa64 MATLAB Root : /data/mat14a MATLAB Version : 8.3.0.532 (R2014a) Operating System : Linux 3.13.0-65-generic #105-Ubuntu SMP Mon Sep 21 18:50:58 UTC 2015 x86_64 Processor ID : x86 Family 6 Model 63 Stepping 2, GenuineIntel Window System : The X.Org Foundation (11701000), display localhost:11.0

Fault Count: 1

Abnormal termination: abort()

Register State (from fault): RAX = 0000000000000000 RBX = 00007f4c0c79b620 RCX = ffffffffffffffff RDX = 0000000000000006 RSP = 00007f4c4bf89458 RBP = 00007f4c4bf89590 RSI = 00000000000030aa RDI = 0000000000003085

R8 = 000000000000ff08 R9 = ffffffffffff1150 R10 = 0000000000000008 R11 = 0000000000000202 R12 = 0000000000000001 R13 = 00007f4c4bf897d0 R14 = 00007f4c4bf8a040 R15 = 0000000000000001

RIP = 00007f4c5e110cc9 EFL = 0000000000000202

CS = 0033 FS = 0000 GS = 0000

Stack Trace (from fault): [ 0] 0x00007f4c5e110cc9 /lib/x86_64-linux-gnu/libc.so.6+00224457 gsignal+00000057 [ 1] 0x00007f4c5e1140d8 /lib/x86_64-linux-gnu/libc.so.6+00237784 abort+00000328 [ 2] 0x00007f4c0c575d81 /usr/lib/x86_64-linux-gnu/libglog.so.0+00068993 _ZN6google22InstallFailureFunctionEPFvvE+00000000 [ 3] 0x00007f4c0c575daa /usr/lib/x86_64-linux-gnu/libglog.so.0+00069034 _ZN6google10LogMessage10SendToSinkEv+00000000 [ 4] 0x00007f4c0c575ce4 /usr/lib/x86_64-linux-gnu/libglog.so.0+00068836 _ZN6google10LogMessage9SendToLogEv+00001224 [ 5] 0x00007f4c0c5756e6 /usr/lib/x86_64-linux-gnu/libglog.so.0+00067302 _ZN6google10LogMessage5FlushEv+00000414 [ 6] 0x00007f4c0c578687 /usr/lib/x86_64-linux-gnu/libglog.so.0+00079495 _ZN6google15LogMessageFatalD1Ev+00000025 [ 7] 0x00007f4c1657421e /home/umteht/fasterrcnn/experiments/external/caffe/matlab/+caffe/private/caffe.mexa64+01430046 [ 8] 0x00007f4c165bf823 /home/umteht/fasterrcnn/experiments/external/caffe/matlab/+caffe/private/caffe.mexa64+01738787 [ 9] 0x00007f4c16454702 /home/umteht/fasterrcnn/experiments/external/caffe/matlab/+caffe/private/caffe.mexa64+00251650 [ 10] 0x00007f4c164551c7 /home/umteht/fasterrcnn/experiments/external/caffe/matlab/+caffe/private/caffe.mexa64+00254407 mexFunction+00000154 [ 11] 0x00007f4c5603b72a /data/mat14a/bin/glnxa64/libmex.so+00120618 mexRunMexFile+00000090 [ 12] 0x00007f4c56037a94 /data/mat14a/bin/glnxa64/libmex.so+00105108 [ 13] 0x00007f4c56038fb4 /data/mat14a/bin/glnxa64/libmex.so+00110516 [ 14] 0x00007f4c55432ad9 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 15] 0x00007f4c5450920e /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 16] 0x00007f4c544c41d0 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02318800 [ 17] 0x00007f4c544c61ea /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02327018 [ 18] 0x00007f4c544c9167 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 19] 0x00007f4c544c726f 
/data/mat14a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 20] 0x00007f4c544c7ec4 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 21] 0x00007f4c5452530b /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 22] 0x00007f4c55432ad9 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 23] 0x00007f4c5506a6e8 /data/mat14a/bin/glnxa64/libmwmcos.so+01566440 [ 24] 0x00007f4c55014482 /data/mat14a/bin/glnxa64/libmwmcos.so+01213570 [ 25] 0x00007f4c55016465 /data/mat14a/bin/glnxa64/libmwmcos.so+01221733 [ 26] 0x00007f4c55018e50 /data/mat14a/bin/glnxa64/libmwmcos.so+01232464 [ 27] 0x00007f4c5501673d /data/mat14a/bin/glnxa64/libmwmcos.so+01222461 [ 28] 0x00007f4c5506d126 /data/mat14a/bin/glnxa64/libmwmcos.so+01577254 [ 29] 0x00007f4c550d955b /data/mat14a/bin/glnxa64/libmwmcos.so+02020699 [ 30] 0x00007f4c553e1874 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00338036 _ZN13Mfh_MATLAB_fn11dispatch_fhEiPP11mxArraytagiS2+00000244 [ 31] 0x00007f4c550d9031 /data/mat14a/bin/glnxa64/libmwmcos.so+02019377 [ 32] 0x00007f4c5450920e /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 33] 0x00007f4c544c41d0 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02318800 [ 34] 0x00007f4c544c61ea /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02327018 [ 35] 0x00007f4c544c9167 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 36] 0x00007f4c544c726f /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 37] 0x00007f4c544c7ec4 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 38] 0x00007f4c5452530b /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 39] 0x00007f4c55432ad9 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 40] 0x00007f4c5506a6e8 /data/mat14a/bin/glnxa64/libmwmcos.so+01566440 [ 41] 0x00007f4c5506a995 /data/mat14a/bin/glnxa64/libmwmcos.so+01567125 [ 42] 0x00007f4c55014236 /data/mat14a/bin/glnxa64/libmwmcos.so+01212982 [ 43] 
0x00007f4c5501498c /data/mat14a/bin/glnxa64/libmwmcos.so+01214860 [ 44] 0x00007f4c5501638e /data/mat14a/bin/glnxa64/libmwmcos.so+01221518 [ 45] 0x00007f4c55018e50 /data/mat14a/bin/glnxa64/libmwmcos.so+01232464 [ 46] 0x00007f4c55016a9c /data/mat14a/bin/glnxa64/libmwmcos.so+01223324 [ 47] 0x00007f4c55016be9 /data/mat14a/bin/glnxa64/libmwmcos.so+01223657 [ 48] 0x00007f4c55016dcf /data/mat14a/bin/glnxa64/libmwmcos.so+01224143 [ 49] 0x00007f4c55016ff1 /data/mat14a/bin/glnxa64/libmwmcos.so+01224689 [ 50] 0x00007f4c55072717 /data/mat14a/bin/glnxa64/libmwmcos.so+01599255 [ 51] 0x00007f4c550d91f8 /data/mat14a/bin/glnxa64/libmwmcos.so+02019832 [ 52] 0x00007f4c553e1874 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00338036 _ZN13Mfh_MATLAB_fn11dispatch_fhEiPP11mxArraytagiS2+00000244 [ 53] 0x00007f4c550d9031 /data/mat14a/bin/glnxa64/libmwmcos.so+02019377 [ 54] 0x00007f4c5450920e /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 55] 0x00007f4c544c41d0 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02318800 [ 56] 0x00007f4c544c61ea /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02327018 [ 57] 0x00007f4c544c9167 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 58] 0x00007f4c544c726f /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 59] 0x00007f4c544c7ec4 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 60] 0x00007f4c5452530b /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 61] 0x00007f4c55432ad9 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 62] 0x00007f4c5450920e /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 63] 0x00007f4c544c41d0 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02318800 [ 64] 0x00007f4c544c62b0 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02327216 [ 65] 0x00007f4c544c9167 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 66] 0x00007f4c544c726f /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 67] 0x00007f4c544c8245 
/data/mat14a/bin/glnxa64/libmwm_interpreter.so+02335301 [ 68] 0x00007f4c544b9a4f /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02275919 [ 69] 0x00007f4c544becb9 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02297017 [ 70] 0x00007f4c544bc979 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02287993 [ 71] 0x00007f4c544bced6 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02289366 [ 72] 0x00007f4c544b7f08 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02268936 [ 73] 0x00007f4c544b829a /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02269850 [ 74] 0x00007f4c553f4de3 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00417251 [ 75] 0x00007f4c553e1874 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00338036 _ZN13Mfh_MATLAB_fn11dispatch_fhEiPP11mxArraytagiS2+00000244 [ 76] 0x00007f4c5450920e /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 77] 0x00007f4c544c41d0 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02318800 [ 78] 0x00007f4c544c61ea /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02327018 [ 79] 0x00007f4c544c9167 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 80] 0x00007f4c544c726f /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 81] 0x00007f4c544c7ec4 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 82] 0x00007f4c5452530b /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 83] 0x00007f4c55432ad9 /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 84] 0x00007f4c5450920e /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 85] 0x00007f4c544c41d0 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02318800 [ 86] 0x00007f4c544c61ea /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02327018 [ 87] 0x00007f4c544c9167 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 88] 0x00007f4c544c726f /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 89] 0x00007f4c544c7ec4 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 90] 0x00007f4c5452530b 
/data/mat14a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 91] 0x00007f4c55432c5f /data/mat14a/bin/glnxa64/libmwm_dispatcher.so+00670815 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00001087 [ 92] 0x00007f4c544f8135 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02531637 [ 93] 0x00007f4c544bf0d9 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02298073 [ 94] 0x00007f4c544bbdc7 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02284999 [ 95] 0x00007f4c544bc193 /data/mat14a/bin/glnxa64/libmwm_interpreter.so+02285971 [ 96] 0x00007f4c56265afc /data/mat14a/bin/glnxa64/libmwbridge.so+00142076 [ 97] 0x00007f4c56266791 /data/mat14a/bin/glnxa64/libmwbridge.so+00145297 _Z8mnParserv+00000721 [ 98] 0x00007f4c5f51c92f /data/mat14a/bin/glnxa64/libmwmcr.so+00489775 _ZN11mcrInstance30mnParser_on_interpreter_threadEv+00000031 [ 99] 0x00007f4c5f4fdb6d /data/mat14a/bin/glnxa64/libmwmcr.so+00363373 [100] 0x00007f4c5f4fdbe9 /data/mat14a/bin/glnxa64/libmwmcr.so+00363497 [101] 0x00007f4c53bf1d46 /data/mat14a/bin/glnxa64/libmwuix.so+00343366 [102] 0x00007f4c53bd4382 /data/mat14a/bin/glnxa64/libmwuix.so+00222082 [103] 0x00007f4c5fc7250f /data/mat14a/bin/glnxa64/libmwservices.so+02323727 [104] 0x00007f4c5fc7267c /data/mat14a/bin/glnxa64/libmwservices.so+02324092 [105] 0x00007f4c5fc6e57f /data/mat14a/bin/glnxa64/libmwservices.so+02307455 [106] 0x00007f4c5fc739b5 /data/mat14a/bin/glnxa64/libmwservices.so+02329013 [107] 0x00007f4c5fc73de7 /data/mat14a/bin/glnxa64/libmwservices.so+02330087 [108] 0x00007f4c5fc744c0 /data/mat14a/bin/glnxa64/libmwservices.so+02331840 _Z25svWS_ProcessPendingEventsiib+00000080 [109] 0x00007f4c5f4fe098 /data/mat14a/bin/glnxa64/libmwmcr.so+00364696 [110] 0x00007f4c5f4fe3bf /data/mat14a/bin/glnxa64/libmwmcr.so+00365503 [111] 0x00007f4c5f4f928f /data/mat14a/bin/glnxa64/libmwmcr.so+00344719 [112] 0x00007f4c5e4a7182 /lib/x86_64-linux-gnu/libpthread.so.0+00033154 [113] 0x00007f4c5e1d447d /lib/x86_64-linux-gnu/libc.so.6+01025149 clone+00000109

This error was detected while a MEX-file was running. If the MEX-file is not an official MathWorks function, please examine its source code for errors. Please consult the External Interfaces Guide for information on debugging MEX-files.

If this problem is reproducible, please submit a Service Request via: http://www.mathworks.com/support/contact_us/

A technical support engineer might contact you with further information.

Thank you for your help.

ShaoqingRen commented 9 years ago

@euwern Please uncomment % caffe.init_log(fullfile(pwd, 'caffe_log')); on line 31 of script_faster_rcnn_demo.m, run the script again, and check the error message in the log.

euwern commented 9 years ago

After uncommenting line 31, I get the following error: Undefined variable "caffe" or class "caffe.init_log".

Error in script_faster_rcnn_demo (line 31) caffe.init_log(fullfile(pwd, 'caffe_log'));

Error in run (line 63) evalin('caller', [script ';']);

ShaoqingRen commented 9 years ago

It is in https://github.com/ShaoqingRen/caffe/blob/faster-R-CNN/matlab/%2Bcaffe/init_log.m. Did you use the caffe in https://github.com/ShaoqingRen/faster_rcnn/tree/master/external?
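For anyone hitting the same "Undefined variable caffe" error: external/caffe is a git submodule, so a plain clone leaves it empty. A quick sanity check might look like this sketch (the marker path assumes the default repository layout from this thread):

```shell
# Check that the external/caffe submodule was actually populated; a clone
# without --recursive leaves it as an empty directory.
SUBMODULE_MARKER=external/caffe/matlab/+caffe/init_log.m
if [ -f "$SUBMODULE_MARKER" ]; then
  echo "external/caffe is populated"
else
  echo "external/caffe is empty; run: git submodule update --init --recursive"
fi
```

If the marker file is missing, `git submodule update --init --recursive` (run from the repo root) should fetch the bundled caffe fork rather than upstream caffe.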

euwern commented 9 years ago

The external/caffe directory was empty in my checkout (hence I grabbed caffe from the original source). I removed the entire faster_rcnn directory and started from scratch. The following is what I have done so far:

  1. I cloned the entire repo with --recursive (the intention being to grab the source in external/caffe).
  2. I copied external/caffe to experiment/external/caffe.
  3. I ran "make all" and "make matcaffe" in both external/caffe and experiment/external/caffe.
  4. I launched MATLAB and ran "faster_rcnn_build" and "startup", followed by "fetch_data/fetch_faster_rcnn_final_model.m".
  5. I changed one line of code in script_faster_rcnn_demo.m, as it was not looking in the correct folder when loading "faster_rcnn/output/faster_rcnn_final/faster_rcnn_VOC0712_vgg_16layers".
  6. I ran "experiments/script_faster_rcnn_demo.m" and got the following error:

../faster_rcnn/experiments/caffe_log Invalid MEX-file '../faster_rcnn/bin/nms_gpu_mex.mexa64': libcudart.so.6.5: cannot open shared object file: No such file or directory

Error in nms (line 37) pick = nms_gpu_mex(single(boxes)', double(overlap));

Error in script_faster_rcnn_demo>boxes_filter (line 148) aboxes = aboxes(nms(aboxes, nms_overlap_thres, use_gpu), :);

Error in script_faster_rcnn_demo (line 55) aboxes = boxes_filter([boxes, scores], opts.per_nms_topN, opts.nms_overlap_thres, opts.after_nms_topN, opts.use_gpu);

Error in run (line 63) evalin('caller', [script ';']);
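The shell portion of the steps above can be sketched as a dry run (the `run` helper only echoes each command, so this is safe to execute anywhere; swap the echo for the real command to actually perform the clone and build):

```shell
# Dry-run sketch of steps 1-3 above; 'run' echoes instead of executing.
run() { echo "+ $*"; }
run git clone --recursive https://github.com/ShaoqingRen/faster_rcnn.git
run cp -r faster_rcnn/external/caffe faster_rcnn/experiments/external/caffe
run make -C faster_rcnn/external/caffe all matcaffe
```

Steps 4-5 (faster_rcnn_build, startup, fetching the final model) happen inside MATLAB and are not covered by this sketch.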

YingjieYin commented 9 years ago

I tried to run "experiments/script_faster_rcnn_demo.m" as specified in the instructions, and MATLAB stopped at "output_blobs = caffe_net.forward(net_inputs);" in "proposal_im_detect.m". I get the following error:

fastrcnn startup done GPU 1: free memory 12538343424 GPU 2: free memory 12552810496 Use GPU 2 Error using caffe glog check error, please check log and clear mex

Error in caffe.Blob/setdata (line 27) caffe('blob_set_data', self.hBlob_self, data);

Error in caffe.Net/forward (line 140) self.blobs(self.inputs{n}).set_data(input_data{n});

Error in proposal_im_detect (line 23) output_blobs = caffe_net.forward(net_inputs);

Error in script_faster_rcnn_demo (line 54) [boxes, scores] = proposal_im_detect(proposal_detection_model.conf_proposal, rpn_net, im);

ShaoqingRen commented 8 years ago

@euwern According to the error message ../faster_rcnn/experiments/caffe_log Invalid MEX-file '../faster_rcnn/bin/nms_gpu_mex.mexa64': libcudart.so.6.5: cannot open shared object file: No such file or directory

You built nms_gpu_mex with CUDA 6.5 (configured in functions/nms/nvmex.m). Do you have CUDA 6.5 installed?
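A quick way to check is something like the following sketch, which looks for the runtime library the MEX file was linked against and, if present, makes it visible to MATLAB launched from the same shell (/usr/local/cuda-6.5 is an assumed install prefix; adjust to your system):

```shell
# Look for the CUDA 6.5 runtime that nms_gpu_mex.mexa64 was linked against.
CUDA_LIB=/usr/local/cuda-6.5/lib64
if [ -e "$CUDA_LIB/libcudart.so.6.5" ]; then
  # Expose the library to MATLAB before launching it from this shell.
  export LD_LIBRARY_PATH="$CUDA_LIB:$LD_LIBRARY_PATH"
  echo "libcudart.so.6.5 found"
else
  echo "libcudart.so.6.5 not found under $CUDA_LIB"
fi
```

If the library exists but MATLAB still cannot load it, starting MATLAB from a shell where LD_LIBRARY_PATH has been exported (rather than from a desktop launcher) is usually the difference.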

ShaoqingRen commented 8 years ago

@YingjieYin Please also uncomment % caffe.init_log(fullfile(pwd, 'caffe_log')); on line 31 of script_faster_rcnn_demo.m, run the script again, and check the error message in the log.

northeastsquare commented 8 years ago

@ShaoqingRen Hello, I met the same problem. I uncomment % caffe.init_log(fullfile(pwd, 'caffe_log')); in Line 31,got log: Log file created at: 2015/11/08 13:24:31 Running on machine: silva-XPS-8300 Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg I1108 13:24:31.010450 3340 net.cpp:42] Initializing net from parameters: name: "VGG_ILSVRC_16" input: "data" input_dim: 1 input_dim: 3 input_dim: 224 input_dim: 224 state { phase: TEST } layer { name: "conv1_1" type: "Convolution" bottom: "data" top: "conv1_1" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 64 pad: 1 kernel_size: 3 } } layer { name: "relu1_1" type: "ReLU" bottom: "conv1_1" top: "conv1_1" } layer { name: "conv1_2" type: "Convolution" bottom: "conv1_1" top: "conv1_2" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 64 pad: 1 kernel_size: 3 } } layer { name: "relu1_2" type: "ReLU" bottom: "conv1_2" top: "conv1_2" } layer { name: "pool1" type: "Pooling" bottom: "conv1_2" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "conv2_1" type: "Convolution" bottom: "pool1" top: "conv2_1" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 128 pad: 1 kernel_size: 3 } } layer { name: "relu2_1" type: "ReLU" bottom: "conv2_1" top: "conv2_1" } layer { name: "conv2_2" type: "Convolution" bottom: "conv2_1" top: "conv2_2" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 128 pad: 1 kernel_size: 3 } } layer { name: "relu2_2" type: "ReLU" bottom: "conv2_2" top: "conv2_2" } layer { name: "pool2" type: "Pooling" bottom: "conv2_2" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "conv3_1" type: "Convolution" bottom: "pool2" top: "conv3_1" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 256 pad: 1 kernel_size: 3 } } layer { name: "relu3_1" type: "ReLU" bottom: "conv3_1" top: "conv3_1" } layer { name: "conv3_2" type: 
"Convolution" bottom: "conv3_1" top: "conv3_2" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 256 pad: 1 kernel_size: 3 } } layer { name: "relu3_2" type: "ReLU" bottom: "conv3_2" top: "conv3_2" } layer { name: "conv3_3" type: "Convolution" bottom: "conv3_2" top: "conv3_3" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 256 pad: 1 kernel_size: 3 } } layer { name: "relu3_3" type: "ReLU" bottom: "conv3_3" top: "conv3_3" } layer { name: "pool3" type: "Pooling" bottom: "conv3_3" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "conv4_1" type: "Convolution" bottom: "pool3" top: "conv4_1" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 } } layer { name: "relu4_1" type: "ReLU" bottom: "conv4_1" top: "conv4_1" } layer { name: "conv4_2" type: "Convolution" bottom: "conv4_1" top: "conv4_2" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 } } layer { name: "relu4_2" type: "ReLU" bottom: "conv4_2" top: "conv4_2" } layer { name: "conv4_3" type: "Convolution" bottom: "conv4_2" top: "conv4_3" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 } } layer { name: "relu4_3" type: "ReLU" bottom: "conv4_3" top: "conv4_3" } layer { name: "pool4" type: "Pooling" bottom: "conv4_3" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "conv5_1" type: "Convolution" bottom: "pool4" top: "conv5_1" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 } } layer { name: "relu5_1" type: "ReLU" bottom: "conv5_1" top: "conv5_1" } layer { name: "conv5_2" type: "Convolution" bottom: "conv5_1" top: "conv5_2" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 } } layer { name: "relu5_2" type: "ReLU" bottom: "conv5_2" top: "conv5_2" } layer { name: 
"conv5_3" type: "Convolution" bottom: "conv5_2" top: "conv5_3" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 } } layer { name: "relu5_3" type: "ReLU" bottom: "conv5_3" top: "conv5_3" } layer { name: "conv_proposal1" type: "Convolution" bottom: "conv5_3" top: "conv_proposal1" param { lr_mult: 1 } param { lr_mult: 2 } convolution_param { num_output: 512 pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } } layer { name: "relu_proposal1" type: "ReLU" bottom: "conv_proposal1" top: "conv_proposal1" } layer { name: "proposal_cls_score" type: "Convolution" bottom: "conv_proposal1" top: "proposal_cls_score" param { lr_mult: 1 } param { lr_mult: 2 } convolution_param { num_output: 18 pad: 0 kernel_size: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } } layer { name: "proposal_bbox_pred" type: "Convolution" bottom: "conv_proposal1" top: "proposal_bbox_pred" param { lr_mult: 1 } param { lr_mult: 2 } convolution_param { num_output: 36 pad: 0 kernel_size: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } } layer { name: "proposal_cls_score_reshape" type: "Reshape" bottom: "proposal_cls_score" top: "proposal_cls_score_reshape" reshape_param { shape { dim: 0 dim: 2 dim: -1 dim: 0 } } } layer { name: "proposal_cls_prob" type: "Softmax" bottom: "proposal_cls_score_reshape" top: "proposal_cls_prob" } I1108 13:24:31.017549 3340 net.cpp:380] Input 0 -> data I1108 13:24:31.018478 3340 layer_factory.hpp:74] Creating layer conv1_1 I1108 13:24:31.018877 3340 net.cpp:90] Creating Layer conv1_1 I1108 13:24:31.018892 3340 net.cpp:420] conv1_1 <- data I1108 13:24:31.018904 3340 net.cpp:378] conv1_1 -> conv1_1 I1108 13:24:31.018920 3340 net.cpp:120] Setting up conv1_1 I1108 13:24:31.094996 3340 net.cpp:127] Top shape: 1 64 224 224 (3211264) I1108 13:24:31.095026 3340 
layer_factory.hpp:74] Creating layer relu1_1 I1108 13:24:31.095036 3340 net.cpp:90] Creating Layer relu1_1 I1108 13:24:31.095042 3340 net.cpp:420] relu1_1 <- conv1_1 I1108 13:24:31.095048 3340 net.cpp:367] relu1_1 -> conv1_1 (in-place) I1108 13:24:31.095057 3340 net.cpp:120] Setting up relu1_1 I1108 13:24:31.095116 3340 net.cpp:127] Top shape: 1 64 224 224 (3211264) I1108 13:24:31.095123 3340 layer_factory.hpp:74] Creating layer conv1_2 I1108 13:24:31.095131 3340 net.cpp:90] Creating Layer conv1_2 I1108 13:24:31.095135 3340 net.cpp:420] conv1_2 <- conv1_1 I1108 13:24:31.095141 3340 net.cpp:378] conv1_2 -> conv1_2 I1108 13:24:31.095149 3340 net.cpp:120] Setting up conv1_2 I1108 13:24:31.095491 3340 net.cpp:127] Top shape: 1 64 224 224 (3211264) I1108 13:24:31.095502 3340 layer_factory.hpp:74] Creating layer relu1_2 I1108 13:24:31.095510 3340 net.cpp:90] Creating Layer relu1_2 I1108 13:24:31.095513 3340 net.cpp:420] relu1_2 <- conv1_2 I1108 13:24:31.095520 3340 net.cpp:367] relu1_2 -> conv1_2 (in-place) I1108 13:24:31.095525 3340 net.cpp:120] Setting up relu1_2 I1108 13:24:31.095578 3340 net.cpp:127] Top shape: 1 64 224 224 (3211264) I1108 13:24:31.095583 3340 layer_factory.hpp:74] Creating layer pool1 I1108 13:24:31.095594 3340 net.cpp:90] Creating Layer pool1 I1108 13:24:31.095599 3340 net.cpp:420] pool1 <- conv1_2 I1108 13:24:31.095604 3340 net.cpp:378] pool1 -> pool1 I1108 13:24:31.095610 3340 net.cpp:120] Setting up pool1 I1108 13:24:31.097164 3340 net.cpp:127] Top shape: 1 64 112 112 (802816) I1108 13:24:31.097187 3340 layer_factory.hpp:74] Creating layer conv2_1 I1108 13:24:31.097205 3340 net.cpp:90] Creating Layer conv2_1 I1108 13:24:31.097210 3340 net.cpp:420] conv2_1 <- pool1 I1108 13:24:31.097216 3340 net.cpp:378] conv2_1 -> conv2_1 I1108 13:24:31.097224 3340 net.cpp:120] Setting up conv2_1 I1108 13:24:31.097646 3340 net.cpp:127] Top shape: 1 128 112 112 (1605632) I1108 13:24:31.097659 3340 layer_factory.hpp:74] Creating layer relu2_1 I1108 13:24:31.097666 
3340 net.cpp:90] Creating Layer relu2_1 I1108 13:24:31.097671 3340 net.cpp:420] relu2_1 <- conv2_1 I1108 13:24:31.097676 3340 net.cpp:367] relu2_1 -> conv2_1 (in-place) I1108 13:24:31.097681 3340 net.cpp:120] Setting up relu2_1 I1108 13:24:31.097842 3340 net.cpp:127] Top shape: 1 128 112 112 (1605632) I1108 13:24:31.097851 3340 layer_factory.hpp:74] Creating layer conv2_2 I1108 13:24:31.097857 3340 net.cpp:90] Creating Layer conv2_2 I1108 13:24:31.097862 3340 net.cpp:420] conv2_2 <- conv2_1 I1108 13:24:31.097868 3340 net.cpp:378] conv2_2 -> conv2_2 I1108 13:24:31.097874 3340 net.cpp:120] Setting up conv2_2 I1108 13:24:31.098400 3340 net.cpp:127] Top shape: 1 128 112 112 (1605632) I1108 13:24:31.098410 3340 layer_factory.hpp:74] Creating layer relu2_2 I1108 13:24:31.098418 3340 net.cpp:90] Creating Layer relu2_2 I1108 13:24:31.098423 3340 net.cpp:420] relu2_2 <- conv2_2 I1108 13:24:31.098428 3340 net.cpp:367] relu2_2 -> conv2_2 (in-place) I1108 13:24:31.098434 3340 net.cpp:120] Setting up relu2_2 I1108 13:24:31.098491 3340 net.cpp:127] Top shape: 1 128 112 112 (1605632) I1108 13:24:31.098497 3340 layer_factory.hpp:74] Creating layer pool2 I1108 13:24:31.098505 3340 net.cpp:90] Creating Layer pool2 I1108 13:24:31.098508 3340 net.cpp:420] pool2 <- conv2_2 I1108 13:24:31.098515 3340 net.cpp:378] pool2 -> pool2 I1108 13:24:31.098520 3340 net.cpp:120] Setting up pool2 I1108 13:24:31.098584 3340 net.cpp:127] Top shape: 1 128 56 56 (401408) I1108 13:24:31.098590 3340 layer_factory.hpp:74] Creating layer conv3_1 I1108 13:24:31.098598 3340 net.cpp:90] Creating Layer conv3_1 I1108 13:24:31.098626 3340 net.cpp:420] conv3_1 <- pool2 I1108 13:24:31.098636 3340 net.cpp:378] conv3_1 -> conv3_1 I1108 13:24:31.098644 3340 net.cpp:120] Setting up conv3_1 I1108 13:24:31.099407 3340 net.cpp:127] Top shape: 1 256 56 56 (802816) I1108 13:24:31.099421 3340 layer_factory.hpp:74] Creating layer relu3_1 I1108 13:24:31.099426 3340 net.cpp:90] Creating Layer relu3_1 I1108 13:24:31.099431 3340 
net.cpp:420] relu3_1 <- conv3_1 I1108 13:24:31.099438 3340 net.cpp:367] relu3_1 -> conv3_1 (in-place) I1108 13:24:31.099444 3340 net.cpp:120] Setting up relu3_1 I1108 13:24:31.099611 3340 net.cpp:127] Top shape: 1 256 56 56 (802816) I1108 13:24:31.099618 3340 layer_factory.hpp:74] Creating layer conv3_2 I1108 13:24:31.099627 3340 net.cpp:90] Creating Layer conv3_2 I1108 13:24:31.099630 3340 net.cpp:420] conv3_2 <- conv3_1 I1108 13:24:31.099637 3340 net.cpp:378] conv3_2 -> conv3_2 I1108 13:24:31.099645 3340 net.cpp:120] Setting up conv3_2 I1108 13:24:31.100844 3340 net.cpp:127] Top shape: 1 256 56 56 (802816) I1108 13:24:31.100857 3340 layer_factory.hpp:74] Creating layer relu3_2 I1108 13:24:31.100862 3340 net.cpp:90] Creating Layer relu3_2 I1108 13:24:31.100867 3340 net.cpp:420] relu3_2 <- conv3_2 I1108 13:24:31.100874 3340 net.cpp:367] relu3_2 -> conv3_2 (in-place) I1108 13:24:31.100880 3340 net.cpp:120] Setting up relu3_2 I1108 13:24:31.100950 3340 net.cpp:127] Top shape: 1 256 56 56 (802816) I1108 13:24:31.100955 3340 layer_factory.hpp:74] Creating layer conv3_3 I1108 13:24:31.100963 3340 net.cpp:90] Creating Layer conv3_3 I1108 13:24:31.100968 3340 net.cpp:420] conv3_3 <- conv3_2 I1108 13:24:31.100975 3340 net.cpp:378] conv3_3 -> conv3_3 I1108 13:24:31.100981 3340 net.cpp:120] Setting up conv3_3 I1108 13:24:31.102169 3340 net.cpp:127] Top shape: 1 256 56 56 (802816) I1108 13:24:31.102181 3340 layer_factory.hpp:74] Creating layer relu3_3 I1108 13:24:31.102190 3340 net.cpp:90] Creating Layer relu3_3 I1108 13:24:31.102195 3340 net.cpp:420] relu3_3 <- conv3_3 I1108 13:24:31.102200 3340 net.cpp:367] relu3_3 -> conv3_3 (in-place) I1108 13:24:31.102205 3340 net.cpp:120] Setting up relu3_3 I1108 13:24:31.102267 3340 net.cpp:127] Top shape: 1 256 56 56 (802816) I1108 13:24:31.102272 3340 layer_factory.hpp:74] Creating layer pool3 I1108 13:24:31.102284 3340 net.cpp:90] Creating Layer pool3 I1108 13:24:31.102288 3340 net.cpp:420] pool3 <- conv3_3 I1108 13:24:31.102294 
3340 net.cpp:378] pool3 -> pool3 I1108 13:24:31.102300 3340 net.cpp:120] Setting up pool3 I1108 13:24:31.102365 3340 net.cpp:127] Top shape: 1 256 28 28 (200704) I1108 13:24:31.102371 3340 layer_factory.hpp:74] Creating layer conv4_1 I1108 13:24:31.102377 3340 net.cpp:90] Creating Layer conv4_1 I1108 13:24:31.102381 3340 net.cpp:420] conv4_1 <- pool3 I1108 13:24:31.102387 3340 net.cpp:378] conv4_1 -> conv4_1 I1108 13:24:31.102393 3340 net.cpp:120] Setting up conv4_1 I1108 13:24:31.104051 3340 net.cpp:127] Top shape: 1 512 28 28 (401408) I1108 13:24:31.104074 3340 layer_factory.hpp:74] Creating layer relu4_1 I1108 13:24:31.104084 3340 net.cpp:90] Creating Layer relu4_1 I1108 13:24:31.104087 3340 net.cpp:420] relu4_1 <- conv4_1 I1108 13:24:31.104095 3340 net.cpp:367] relu4_1 -> conv4_1 (in-place) I1108 13:24:31.104104 3340 net.cpp:120] Setting up relu4_1 I1108 13:24:31.104265 3340 net.cpp:127] Top shape: 1 512 28 28 (401408) I1108 13:24:31.104272 3340 layer_factory.hpp:74] Creating layer conv4_2 I1108 13:24:31.104279 3340 net.cpp:90] Creating Layer conv4_2 I1108 13:24:31.104285 3340 net.cpp:420] conv4_2 <- conv4_1 I1108 13:24:31.104292 3340 net.cpp:378] conv4_2 -> conv4_2 I1108 13:24:31.104300 3340 net.cpp:120] Setting up conv4_2 I1108 13:24:31.107848 3340 net.cpp:127] Top shape: 1 512 28 28 (401408) I1108 13:24:31.107877 3340 layer_factory.hpp:74] Creating layer relu4_2 I1108 13:24:31.107888 3340 net.cpp:90] Creating Layer relu4_2 I1108 13:24:31.107893 3340 net.cpp:420] relu4_2 <- conv4_2 I1108 13:24:31.107899 3340 net.cpp:367] relu4_2 -> conv4_2 (in-place) I1108 13:24:31.107906 3340 net.cpp:120] Setting up relu4_2 I1108 13:24:31.108000 3340 net.cpp:127] Top shape: 1 512 28 28 (401408) I1108 13:24:31.108006 3340 layer_factory.hpp:74] Creating layer conv4_3 I1108 13:24:31.108014 3340 net.cpp:90] Creating Layer conv4_3 I1108 13:24:31.108018 3340 net.cpp:420] conv4_3 <- conv4_2 I1108 13:24:31.108026 3340 net.cpp:378] conv4_3 -> conv4_3 I1108 13:24:31.108033 3340 
net.cpp:120] Setting up conv4_3 I1108 13:24:31.111701 3340 net.cpp:127] Top shape: 1 512 28 28 (401408) I1108 13:24:31.111728 3340 layer_factory.hpp:74] Creating layer relu4_3 I1108 13:24:31.111737 3340 net.cpp:90] Creating Layer relu4_3 I1108 13:24:31.111743 3340 net.cpp:420] relu4_3 <- conv4_3 I1108 13:24:31.111752 3340 net.cpp:367] relu4_3 -> conv4_3 (in-place) I1108 13:24:31.111759 3340 net.cpp:120] Setting up relu4_3 I1108 13:24:31.111820 3340 net.cpp:127] Top shape: 1 512 28 28 (401408) I1108 13:24:31.111825 3340 layer_factory.hpp:74] Creating layer pool4 I1108 13:24:31.111834 3340 net.cpp:90] Creating Layer pool4 I1108 13:24:31.111837 3340 net.cpp:420] pool4 <- conv4_3 I1108 13:24:31.111842 3340 net.cpp:378] pool4 -> pool4 I1108 13:24:31.111848 3340 net.cpp:120] Setting up pool4 I1108 13:24:31.111912 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.111917 3340 layer_factory.hpp:74] Creating layer conv5_1 I1108 13:24:31.111927 3340 net.cpp:90] Creating Layer conv5_1 I1108 13:24:31.111932 3340 net.cpp:420] conv5_1 <- pool4 I1108 13:24:31.111937 3340 net.cpp:378] conv5_1 -> conv5_1 I1108 13:24:31.111943 3340 net.cpp:120] Setting up conv5_1 I1108 13:24:31.115190 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.115217 3340 layer_factory.hpp:74] Creating layer relu5_1 I1108 13:24:31.115227 3340 net.cpp:90] Creating Layer relu5_1 I1108 13:24:31.115232 3340 net.cpp:420] relu5_1 <- conv5_1 I1108 13:24:31.115241 3340 net.cpp:367] relu5_1 -> conv5_1 (in-place) I1108 13:24:31.115248 3340 net.cpp:120] Setting up relu5_1 I1108 13:24:31.115413 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.115420 3340 layer_factory.hpp:74] Creating layer conv5_2 I1108 13:24:31.115429 3340 net.cpp:90] Creating Layer conv5_2 I1108 13:24:31.115433 3340 net.cpp:420] conv5_2 <- conv5_1 I1108 13:24:31.115439 3340 net.cpp:378] conv5_2 -> conv5_2 I1108 13:24:31.115448 3340 net.cpp:120] Setting up conv5_2 I1108 13:24:31.119441 3340 net.cpp:127] Top 
shape: 1 512 14 14 (100352) I1108 13:24:31.119467 3340 layer_factory.hpp:74] Creating layer relu5_2 I1108 13:24:31.119477 3340 net.cpp:90] Creating Layer relu5_2 I1108 13:24:31.119483 3340 net.cpp:420] relu5_2 <- conv5_2 I1108 13:24:31.119489 3340 net.cpp:367] relu5_2 -> conv5_2 (in-place) I1108 13:24:31.119498 3340 net.cpp:120] Setting up relu5_2 I1108 13:24:31.119560 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.119565 3340 layer_factory.hpp:74] Creating layer conv5_3 I1108 13:24:31.119575 3340 net.cpp:90] Creating Layer conv5_3 I1108 13:24:31.119578 3340 net.cpp:420] conv5_3 <- conv5_2 I1108 13:24:31.119585 3340 net.cpp:378] conv5_3 -> conv5_3 I1108 13:24:31.119592 3340 net.cpp:120] Setting up conv5_3 I1108 13:24:31.122876 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.122901 3340 layer_factory.hpp:74] Creating layer relu5_3 I1108 13:24:31.122912 3340 net.cpp:90] Creating Layer relu5_3 I1108 13:24:31.122917 3340 net.cpp:420] relu5_3 <- conv5_3 I1108 13:24:31.122925 3340 net.cpp:367] relu5_3 -> conv5_3 (in-place) I1108 13:24:31.122932 3340 net.cpp:120] Setting up relu5_3 I1108 13:24:31.122993 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.122998 3340 layer_factory.hpp:74] Creating layer conv_proposal1 I1108 13:24:31.123008 3340 net.cpp:90] Creating Layer conv_proposal1 I1108 13:24:31.123013 3340 net.cpp:420] conv_proposal1 <- conv5_3 I1108 13:24:31.123018 3340 net.cpp:378] conv_proposal1 -> conv_proposal1 I1108 13:24:31.123026 3340 net.cpp:120] Setting up conv_proposal1 I1108 13:24:31.159159 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.159188 3340 layer_factory.hpp:74] Creating layer relu_proposal1 I1108 13:24:31.159224 3340 net.cpp:90] Creating Layer relu_proposal1 I1108 13:24:31.159231 3340 net.cpp:420] relu_proposal1 <- conv_proposal1 I1108 13:24:31.159240 3340 net.cpp:367] relu_proposal1 -> conv_proposal1 (in-place) I1108 13:24:31.159250 3340 net.cpp:120] Setting up relu_proposal1 
I1108 13:24:31.159416 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.159425 3340 layer_factory.hpp:74] Creating layer conv_proposal1_relu_proposal1_0_split I1108 13:24:31.159441 3340 net.cpp:90] Creating Layer conv_proposal1_relu_proposal1_0_split I1108 13:24:31.159446 3340 net.cpp:420] conv_proposal1_relu_proposal1_0_split <- conv_proposal1 I1108 13:24:31.159451 3340 net.cpp:378] conv_proposal1_relu_proposal1_0_split -> conv_proposal1_relu_proposal1_0_split_0 I1108 13:24:31.159459 3340 net.cpp:378] conv_proposal1_relu_proposal1_0_split -> conv_proposal1_relu_proposal1_0_split_1 I1108 13:24:31.159466 3340 net.cpp:120] Setting up conv_proposal1_relu_proposal1_0_split I1108 13:24:31.159473 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.159478 3340 net.cpp:127] Top shape: 1 512 14 14 (100352) I1108 13:24:31.159482 3340 layer_factory.hpp:74] Creating layer proposal_cls_score I1108 13:24:31.159493 3340 net.cpp:90] Creating Layer proposal_cls_score I1108 13:24:31.159497 3340 net.cpp:420] proposal_cls_score <- conv_proposal1_relu_proposal1_0_split_0 I1108 13:24:31.159505 3340 net.cpp:378] proposal_cls_score -> proposal_cls_score I1108 13:24:31.159512 3340 net.cpp:120] Setting up proposal_cls_score I1108 13:24:31.159975 3340 net.cpp:127] Top shape: 1 18 14 14 (3528) I1108 13:24:31.159986 3340 layer_factory.hpp:74] Creating layer proposal_bbox_pred I1108 13:24:31.159996 3340 net.cpp:90] Creating Layer proposal_bbox_pred I1108 13:24:31.160001 3340 net.cpp:420] proposal_bbox_pred <- conv_proposal1_relu_proposal1_0_split_1 I1108 13:24:31.160008 3340 net.cpp:378] proposal_bbox_pred -> proposal_bbox_pred I1108 13:24:31.160015 3340 net.cpp:120] Setting up proposal_bbox_pred I1108 13:24:31.160611 3340 net.cpp:127] Top shape: 1 36 14 14 (7056) I1108 13:24:31.160621 3340 layer_factory.hpp:74] Creating layer proposal_cls_score_reshape I1108 13:24:31.160632 3340 net.cpp:90] Creating Layer proposal_cls_score_reshape I1108 13:24:31.160637 3340 
net.cpp:420] proposal_cls_score_reshape <- proposal_cls_score I1108 13:24:31.160643 3340 net.cpp:378] proposal_cls_score_reshape -> proposal_cls_score_reshape I1108 13:24:31.160650 3340 net.cpp:120] Setting up proposal_cls_score_reshape I1108 13:24:31.160660 3340 net.cpp:127] Top shape: 1 2 126 14 (3528) I1108 13:24:31.160665 3340 layer_factory.hpp:74] Creating layer proposal_cls_prob I1108 13:24:31.160672 3340 net.cpp:90] Creating Layer proposal_cls_prob I1108 13:24:31.160677 3340 net.cpp:420] proposal_cls_prob <- proposal_cls_score_reshape I1108 13:24:31.160682 3340 net.cpp:378] proposal_cls_prob -> proposal_cls_prob I1108 13:24:31.160688 3340 net.cpp:120] Setting up proposal_cls_prob I1108 13:24:31.160770 3340 net.cpp:127] Top shape: 1 2 126 14 (3528) I1108 13:24:31.160776 3340 net.cpp:194] proposal_cls_prob does not need backward computation. I1108 13:24:31.160781 3340 net.cpp:194] proposal_cls_score_reshape does not need backward computation. I1108 13:24:31.160785 3340 net.cpp:194] proposal_bbox_pred does not need backward computation. I1108 13:24:31.160789 3340 net.cpp:194] proposal_cls_score does not need backward computation. I1108 13:24:31.160792 3340 net.cpp:194] conv_proposal1_relu_proposal1_0_split does not need backward computation. I1108 13:24:31.160796 3340 net.cpp:194] relu_proposal1 does not need backward computation. I1108 13:24:31.160800 3340 net.cpp:194] conv_proposal1 does not need backward computation. I1108 13:24:31.160804 3340 net.cpp:194] relu5_3 does not need backward computation. I1108 13:24:31.160807 3340 net.cpp:194] conv5_3 does not need backward computation. I1108 13:24:31.160811 3340 net.cpp:194] relu5_2 does not need backward computation. I1108 13:24:31.160815 3340 net.cpp:194] conv5_2 does not need backward computation. I1108 13:24:31.160830 3340 net.cpp:194] relu5_1 does not need backward computation. I1108 13:24:31.160835 3340 net.cpp:194] conv5_1 does not need backward computation. 
I1108 13:24:31.160838 3340 net.cpp:194] pool4 does not need backward computation. I1108 13:24:31.160842 3340 net.cpp:194] relu4_3 does not need backward computation. I1108 13:24:31.160846 3340 net.cpp:194] conv4_3 does not need backward computation. I1108 13:24:31.160851 3340 net.cpp:194] relu4_2 does not need backward computation. I1108 13:24:31.160854 3340 net.cpp:194] conv4_2 does not need backward computation. I1108 13:24:31.160858 3340 net.cpp:194] relu4_1 does not need backward computation. I1108 13:24:31.160861 3340 net.cpp:194] conv4_1 does not need backward computation. I1108 13:24:31.160866 3340 net.cpp:194] pool3 does not need backward computation. I1108 13:24:31.160869 3340 net.cpp:194] relu3_3 does not need backward computation. I1108 13:24:31.160873 3340 net.cpp:194] conv3_3 does not need backward computation. I1108 13:24:31.160877 3340 net.cpp:194] relu3_2 does not need backward computation. I1108 13:24:31.160881 3340 net.cpp:194] conv3_2 does not need backward computation. I1108 13:24:31.160884 3340 net.cpp:194] relu3_1 does not need backward computation. I1108 13:24:31.160888 3340 net.cpp:194] conv3_1 does not need backward computation. I1108 13:24:31.160892 3340 net.cpp:194] pool2 does not need backward computation. I1108 13:24:31.160895 3340 net.cpp:194] relu2_2 does not need backward computation. I1108 13:24:31.160899 3340 net.cpp:194] conv2_2 does not need backward computation. I1108 13:24:31.160903 3340 net.cpp:194] relu2_1 does not need backward computation. I1108 13:24:31.160907 3340 net.cpp:194] conv2_1 does not need backward computation. I1108 13:24:31.160910 3340 net.cpp:194] pool1 does not need backward computation. I1108 13:24:31.160914 3340 net.cpp:194] relu1_2 does not need backward computation. I1108 13:24:31.160918 3340 net.cpp:194] conv1_2 does not need backward computation. I1108 13:24:31.160923 3340 net.cpp:194] relu1_1 does not need backward computation. 
I1108 13:24:31.160925 3340 net.cpp:194] conv1_1 does not need backward computation. I1108 13:24:31.160929 3340 net.cpp:235] This network produces output proposal_bbox_pred I1108 13:24:31.160933 3340 net.cpp:235] This network produces output proposal_cls_prob I1108 13:24:31.160955 3340 net.cpp:492] Collecting Learning Rate and Weight Decay. I1108 13:24:31.160964 3340 net.cpp:247] Network initialization done. I1108 13:24:31.160969 3340 net.cpp:248] Memory required for data: 116077472 I1108 13:24:31.908174 3340 net.cpp:746] Copying source layer conv1_1 I1108 13:24:31.908220 3340 net.cpp:746] Copying source layer relu1_1 I1108 13:24:31.908229 3340 net.cpp:746] Copying source layer conv1_2 I1108 13:24:31.908289 3340 net.cpp:746] Copying source layer relu1_2 I1108 13:24:31.908298 3340 net.cpp:746] Copying source layer pool1 I1108 13:24:31.908305 3340 net.cpp:746] Copying source layer conv2_1 I1108 13:24:31.908414 3340 net.cpp:746] Copying source layer relu2_1 I1108 13:24:31.908423 3340 net.cpp:746] Copying source layer conv2_2 I1108 13:24:31.908630 3340 net.cpp:746] Copying source layer relu2_2 I1108 13:24:31.908640 3340 net.cpp:746] Copying source layer pool2 I1108 13:24:31.908646 3340 net.cpp:746] Copying source layer conv3_1 I1108 13:24:31.909065 3340 net.cpp:746] Copying source layer relu3_1 I1108 13:24:31.909078 3340 net.cpp:746] Copying source layer conv3_2 I1108 13:24:31.909947 3340 net.cpp:746] Copying source layer relu3_2 I1108 13:24:31.909965 3340 net.cpp:746] Copying source layer conv3_3 I1108 13:24:31.910823 3340 net.cpp:746] Copying source layer relu3_3 I1108 13:24:31.910838 3340 net.cpp:746] Copying source layer pool3 I1108 13:24:31.910845 3340 net.cpp:746] Copying source layer conv4_1 I1108 13:24:31.912566 3340 net.cpp:746] Copying source layer relu4_1 I1108 13:24:31.912592 3340 net.cpp:746] Copying source layer conv4_2 I1108 13:24:31.915818 3340 net.cpp:746] Copying source layer relu4_2 I1108 13:24:31.915887 3340 net.cpp:746] Copying source layer conv4_3 
I1108 13:24:31.919167 3340 net.cpp:746] Copying source layer relu4_3 I1108 13:24:31.919193 3340 net.cpp:746] Copying source layer pool4 I1108 13:24:31.919199 3340 net.cpp:746] Copying source layer conv5_1 I1108 13:24:31.922497 3340 net.cpp:746] Copying source layer relu5_1 I1108 13:24:31.922523 3340 net.cpp:746] Copying source layer conv5_2 I1108 13:24:31.925660 3340 net.cpp:746] Copying source layer relu5_2 I1108 13:24:31.925683 3340 net.cpp:746] Copying source layer conv5_3 I1108 13:24:31.928249 3340 net.cpp:746] Copying source layer relu5_3 I1108 13:24:31.928272 3340 net.cpp:746] Copying source layer conv_proposal1 I1108 13:24:31.931059 3340 net.cpp:746] Copying source layer relu_proposal1 I1108 13:24:31.931080 3340 net.cpp:746] Copying source layer conv_proposal1_relu_proposal1_0_split I1108 13:24:31.931084 3340 net.cpp:746] Copying source layer proposal_cls_score I1108 13:24:31.931102 3340 net.cpp:746] Copying source layer proposal_bbox_pred I1108 13:24:31.931128 3340 net.cpp:746] Copying source layer proposal_cls_score_reshape I1108 13:24:31.931131 3340 net.cpp:746] Copying source layer proposal_cls_prob I1108 13:24:31.934445 3340 net.cpp:42] Initializing net from parameters: name: "VGG_ILSVRC_16" input: "data" input: "rois" input_dim: 1 input_dim: 512 input_dim: 50 input_dim: 50 input_dim: 1 input_dim: 5 input_dim: 1 input_dim: 1 state { phase: TEST } layer { name: "roi_pool5" type: "ROIPooling" bottom: "data" bottom: "rois" top: "pool5" roi_pooling_param { pooled_h: 7 pooled_w: 7 spatial_scale: 0.0625 } } layer { name: "fc6" type: "InnerProduct" bottom: "pool5" top: "fc6" param { lr_mult: 1 } param { lr_mult: 2 } inner_product_param { num_output: 4096 } } layer { name: "relu6" type: "ReLU" bottom: "fc6" top: "fc6" } layer { name: "drop6" type: "Dropout" bottom: "fc6" top: "fc6" dropout_param { dropout_ratio: 0.5 } } layer { name: "fc7" type: "InnerProduct" bottom: "fc6" top: "fc7" param { lr_mult: 1 } param { lr_mult: 2 } inner_product_param { num_output: 
4096 } } layer { name: "relu7" type: "ReLU" bottom: "fc7" top: "fc7" } layer { name: "drop7" type: "Dropout" bottom: "fc7" top: "fc7" dropout_param { dropout_ratio: 0.5 } } layer { name: "cls_score" type: "InnerProduct" bottom: "fc7" top: "cls_score" param { lr_mult: 1 } param { lr_mult: 2 } inner_product_param { num_output: 21 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0 } } } layer { name: "bbox_pred" type: "InnerProduct" bottom: "fc7" top: "bbox_pred" param { lr_mult: 1 } param { lr_mult: 2 } inner_product_param { num_output: 84 weight_filler { type: "gaussian" std: 0.001 } bias_filler { type: "constant" value: 0 } } } layer { name: "cls_prob" type: "Softmax" bottom: "cls_score" top: "cls_prob" loss_weight: 1 } I1108 13:24:31.934500 3340 net.cpp:380] Input 0 -> data I1108 13:24:31.934512 3340 net.cpp:380] Input 1 -> rois I1108 13:24:31.934521 3340 layer_factory.hpp:74] Creating layer roi_pool5 I1108 13:24:31.934944 3340 net.cpp:90] Creating Layer roi_pool5 I1108 13:24:31.934958 3340 net.cpp:420] roi_pool5 <- data I1108 13:24:31.934967 3340 net.cpp:420] roi_pool5 <- rois I1108 13:24:31.934975 3340 net.cpp:378] roi_pool5 -> pool5 I1108 13:24:31.934985 3340 net.cpp:120] Setting up roi_pool5 I1108 13:24:31.934994 3340 roi_pooling_layer.cpp:44] Spatial scale: 0.0625 I1108 13:24:31.935014 3340 net.cpp:127] Top shape: 1 512 7 7 (25088) I1108 13:24:31.935019 3340 layer_factory.hpp:74] Creating layer fc6 I1108 13:24:31.935026 3340 net.cpp:90] Creating Layer fc6 I1108 13:24:31.935030 3340 net.cpp:420] fc6 <- pool5 I1108 13:24:31.935035 3340 net.cpp:378] fc6 -> fc6 I1108 13:24:31.935042 3340 net.cpp:120] Setting up fc6 I1108 13:24:32.092841 3340 net.cpp:127] Top shape: 1 4096 (4096) I1108 13:24:32.092890 3340 layer_factory.hpp:74] Creating layer relu6 I1108 13:24:32.092901 3340 net.cpp:90] Creating Layer relu6 I1108 13:24:32.092907 3340 net.cpp:420] relu6 <- fc6 I1108 13:24:32.092923 3340 net.cpp:367] relu6 -> fc6 (in-place) I1108 
13:24:32.092931 3340 net.cpp:120] Setting up relu6 I1108 13:24:32.093122 3340 net.cpp:127] Top shape: 1 4096 (4096) I1108 13:24:32.093128 3340 layer_factory.hpp:74] Creating layer drop6 I1108 13:24:32.093160 3340 net.cpp:90] Creating Layer drop6 I1108 13:24:32.093165 3340 net.cpp:420] drop6 <- fc6 I1108 13:24:32.093171 3340 net.cpp:367] drop6 -> fc6 (in-place) I1108 13:24:32.093178 3340 net.cpp:120] Setting up drop6 I1108 13:24:32.093190 3340 net.cpp:127] Top shape: 1 4096 (4096) I1108 13:24:32.093194 3340 layer_factory.hpp:74] Creating layer fc7 I1108 13:24:32.093206 3340 net.cpp:90] Creating Layer fc7 I1108 13:24:32.093210 3340 net.cpp:420] fc7 <- fc6 I1108 13:24:32.093217 3340 net.cpp:378] fc7 -> fc7 I1108 13:24:32.093225 3340 net.cpp:120] Setting up fc7 I1108 13:24:32.116015 3340 net.cpp:127] Top shape: 1 4096 (4096) I1108 13:24:32.116042 3340 layer_factory.hpp:74] Creating layer relu7 I1108 13:24:32.116052 3340 net.cpp:90] Creating Layer relu7 I1108 13:24:32.116057 3340 net.cpp:420] relu7 <- fc7 I1108 13:24:32.116063 3340 net.cpp:367] relu7 -> fc7 (in-place) I1108 13:24:32.116071 3340 net.cpp:120] Setting up relu7 I1108 13:24:32.116168 3340 net.cpp:127] Top shape: 1 4096 (4096) I1108 13:24:32.116174 3340 layer_factory.hpp:74] Creating layer drop7 I1108 13:24:32.116181 3340 net.cpp:90] Creating Layer drop7 I1108 13:24:32.116185 3340 net.cpp:420] drop7 <- fc7 I1108 13:24:32.116190 3340 net.cpp:367] drop7 -> fc7 (in-place) I1108 13:24:32.116195 3340 net.cpp:120] Setting up drop7 I1108 13:24:32.116202 3340 net.cpp:127] Top shape: 1 4096 (4096) I1108 13:24:32.116206 3340 layer_factory.hpp:74] Creating layer fc7_drop7_0_split I1108 13:24:32.116212 3340 net.cpp:90] Creating Layer fc7_drop7_0_split I1108 13:24:32.116215 3340 net.cpp:420] fc7_drop7_0_split <- fc7 I1108 13:24:32.116220 3340 net.cpp:378] fc7_drop7_0_split -> fc7_drop7_0_split_0 I1108 13:24:32.116227 3340 net.cpp:378] fc7_drop7_0_split -> fc7_drop7_0_split_1 I1108 13:24:32.116233 3340 net.cpp:120] Setting 
up fc7_drop7_0_split I1108 13:24:32.116240 3340 net.cpp:127] Top shape: 1 4096 (4096) I1108 13:24:32.116245 3340 net.cpp:127] Top shape: 1 4096 (4096) I1108 13:24:32.116248 3340 layer_factory.hpp:74] Creating layer cls_score I1108 13:24:32.116257 3340 net.cpp:90] Creating Layer cls_score I1108 13:24:32.116261 3340 net.cpp:420] cls_score <- fc7_drop7_0_split_0 I1108 13:24:32.116267 3340 net.cpp:378] cls_score -> cls_score I1108 13:24:32.116273 3340 net.cpp:120] Setting up cls_score I1108 13:24:32.117563 3340 net.cpp:127] Top shape: 1 21 (21) I1108 13:24:32.117573 3340 layer_factory.hpp:74] Creating layer bbox_pred I1108 13:24:32.117579 3340 net.cpp:90] Creating Layer bbox_pred I1108 13:24:32.117584 3340 net.cpp:420] bbox_pred <- fc7_drop7_0_split_1 I1108 13:24:32.117590 3340 net.cpp:378] bbox_pred -> bbox_pred I1108 13:24:32.117597 3340 net.cpp:120] Setting up bbox_pred I1108 13:24:32.122704 3340 net.cpp:127] Top shape: 1 84 (84) I1108 13:24:32.122714 3340 layer_factory.hpp:74] Creating layer cls_prob I1108 13:24:32.122721 3340 net.cpp:90] Creating Layer cls_prob I1108 13:24:32.122726 3340 net.cpp:420] cls_prob <- cls_score I1108 13:24:32.122731 3340 net.cpp:378] cls_prob -> cls_prob I1108 13:24:32.122737 3340 net.cpp:120] Setting up cls_prob I1108 13:24:32.122993 3340 net.cpp:127] Top shape: 1 21 (21) I1108 13:24:32.123002 3340 net.cpp:129] with loss weight 1 I1108 13:24:32.123014 3340 net.cpp:192] cls_prob needs backward computation. I1108 13:24:32.123019 3340 net.cpp:194] bbox_pred does not need backward computation. I1108 13:24:32.123023 3340 net.cpp:192] cls_score needs backward computation. I1108 13:24:32.123026 3340 net.cpp:192] fc7_drop7_0_split n

northeastsquare commented 8 years ago

@ShaoqingRen And the MATLAB details. MATLAB crash file: /home/silva/matlab_crash_dump.3291-1:


   Segmentation violation detected at Sun Nov  8 13:24:40 2015

Configuration: Crash Decoding : Disabled Current Visual : 0x21 (class 4, depth 24) Default Encoding : UTF-8 GNU C Library : 2.19 stable MATLAB Architecture: glnxa64 MATLAB Root : /usr/local/MATLAB/R2014a MATLAB Version : 8.3.0.532 (R2014a) Operating System : Linux 3.13.0-55-generic #94-Ubuntu SMP Thu Jun 18 00:27:10 UTC 2015 x86_64 Processor ID : x86 Family 6 Model 42 Stepping 7, GenuineIntel Virtual Machine : Java 1.7.0_11-b21 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode Window System : The X.Org Foundation (11501000), display :0

Fault Count: 1

Abnormal termination: Segmentation violation

Register State (from fault): RAX = ffff80f23b811c70 RBX = 00007f15fcdeed30 RCX = 0000000000afc800 RDX = 000000000057e400 RSP = 00007f16dde05058 RBP = 00007f16dde050b0 RSI = 0000000808fc0000 RDI = 00007f15cd22ff90

R8 = 0000000000000000 R9 = 0000000001250000 R10 = 00007f16d0000078 R11 = 00000000d0000001 R12 = 00007f15f8a21870 R13 = 0000000808fc0000 R14 = 00007f15fa5eacd0 R15 = 0000000000000000

RIP = 00007f16f02cfff0 EFL = 0000000000010206

CS = 0033 FS = 0000 GS = 0000

Stack Trace (from fault): [ 0] 0x00007f16f02cfff0 /lib/x86_64-linux-gnu/libc.so.6+00622576 [ 1] 0x00007f161c51c094 /home/silva/cxxSrc/fasterrcnn/external/caffe/matlab/+caffe/private/caffe.mexa64+00286868 [ 2] 0x00007f161c51c170 /home/silva/cxxSrc/fasterrcnn/external/caffe/matlab/+caffe/private/caffe.mexa64+00287088 [ 3] 0x00007f161c51c513 /home/silva/cxxSrc/fasterrcnn/external/caffe/matlab/+caffe/private/caffe.mexa64+00288019 mexFunction+00000168 [ 4] 0x00007f16e816772a /usr/local/MATLAB/R2014a/bin/glnxa64/libmex.so+00120618 mexRunMexFile+00000090 [ 5] 0x00007f16e8163a94 /usr/local/MATLAB/R2014a/bin/glnxa64/libmex.so+00105108 [ 6] 0x00007f16e8164fb4 /usr/local/MATLAB/R2014a/bin/glnxa64/libmex.so+00110516 [ 7] 0x00007f16e755ead9 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 8] 0x00007f16e663520e /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 9] 0x00007f16e65f01d0 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02318800 [ 10] 0x00007f16e65f21ea /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02327018 [ 11] 0x00007f16e65f5167 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 12] 0x00007f16e65f326f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 13] 0x00007f16e65f3ec4 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 14] 0x00007f16e665130b /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 15] 0x00007f16e755ead9 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 16] 0x00007f16e71966e8 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01566440 [ 17] 0x00007f16e7140a02 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01214978 [ 18] 0x00007f16e714238e /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01221518 [ 19] 0x00007f16e7144e50 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01232464 [ 20] 
0x00007f16e714273d /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01222461 [ 21] 0x00007f16e72041f0 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+02015728 [ 22] 0x00007f16e750d874 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_dispatcher.so+00338036 _ZN13Mfh_MATLAB_fn11dispatch_fhEiPP11mxArraytagiS2+00000244 [ 23] 0x00007f16e7205031 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+02019377 [ 24] 0x00007f16e6634e4e /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02600526 [ 25] 0x00007f16e6648524 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02680100 [ 26] 0x00007f16e66489f0 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02681328 [ 27] 0x00007f16e66499d6 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02685398 [ 28] 0x00007f16e65db046 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02232390 [ 29] 0x00007f16e6648fb5 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02682805 [ 30] 0x00007f16e66499d6 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02685398 [ 31] 0x00007f16e65db046 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02232390 [ 32] 0x00007f16e65d5a15 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02210325 [ 33] 0x00007f16e65f1233 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02322995 [ 34] 0x00007f16e65f5167 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 35] 0x00007f16e65f326f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 36] 0x00007f16e65f3ec4 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 37] 0x00007f16e665130b /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 38] 0x00007f16e755ead9 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 39] 0x00007f16e71966e8 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01566440 [ 40] 0x00007f16e7140a02 
/usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01214978 [ 41] 0x00007f16e714238e /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01221518 [ 42] 0x00007f16e7144e50 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01232464 [ 43] 0x00007f16e714273d /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+01222461 [ 44] 0x00007f16e72041f0 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+02015728 [ 45] 0x00007f16e750d874 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_dispatcher.so+00338036 _ZN13Mfh_MATLAB_fn11dispatch_fhEiPP11mxArraytagiS2+00000244 [ 46] 0x00007f16e7205031 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcos.so+02019377 [ 47] 0x00007f16e6634e4e /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02600526 [ 48] 0x00007f16e6648524 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02680100 [ 49] 0x00007f16e66489f0 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02681328 [ 50] 0x00007f16e66499d6 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02685398 [ 51] 0x00007f16e65db046 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02232390 [ 52] 0x00007f16e65d5a15 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02210325 [ 53] 0x00007f16e65f1233 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02322995 [ 54] 0x00007f16e65f5167 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 55] 0x00007f16e65f326f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 56] 0x00007f16e65f3ec4 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 57] 0x00007f16e665130b /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 58] 0x00007f16e755ead9 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 59] 0x00007f16e663520e /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 60] 0x00007f16e65f01d0 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02318800 [ 61] 
0x00007f16e65f21ea /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02327018 [ 62] 0x00007f16e65f5167 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 63] 0x00007f16e65f326f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 64] 0x00007f16e65f3ec4 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 65] 0x00007f16e665130b /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 66] 0x00007f16e755ead9 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_dispatcher.so+00670425 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00000697 [ 67] 0x00007f16e663520e /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02601486 [ 68] 0x00007f16e65d61b0 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02212272 [ 69] 0x00007f16e65f125f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02323039 [ 70] 0x00007f16e65f5167 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02339175 [ 71] 0x00007f16e65f326f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02331247 [ 72] 0x00007f16e65f3ec4 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02334404 [ 73] 0x00007f16e665130b /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02716427 [ 74] 0x00007f16e755ec5f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_dispatcher.so+00670815 _ZN8Mfh_file11dispatch_fhEiPP11mxArraytagiS2+00001087 [ 75] 0x00007f16e6624135 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02531637 [ 76] 0x00007f16e65eb0d9 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02298073 [ 77] 0x00007f16e65e7dc7 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02284999 [ 78] 0x00007f16e65e8193 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwm_interpreter.so+02285971 [ 79] 0x00007f16e8391afc /usr/local/MATLAB/R2014a/bin/glnxa64/libmwbridge.so+00142076 [ 80] 0x00007f16e8392791 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwbridge.so+00145297 _Z8mnParserv+00000721 [ 81] 0x00007f16f167a92f 
/usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcr.so+00489775 _ZN11mcrInstance30mnParser_on_interpreter_threadEv+00000031 [ 82] 0x00007f16f165bb6d /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcr.so+00363373 [ 83] 0x00007f16f165bbe9 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcr.so+00363497 [ 84] 0x00007f16e5d1dd46 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwuix.so+00343366 [ 85] 0x00007f16e5d00382 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwuix.so+00222082 [ 86] 0x00007f16f1dd050f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwservices.so+02323727 [ 87] 0x00007f16f1dd067c /usr/local/MATLAB/R2014a/bin/glnxa64/libmwservices.so+02324092 [ 88] 0x00007f16f1dcc57f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwservices.so+02307455 [ 89] 0x00007f16f1dd19b5 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwservices.so+02329013 [ 90] 0x00007f16f1dd1de7 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwservices.so+02330087 [ 91] 0x00007f16f1dd24c0 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwservices.so+02331840 _Z25svWS_ProcessPendingEventsiib+00000080 [ 92] 0x00007f16f165c098 /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcr.so+00364696 [ 93] 0x00007f16f165c3bf /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcr.so+00365503 [ 94] 0x00007f16f165728f /usr/local/MATLAB/R2014a/bin/glnxa64/libmwmcr.so+00344719 [ 95] 0x00007f16f0605182 /lib/x86_64-linux-gnu/libpthread.so.0+00033154 [ 96] 0x00007f16f033247d /lib/x86_64-linux-gnu/libc.so.6+01025149 clone+00000109

This error was detected while a MEX-file was running. If the MEX-file is not an official MathWorks function, please examine its source code for errors. Please consult the External Interfaces Guide for information on debugging MEX-files.

If this problem is reproducible, please submit a Service Request via: http://www.mathworks.com/support/contact_us/

A technical support engineer might contact you with further information.

Thank you for your help.

northeastsquare commented 8 years ago

If I use CPU mode, everything goes well.

northeastsquare commented 8 years ago

Commenting out this line in Makefile.config solved my problem: USE_CUDNN := 1. But why?
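For anyone hitting the same abort, the fix above is one edit plus a rebuild. A sketch follows (the rebuild targets are the stock Caffe ones; adjust paths to your tree). The commands are demonstrated on a throwaway copy of the config so the snippet is safe to run anywhere:

```shell
# Disable cuDNN by commenting the USE_CUDNN flag in external/caffe/Makefile.config.
# Shown here on a scratch copy so this snippet has no side effects:
tmp=$(mktemp -d)
printf 'USE_CUDNN := 1\nCPU_ONLY := 0\n' > "$tmp/Makefile.config"

sed -i 's/^USE_CUDNN := 1/# USE_CUDNN := 1/' "$tmp/Makefile.config"
cat "$tmp/Makefile.config"   # USE_CUDNN line is now commented out

# After editing the real Makefile.config, rebuild Caffe and the MATLAB wrapper:
#   make clean && make -j8 && make matcaffe
```

One plausible explanation for why this helps: the mex file was built against a cuDNN version that does not match the installed CUDA driver/runtime, and the mismatch surfaces as a glog FATAL followed by abort() inside caffe.mexa64, which is exactly the stack trace above.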

sepidehhosseinzadeh commented 8 years ago

I also ran into the error: Invalid MEX-file 'fasterrcnn/experiments/external/caffe/matlab/+caffe/private/caffe.mexa64': libcudart.so.6.5: cannot open shared object file: No such file or directory

Has anyone solved this problem? @euwern @ShaoqingRen
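The libcudart.so.6.5 error means the dynamic loader cannot find the CUDA 6.5 runtime that caffe.mexa64 was linked against. A quick diagnostic sketch (the /usr/local/cuda-6.5 prefix is an assumption; substitute your install location):

```shell
# 1) Check whether libcudart is visible to the dynamic loader at all:
ldconfig -p 2>/dev/null | grep libcudart || echo "libcudart not in the loader cache"

# 2) If CUDA is installed but off the search path, export it before
#    starting MATLAB (assumed install prefix /usr/local/cuda-6.5):
export LD_LIBRARY_PATH=/usr/local/cuda-6.5/lib64:$LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"

# 3) ldd on the mex file lists any shared objects that still fail to resolve:
#    ldd external/caffe/matlab/+caffe/private/caffe.mexa64 | grep 'not found'
```

If CUDA 6.5 is not installed at all, the alternatives are installing it or rebuilding the mex file against the CUDA version you do have.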

sepidehhosseinzadeh commented 8 years ago

@euwern Did you solve your problem? I'd appreciate your help.

euwern commented 8 years ago

No, I did not. I just switched to the Python version of Faster R-CNN, and it works fine for me: https://github.com/rbgirshick/py-faster-rcnn

Note: py-faster-rcnn is working for me on Ubuntu. Remember to log in to your terminal with ssh -X when trying out the demo.

sepidehhosseinzadeh commented 8 years ago

@euwern Could you explain what steps I need to take to get the Python version running? Thanks

a980410 commented 8 years ago

I also ran into this problem: Invalid MEX-file. The system is Windows 7 x64 with MATLAB 2014a, compiled with VC++ 2013.


Invalid MEX-file 'E:\FLL\faster_rcnn-master\external\caffe\matlab\caffe_fasterrcnn+caffe\private\caffe.mexw64': The specified module could not be found.

Error in caffe.get_net (line 27) hNet = caffe('get_net', model_file, phase_name);

Error in caffe.Net (line 33) self = caffe.get_net(varargin{:});

Error in script_faster_rcnn_demo (line 34)

rpn_net = caffe.Net(proposal_detection_model.proposal_net_def, 'test');

caffe_.mexw64 is right there in the corresponding directory. It ran the first time I tested it, but when I ran it again a week later it threw this error.

ectg commented 7 years ago

I get the following error when I run script_faster_rcnn_demo.m:

Undefined variable "caffe" or class "caffe.Net". Error in script_faster_rcnn_demo (line 34) rpn_net = caffe.Net(proposal_detection_model.proposal_net_def, 'test');

This is a CPU_ONLY build on Windows. I built the Matlab wrapper based on this repo: https://github.com/ShaoqingRen/caffe/tree/faster-R-CNN, and then tried to run the experiments/script_faster_rcnn_demo.m from here: https://github.com/ShaoqingRen/caffe/tree/faster-R-CNN

Can someone please help? Has this been solved yet?

ectg commented 7 years ago

I just figured out why: when I downloaded this repo, the external/caffe folder was empty, which meant functions like Net and set_mode_cpu were missing. I copied over the files from https://github.com/ShaoqingRen/caffe/tree/062f2431162165c658a42d717baf8b74918aa18e and now I can run the demo. :-)
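For anyone else with an empty external/caffe (a plain ZIP download of the repo leaves that folder empty), the fix is to pin it to the matching commit of ShaoqingRen's Caffe fork. The real commands need network access, so they are shown as comments; the runnable part below exercises the same checkout-by-hash flow on a throwaway local repo:

```shell
# The real fix (network required; the hash is the commit linked above):
#   git clone https://github.com/ShaoqingRen/caffe.git external/caffe
#   git -C external/caffe checkout 062f2431162165c658a42d717baf8b74918aa18e
#   cd external/caffe && make -j8 && make matcaffe
#
# Same checkout-by-hash flow on a scratch repo, runnable offline:
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "pin external/caffe"
hash=$(git -C "$repo" rev-parse HEAD)
git -C "$repo" checkout -q "$hash"      # detached HEAD at the pinned commit
git -C "$repo" log -1 --format=%s
```

Pinning to an exact commit matters here because the faster_rcnn MATLAB code and the Caffe fork must match; the tip of the fork's branch may have drifted from what the wrapper expects.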