ApolloAuto / apollo

An open autonomous driving platform
Apache License 2.0

Running yolo_camera_detector_test fails with "Segmentation fault (core dumped)". How can I fix this? #5973

Closed: YafeiWangAlice closed this issue 6 years ago

YafeiWangAlice commented 6 years ago

System information: (not filled in by the reporter)

Steps to reproduce the issue:

```
Alice@in_dev_docker:/apollo/bazel-bin/modules/perception/obstacle/camera/detector/yolo_camera_detector$ ./yolo_camera_detector_test
[NVBLAS] NVBLAS_CONFIG_FILE environment variable is set to '/usr/local/cuda'
[NVBLAS] Config parsed
[NVBLAS] CPU Blas library need to be provided
Running main() from gmock_main.cc
[==========] Running 3 tests from 1 test case.
[----------] Global test environment set-up.
[----------] 3 tests from YoloCameraDetectorTest
[ RUN      ] YoloCameraDetectorTest.model_init_test
WARNING: Logging before InitGoogleLogging() is written to STDERR
W1101 10:54:04.100378 23007 yolo_camera_detector.cc:125] YoloCameraDetector options.intrinsic is nullptr. Use default
I1101 10:54:04.317409 23007 common.cpp:177] Device id: 0
I1101 10:54:04.317451 23007 common.cpp:178] Major revision number: 6
I1101 10:54:04.317454 23007 common.cpp:179] Minor revision number: 1
I1101 10:54:04.317458 23007 common.cpp:180] Name: GeForce GTX 1080
I1101 10:54:04.317478 23007 common.cpp:181] Total global memory: 8499691520
I1101 10:54:04.317488 23007 common.cpp:182] Total shared memory per block: 49152
I1101 10:54:04.317493 23007 common.cpp:183] Total registers per block: 65536
I1101 10:54:04.317497 23007 common.cpp:184] Warp size: 32
I1101 10:54:04.317519 23007 common.cpp:185] Maximum memory pitch: 2147483647
I1101 10:54:04.317524 23007 common.cpp:186] Maximum threads per block: 1024
I1101 10:54:04.317543 23007 common.cpp:187] Maximum dimension of block: 1024, 1024, 64
I1101 10:54:04.317549 23007 common.cpp:190] Maximum dimension of grid: 2147483647, 65535, 65535
I1101 10:54:04.317569 23007 common.cpp:193] Clock rate: 1809500
I1101 10:54:04.317574 23007 common.cpp:194] Total constant memory: 65536
I1101 10:54:04.317579 23007 common.cpp:195] Texture alignment: 512
I1101 10:54:04.317584 23007 common.cpp:196] Concurrent copy and execution: Yes
I1101 10:54:04.317589 23007 common.cpp:198] Number of multiprocessors: 20
I1101 10:54:04.317593 23007 common.cpp:199] Kernel execution timeout: Yes
I1101 10:54:04.325739 23007 net.cpp:52] Initializing net from parameters: name: "darknet-16c-16x-3d" state { phase: TEST }
[The original paste continues with the full "darknet-16c-16x-3d" prototxt: a 1x384x960x3 Input layer, a Permute/Power preprocessing pair, a conv/ReLU/max-pool backbone (conv1 through conv7_2), a concat8 skip connection, the detection heads (loc_pred, obj_pred, cls_pred, ori_pred, dim_pred, lof_pred, lor_pred) and a Deconvolution lane-segmentation branch ending in seg_prob.]
I1101 10:54:04.328699 23007 net.cpp:94] Creating Layer input
I1101 10:54:04.328732 23007 net.cpp:402] input -> data
I1101 10:54:04.339604 23007 net.cpp:144] Setting up input
I1101 10:54:04.339648 23007 net.cpp:151] Top shape: 1 384 960 3 (1105920)
I1101 10:54:04.339655 23007 net.cpp:159] Memory required for data: 4423680
[The per-layer "Creating Layer ... / Setting up ... / Top shape ... / Memory required for data ..." log continues normally through conv6_5_relu (memory required: 267632640). The paste then breaks off mid-line at "Creating"; per the issue title, the process dies with "Segmentation fault (core dumped)" without printing a test result.]
```
Layer conv7_1 I1101 10:54:04.757956 23007 net.cpp:428] conv7_1 <- conv6_5 I1101 10:54:04.757964 23007 net.cpp:402] conv7_1 -> conv7_1 I1101 10:54:04.762136 23007 net.cpp:144] Setting up conv7_1 I1101 10:54:04.762167 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.762173 23007 net.cpp:159] Memory required for data: 270581760 I1101 10:54:04.762225 23007 net.cpp:94] Creating Layer conv7_1_relu I1101 10:54:04.762234 23007 net.cpp:428] conv7_1_relu <- conv7_1 I1101 10:54:04.762246 23007 net.cpp:389] conv7_1_relu -> conv7_1 (in-place) I1101 10:54:04.762527 23007 net.cpp:144] Setting up conv7_1_relu I1101 10:54:04.762539 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.762544 23007 net.cpp:159] Memory required for data: 273530880 I1101 10:54:04.762573 23007 net.cpp:94] Creating Layer conv7_2 I1101 10:54:04.762579 23007 net.cpp:428] conv7_2 <- conv7_1 I1101 10:54:04.762590 23007 net.cpp:402] conv7_2 -> conv7_2 I1101 10:54:04.766372 23007 net.cpp:144] Setting up conv7_2 I1101 10:54:04.766404 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.766410 23007 net.cpp:159] Memory required for data: 276480000 I1101 10:54:04.766448 23007 net.cpp:94] Creating Layer conv7_2_relu I1101 10:54:04.766455 23007 net.cpp:428] conv7_2_relu <- conv7_2 I1101 10:54:04.766482 23007 net.cpp:389] conv7_2_relu -> conv7_2 (in-place) I1101 10:54:04.766749 23007 net.cpp:144] Setting up conv7_2_relu I1101 10:54:04.766774 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.766779 23007 net.cpp:159] Memory required for data: 279429120 I1101 10:54:04.766849 23007 net.cpp:94] Creating Layer concat8 I1101 10:54:04.766856 23007 net.cpp:428] concat8 <- conv5_5_conv5_5_relu_0_split_1 I1101 10:54:04.766880 23007 net.cpp:428] concat8 <- conv7_2 I1101 10:54:04.766903 23007 net.cpp:402] concat8 -> concat8 I1101 10:54:04.766988 23007 net.cpp:144] Setting up concat8 I1101 10:54:04.767010 23007 net.cpp:151] Top shape: 1 768 24 60 (1105920) I1101 
10:54:04.767019 23007 net.cpp:159] Memory required for data: 283852800 I1101 10:54:04.767055 23007 net.cpp:94] Creating Layer concat8_concat8_0_split I1101 10:54:04.767061 23007 net.cpp:428] concat8_concat8_0_split <- concat8 I1101 10:54:04.767072 23007 net.cpp:402] concat8_concat8_0_split -> concat8_concat8_0_split_0 I1101 10:54:04.767096 23007 net.cpp:402] concat8_concat8_0_split -> concat8_concat8_0_split_1 I1101 10:54:04.767151 23007 net.cpp:144] Setting up concat8_concat8_0_split I1101 10:54:04.767175 23007 net.cpp:151] Top shape: 1 768 24 60 (1105920) I1101 10:54:04.767182 23007 net.cpp:151] Top shape: 1 768 24 60 (1105920) I1101 10:54:04.767187 23007 net.cpp:159] Memory required for data: 292700160 I1101 10:54:04.767215 23007 net.cpp:94] Creating Layer conv9 I1101 10:54:04.767235 23007 net.cpp:428] conv9 <- concat8_concat8_0_split_0 I1101 10:54:04.767247 23007 net.cpp:402] conv9 -> conv9 I1101 10:54:04.772697 23007 net.cpp:144] Setting up conv9 I1101 10:54:04.772728 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.772734 23007 net.cpp:159] Memory required for data: 295649280 I1101 10:54:04.772773 23007 net.cpp:94] Creating Layer conv9_relu I1101 10:54:04.772783 23007 net.cpp:428] conv9_relu <- conv9 I1101 10:54:04.772809 23007 net.cpp:389] conv9_relu -> conv9 (in-place) I1101 10:54:04.773438 23007 net.cpp:144] Setting up conv9_relu I1101 10:54:04.773452 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.773458 23007 net.cpp:159] Memory required for data: 298598400 I1101 10:54:04.773483 23007 net.cpp:94] Creating Layer conv9_conv9_relu_0_split I1101 10:54:04.773490 23007 net.cpp:428] conv9_conv9_relu_0_split <- conv9 I1101 10:54:04.773501 23007 net.cpp:402] conv9_conv9_relu_0_split -> conv9_conv9_relu_0_split_0 I1101 10:54:04.773526 23007 net.cpp:402] conv9_conv9_relu_0_split -> conv9_conv9_relu_0_split_1 I1101 10:54:04.773548 23007 net.cpp:402] conv9_conv9_relu_0_split -> conv9_conv9_relu_0_split_2 I1101 10:54:04.773572 23007 
net.cpp:402] conv9_conv9_relu_0_split -> conv9_conv9_relu_0_split_3 I1101 10:54:04.773579 23007 net.cpp:402] conv9_conv9_relu_0_split -> conv9_conv9_relu_0_split_4 I1101 10:54:04.773664 23007 net.cpp:144] Setting up conv9_conv9_relu_0_split I1101 10:54:04.773675 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.773694 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.773715 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.773721 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.773741 23007 net.cpp:151] Top shape: 1 512 24 60 (737280) I1101 10:54:04.773746 23007 net.cpp:159] Memory required for data: 313344000 I1101 10:54:04.773777 23007 net.cpp:94] Creating Layer conv_final I1101 10:54:04.773795 23007 net.cpp:428] conv_final <- conv9_conv9_relu_0_split_0 I1101 10:54:04.773805 23007 net.cpp:402] conv_final -> conv_final I1101 10:54:04.774983 23007 net.cpp:144] Setting up conv_final I1101 10:54:04.774998 23007 net.cpp:151] Top shape: 1 144 24 60 (207360) I1101 10:54:04.775003 23007 net.cpp:159] Memory required for data: 314173440 I1101 10:54:04.775034 23007 net.cpp:94] Creating Layer conv_final_permute I1101 10:54:04.775041 23007 net.cpp:428] conv_final_permute <- conv_final I1101 10:54:04.775065 23007 net.cpp:402] conv_final_permute -> conv_final_permute I1101 10:54:04.775240 23007 net.cpp:144] Setting up conv_final_permute I1101 10:54:04.775250 23007 net.cpp:151] Top shape: 1 24 60 144 (207360) I1101 10:54:04.775255 23007 net.cpp:159] Memory required for data: 315002880 I1101 10:54:04.775310 23007 net.cpp:94] Creating Layer slice I1101 10:54:04.775331 23007 net.cpp:428] slice <- conv_final_permute I1101 10:54:04.775357 23007 net.cpp:402] slice -> loc_pred I1101 10:54:04.775375 23007 net.cpp:402] slice -> obj_perm I1101 10:54:04.775388 23007 net.cpp:402] slice -> cls_perm I1101 10:54:04.775482 23007 net.cpp:144] Setting up slice I1101 10:54:04.775492 23007 net.cpp:151] Top shape: 1 24 60 64 (92160) 
I1101 10:54:04.775512 23007 net.cpp:151] Top shape: 1 24 60 16 (23040) I1101 10:54:04.775522 23007 net.cpp:151] Top shape: 1 24 60 64 (92160) I1101 10:54:04.775527 23007 net.cpp:159] Memory required for data: 315832320 I1101 10:54:04.775549 23007 net.cpp:94] Creating Layer cls_reshape I1101 10:54:04.775555 23007 net.cpp:428] cls_reshape <- cls_perm I1101 10:54:04.775565 23007 net.cpp:402] cls_reshape -> cls_reshape I1101 10:54:04.775605 23007 net.cpp:144] Setting up cls_reshape I1101 10:54:04.775616 23007 net.cpp:151] Top shape: 1 24 960 4 (92160) I1101 10:54:04.775621 23007 net.cpp:159] Memory required for data: 316200960 I1101 10:54:04.775647 23007 net.cpp:94] Creating Layer cls_pred_prob I1101 10:54:04.775653 23007 net.cpp:428] cls_pred_prob <- cls_reshape I1101 10:54:04.775676 23007 net.cpp:402] cls_pred_prob -> cls_pred_prob I1101 10:54:04.775907 23007 net.cpp:144] Setting up cls_pred_prob I1101 10:54:04.775920 23007 net.cpp:151] Top shape: 1 24 960 4 (92160) I1101 10:54:04.775926 23007 net.cpp:159] Memory required for data: 316569600 I1101 10:54:04.775935 23007 net.cpp:94] Creating Layer cls_pred I1101 10:54:04.775941 23007 net.cpp:428] cls_pred <- cls_pred_prob I1101 10:54:04.775951 23007 net.cpp:402] cls_pred -> cls_pred I1101 10:54:04.775975 23007 net.cpp:144] Setting up cls_pred I1101 10:54:04.775985 23007 net.cpp:151] Top shape: 1 24 60 64 (92160) I1101 10:54:04.775995 23007 net.cpp:159] Memory required for data: 316938240 I1101 10:54:04.776010 23007 net.cpp:94] Creating Layer obj_pred I1101 10:54:04.776016 23007 net.cpp:428] obj_pred <- obj_perm I1101 10:54:04.776026 23007 net.cpp:402] obj_pred -> obj_pred I1101 10:54:04.776211 23007 net.cpp:144] Setting up obj_pred I1101 10:54:04.776222 23007 net.cpp:151] Top shape: 1 24 60 16 (23040) I1101 10:54:04.776228 23007 net.cpp:159] Memory required for data: 317030400 I1101 10:54:04.776240 23007 net.cpp:94] Creating Layer ori_origin I1101 10:54:04.776247 23007 net.cpp:428] ori_origin <- 
conv9_conv9_relu_0_split_1 I1101 10:54:04.776258 23007 net.cpp:402] ori_origin -> ori_origin I1101 10:54:04.777323 23007 net.cpp:144] Setting up ori_origin I1101 10:54:04.777338 23007 net.cpp:151] Top shape: 1 32 24 60 (46080) I1101 10:54:04.777343 23007 net.cpp:159] Memory required for data: 317214720 I1101 10:54:04.777359 23007 net.cpp:94] Creating Layer ori_pred I1101 10:54:04.777366 23007 net.cpp:428] ori_pred <- ori_origin I1101 10:54:04.777375 23007 net.cpp:402] ori_pred -> ori_pred I1101 10:54:04.777468 23007 net.cpp:144] Setting up ori_pred I1101 10:54:04.777478 23007 net.cpp:151] Top shape: 1 24 60 32 (46080) I1101 10:54:04.777484 23007 net.cpp:159] Memory required for data: 317399040 I1101 10:54:04.777494 23007 net.cpp:94] Creating Layer dim_origin I1101 10:54:04.777500 23007 net.cpp:428] dim_origin <- conv9_conv9_relu_0_split_2 I1101 10:54:04.777510 23007 net.cpp:402] dim_origin -> dim_origin I1101 10:54:04.778553 23007 net.cpp:144] Setting up dim_origin I1101 10:54:04.778570 23007 net.cpp:151] Top shape: 1 48 24 60 (69120) I1101 10:54:04.778575 23007 net.cpp:159] Memory required for data: 317675520 I1101 10:54:04.778589 23007 net.cpp:94] Creating Layer dim_pred I1101 10:54:04.778596 23007 net.cpp:428] dim_pred <- dim_origin I1101 10:54:04.778606 23007 net.cpp:402] dim_pred -> dim_pred I1101 10:54:04.778712 23007 net.cpp:144] Setting up dim_pred I1101 10:54:04.778735 23007 net.cpp:151] Top shape: 1 24 60 48 (69120) I1101 10:54:04.778741 23007 net.cpp:159] Memory required for data: 317952000 I1101 10:54:04.778753 23007 net.cpp:94] Creating Layer lof_origin I1101 10:54:04.778759 23007 net.cpp:428] lof_origin <- conv9_conv9_relu_0_split_3 I1101 10:54:04.778767 23007 net.cpp:402] lof_origin -> lof_origin I1101 10:54:04.779851 23007 net.cpp:144] Setting up lof_origin I1101 10:54:04.779867 23007 net.cpp:151] Top shape: 1 64 24 60 (92160) I1101 10:54:04.779873 23007 net.cpp:159] Memory required for data: 318320640 I1101 10:54:04.779888 23007 net.cpp:94] 
Creating Layer lof_perm I1101 10:54:04.779896 23007 net.cpp:428] lof_perm <- lof_origin I1101 10:54:04.779906 23007 net.cpp:402] lof_perm -> lof_pred I1101 10:54:04.779999 23007 net.cpp:144] Setting up lof_perm I1101 10:54:04.780009 23007 net.cpp:151] Top shape: 1 24 60 64 (92160) I1101 10:54:04.780014 23007 net.cpp:159] Memory required for data: 318689280 I1101 10:54:04.780025 23007 net.cpp:94] Creating Layer lor_origin I1101 10:54:04.780031 23007 net.cpp:428] lor_origin <- conv9_conv9_relu_0_split_4 I1101 10:54:04.780041 23007 net.cpp:402] lor_origin -> lor_origin I1101 10:54:04.781088 23007 net.cpp:144] Setting up lor_origin I1101 10:54:04.781103 23007 net.cpp:151] Top shape: 1 64 24 60 (92160) I1101 10:54:04.781110 23007 net.cpp:159] Memory required for data: 319057920 I1101 10:54:04.781124 23007 net.cpp:94] Creating Layer lor_perm I1101 10:54:04.781131 23007 net.cpp:428] lor_perm <- lor_origin I1101 10:54:04.781141 23007 net.cpp:402] lor_perm -> lor_pred I1101 10:54:04.781262 23007 net.cpp:144] Setting up lor_perm I1101 10:54:04.781272 23007 net.cpp:151] Top shape: 1 24 60 64 (92160) I1101 10:54:04.781277 23007 net.cpp:159] Memory required for data: 319426560 I1101 10:54:04.781288 23007 net.cpp:94] Creating Layer reduce1_lane I1101 10:54:04.781294 23007 net.cpp:428] reduce1_lane <- concat8_concat8_0_split_1 I1101 10:54:04.781304 23007 net.cpp:402] reduce1_lane -> reduce1_lane I1101 10:54:04.783798 23007 net.cpp:144] Setting up reduce1_lane I1101 10:54:04.783823 23007 net.cpp:151] Top shape: 1 128 24 60 (184320) I1101 10:54:04.783829 23007 net.cpp:159] Memory required for data: 320163840 I1101 10:54:04.783848 23007 net.cpp:94] Creating Layer reduce1_lane_relu I1101 10:54:04.783855 23007 net.cpp:428] reduce1_lane_relu <- reduce1_lane I1101 10:54:04.783866 23007 net.cpp:389] reduce1_lane_relu -> reduce1_lane (in-place) I1101 10:54:04.784371 23007 net.cpp:144] Setting up reduce1_lane_relu I1101 10:54:04.784385 23007 net.cpp:151] Top shape: 1 128 24 60 (184320) 
I1101 10:54:04.784391 23007 net.cpp:159] Memory required for data: 320901120 I1101 10:54:04.784407 23007 net.cpp:94] Creating Layer deconv1_lane I1101 10:54:04.784415 23007 net.cpp:428] deconv1_lane <- reduce1_lane I1101 10:54:04.784426 23007 net.cpp:402] deconv1_lane -> deconv1_lane I1101 10:54:04.784678 23007 net.cpp:144] Setting up deconv1_lane I1101 10:54:04.784692 23007 net.cpp:151] Top shape: 1 64 48 120 (368640) I1101 10:54:04.784696 23007 net.cpp:159] Memory required for data: 322375680 I1101 10:54:04.784723 23007 net.cpp:94] Creating Layer deconv1_lane_relu I1101 10:54:04.784729 23007 net.cpp:428] deconv1_lane_relu <- deconv1_lane I1101 10:54:04.784739 23007 net.cpp:389] deconv1_lane_relu -> deconv1_lane (in-place) I1101 10:54:04.784915 23007 net.cpp:144] Setting up deconv1_lane_relu I1101 10:54:04.784926 23007 net.cpp:151] Top shape: 1 64 48 120 (368640) I1101 10:54:04.784932 23007 net.cpp:159] Memory required for data: 323850240 I1101 10:54:04.784943 23007 net.cpp:94] Creating Layer reorg4 I1101 10:54:04.784950 23007 net.cpp:428] reorg4 <- conv4_3_conv4_3_relu_0_split_1 I1101 10:54:04.784961 23007 net.cpp:402] reorg4 -> reorg4 I1101 10:54:04.786300 23007 net.cpp:144] Setting up reorg4 I1101 10:54:04.786316 23007 net.cpp:151] Top shape: 1 64 48 120 (368640) I1101 10:54:04.786324 23007 net.cpp:159] Memory required for data: 325324800 I1101 10:54:04.786335 23007 net.cpp:94] Creating Layer reorg4_relu I1101 10:54:04.786342 23007 net.cpp:428] reorg4_relu <- reorg4 I1101 10:54:04.786365 23007 net.cpp:389] reorg4_relu -> reorg4 (in-place) I1101 10:54:04.786559 23007 net.cpp:144] Setting up reorg4_relu I1101 10:54:04.786571 23007 net.cpp:151] Top shape: 1 64 48 120 (368640) I1101 10:54:04.786577 23007 net.cpp:159] Memory required for data: 326799360 I1101 10:54:04.786586 23007 net.cpp:94] Creating Layer concat4 I1101 10:54:04.786592 23007 net.cpp:428] concat4 <- reorg4 I1101 10:54:04.786602 23007 net.cpp:428] concat4 <- deconv1_lane I1101 10:54:04.786612 23007 
net.cpp:402] concat4 -> concat4 I1101 10:54:04.786667 23007 net.cpp:144] Setting up concat4 I1101 10:54:04.786679 23007 net.cpp:151] Top shape: 1 128 48 120 (737280) I1101 10:54:04.786685 23007 net.cpp:159] Memory required for data: 329748480 I1101 10:54:04.786708 23007 net.cpp:94] Creating Layer reduce2_lane I1101 10:54:04.786715 23007 net.cpp:428] reduce2_lane <- concat4 I1101 10:54:04.786723 23007 net.cpp:402] reduce2_lane -> reduce2_lane I1101 10:54:04.788107 23007 net.cpp:144] Setting up reduce2_lane I1101 10:54:04.788123 23007 net.cpp:151] Top shape: 1 64 48 120 (368640) I1101 10:54:04.788130 23007 net.cpp:159] Memory required for data: 331223040 I1101 10:54:04.788143 23007 net.cpp:94] Creating Layer reduce2_lane_relu I1101 10:54:04.788151 23007 net.cpp:428] reduce2_lane_relu <- reduce2_lane I1101 10:54:04.788159 23007 net.cpp:389] reduce2_lane_relu -> reduce2_lane (in-place) I1101 10:54:04.788637 23007 net.cpp:144] Setting up reduce2_lane_relu I1101 10:54:04.788651 23007 net.cpp:151] Top shape: 1 64 48 120 (368640) I1101 10:54:04.788671 23007 net.cpp:159] Memory required for data: 332697600 I1101 10:54:04.788682 23007 net.cpp:94] Creating Layer deconv2_lane I1101 10:54:04.788689 23007 net.cpp:428] deconv2_lane <- reduce2_lane I1101 10:54:04.788700 23007 net.cpp:402] deconv2_lane -> deconv2_lane I1101 10:54:04.788946 23007 net.cpp:144] Setting up deconv2_lane I1101 10:54:04.788959 23007 net.cpp:151] Top shape: 1 32 96 240 (737280) I1101 10:54:04.788964 23007 net.cpp:159] Memory required for data: 335646720 I1101 10:54:04.788976 23007 net.cpp:94] Creating Layer deconv2_lane_relu I1101 10:54:04.788982 23007 net.cpp:428] deconv2_lane_relu <- deconv2_lane I1101 10:54:04.788991 23007 net.cpp:389] deconv2_lane_relu -> deconv2_lane (in-place) I1101 10:54:04.789494 23007 net.cpp:144] Setting up deconv2_lane_relu I1101 10:54:04.789507 23007 net.cpp:151] Top shape: 1 32 96 240 (737280) I1101 10:54:04.789512 23007 net.cpp:159] Memory required for data: 338595840 I1101 
10:54:04.789525 23007 net.cpp:94] Creating Layer reorg3 I1101 10:54:04.789532 23007 net.cpp:428] reorg3 <- conv3_3_conv3_3_relu_0_split_1 I1101 10:54:04.789542 23007 net.cpp:402] reorg3 -> reorg3 I1101 10:54:04.790568 23007 net.cpp:144] Setting up reorg3 I1101 10:54:04.790585 23007 net.cpp:151] Top shape: 1 32 96 240 (737280) I1101 10:54:04.790591 23007 net.cpp:159] Memory required for data: 341544960 I1101 10:54:04.790603 23007 net.cpp:94] Creating Layer reorg3_relu I1101 10:54:04.790611 23007 net.cpp:428] reorg3_relu <- reorg3 I1101 10:54:04.790619 23007 net.cpp:389] reorg3_relu -> reorg3 (in-place) I1101 10:54:04.790804 23007 net.cpp:144] Setting up reorg3_relu I1101 10:54:04.790817 23007 net.cpp:151] Top shape: 1 32 96 240 (737280) I1101 10:54:04.790822 23007 net.cpp:159] Memory required for data: 344494080 I1101 10:54:04.790832 23007 net.cpp:94] Creating Layer concat3 I1101 10:54:04.790836 23007 net.cpp:428] concat3 <- reorg3 I1101 10:54:04.790845 23007 net.cpp:428] concat3 <- deconv2_lane I1101 10:54:04.790854 23007 net.cpp:402] concat3 -> concat3 I1101 10:54:04.790880 23007 net.cpp:144] Setting up concat3 I1101 10:54:04.790907 23007 net.cpp:151] Top shape: 1 64 96 240 (1474560) I1101 10:54:04.790913 23007 net.cpp:159] Memory required for data: 350392320 I1101 10:54:04.790925 23007 net.cpp:94] Creating Layer reduce3_lane I1101 10:54:04.790931 23007 net.cpp:428] reduce3_lane <- concat3 I1101 10:54:04.790940 23007 net.cpp:402] reduce3_lane -> reduce3_lane I1101 10:54:04.792011 23007 net.cpp:144] Setting up reduce3_lane I1101 10:54:04.792040 23007 net.cpp:151] Top shape: 1 32 96 240 (737280) I1101 10:54:04.792060 23007 net.cpp:159] Memory required for data: 353341440 I1101 10:54:04.792073 23007 net.cpp:94] Creating Layer reduce3_lane_relu I1101 10:54:04.792080 23007 net.cpp:428] reduce3_lane_relu <- reduce3_lane I1101 10:54:04.792089 23007 net.cpp:389] reduce3_lane_relu -> reduce3_lane (in-place) I1101 10:54:04.792289 23007 net.cpp:144] Setting up 
reduce3_lane_relu I1101 10:54:04.792301 23007 net.cpp:151] Top shape: 1 32 96 240 (737280) I1101 10:54:04.792306 23007 net.cpp:159] Memory required for data: 356290560 I1101 10:54:04.792330 23007 net.cpp:94] Creating Layer deconv3_lane I1101 10:54:04.792336 23007 net.cpp:428] deconv3_lane <- reduce3_lane I1101 10:54:04.792361 23007 net.cpp:402] deconv3_lane -> deconv3_lane I1101 10:54:04.793166 23007 net.cpp:144] Setting up deconv3_lane I1101 10:54:04.793182 23007 net.cpp:151] Top shape: 1 16 192 480 (1474560) I1101 10:54:04.793190 23007 net.cpp:159] Memory required for data: 362188800 I1101 10:54:04.793229 23007 net.cpp:94] Creating Layer deconv3_lane_relu I1101 10:54:04.793237 23007 net.cpp:428] deconv3_lane_relu <- deconv3_lane I1101 10:54:04.793246 23007 net.cpp:389] deconv3_lane_relu -> deconv3_lane (in-place) I1101 10:54:04.793758 23007 net.cpp:144] Setting up deconv3_lane_relu I1101 10:54:04.793773 23007 net.cpp:151] Top shape: 1 16 192 480 (1474560) I1101 10:54:04.793779 23007 net.cpp:159] Memory required for data: 368087040 I1101 10:54:04.793790 23007 net.cpp:94] Creating Layer reorg2 I1101 10:54:04.793797 23007 net.cpp:428] reorg2 <- conv2_conv2_relu_0_split_1 I1101 10:54:04.793807 23007 net.cpp:402] reorg2 -> reorg2 I1101 10:54:04.794873 23007 net.cpp:144] Setting up reorg2 I1101 10:54:04.794889 23007 net.cpp:151] Top shape: 1 16 192 480 (1474560) I1101 10:54:04.794896 23007 net.cpp:159] Memory required for data: 373985280 I1101 10:54:04.794909 23007 net.cpp:94] Creating Layer reorg2_relu I1101 10:54:04.794914 23007 net.cpp:428] reorg2_relu <- reorg2 I1101 10:54:04.794924 23007 net.cpp:389] reorg2_relu -> reorg2 (in-place) I1101 10:54:04.795125 23007 net.cpp:144] Setting up reorg2_relu I1101 10:54:04.795136 23007 net.cpp:151] Top shape: 1 16 192 480 (1474560) I1101 10:54:04.795156 23007 net.cpp:159] Memory required for data: 379883520 I1101 10:54:04.795164 23007 net.cpp:94] Creating Layer concat2 I1101 10:54:04.795171 23007 net.cpp:428] concat2 <- reorg2 
I1101 10:54:04.795179 23007 net.cpp:428] concat2 <- deconv3_lane I1101 10:54:04.795188 23007 net.cpp:402] concat2 -> concat2 I1101 10:54:04.795220 23007 net.cpp:144] Setting up concat2 I1101 10:54:04.795230 23007 net.cpp:151] Top shape: 1 32 192 480 (2949120) I1101 10:54:04.795248 23007 net.cpp:159] Memory required for data: 391680000 I1101 10:54:04.795259 23007 net.cpp:94] Creating Layer reduce4_lane I1101 10:54:04.795265 23007 net.cpp:428] reduce4_lane <- concat2 I1101 10:54:04.795275 23007 net.cpp:402] reduce4_lane -> reduce4_lane I1101 10:54:04.796396 23007 net.cpp:144] Setting up reduce4_lane I1101 10:54:04.796412 23007 net.cpp:151] Top shape: 1 16 192 480 (1474560) I1101 10:54:04.796418 23007 net.cpp:159] Memory required for data: 397578240 I1101 10:54:04.796432 23007 net.cpp:94] Creating Layer reduce4_lane_relu I1101 10:54:04.796438 23007 net.cpp:428] reduce4_lane_relu <- reduce4_lane I1101 10:54:04.796447 23007 net.cpp:389] reduce4_lane_relu -> reduce4_lane (in-place) I1101 10:54:04.796623 23007 net.cpp:144] Setting up reduce4_lane_relu I1101 10:54:04.796635 23007 net.cpp:151] Top shape: 1 16 192 480 (1474560) I1101 10:54:04.796640 23007 net.cpp:159] Memory required for data: 403476480 I1101 10:54:04.796651 23007 net.cpp:94] Creating Layer deconv4_lane I1101 10:54:04.796658 23007 net.cpp:428] deconv4_lane <- reduce4_lane I1101 10:54:04.796681 23007 net.cpp:402] deconv4_lane -> deconv4_lane I1101 10:54:04.797560 23007 net.cpp:144] Setting up deconv4_lane I1101 10:54:04.797590 23007 net.cpp:151] Top shape: 1 8 384 960 (2949120) I1101 10:54:04.797596 23007 net.cpp:159] Memory required for data: 415272960 I1101 10:54:04.797610 23007 net.cpp:94] Creating Layer deconv4_lane_relu I1101 10:54:04.797617 23007 net.cpp:428] deconv4_lane_relu <- deconv4_lane I1101 10:54:04.797626 23007 net.cpp:389] deconv4_lane_relu -> deconv4_lane (in-place) I1101 10:54:04.798153 23007 net.cpp:144] Setting up deconv4_lane_relu I1101 10:54:04.798168 23007 net.cpp:151] Top shape: 1 8 
384 960 (2949120) I1101 10:54:04.798174 23007 net.cpp:159] Memory required for data: 427069440 I1101 10:54:04.798187 23007 net.cpp:94] Creating Layer reorg1 I1101 10:54:04.798193 23007 net.cpp:428] reorg1 <- conv1_conv1_relu_0_split_1 I1101 10:54:04.798218 23007 net.cpp:402] reorg1 -> reorg1 I1101 10:54:04.799254 23007 net.cpp:144] Setting up reorg1 I1101 10:54:04.799269 23007 net.cpp:151] Top shape: 1 8 384 960 (2949120) I1101 10:54:04.799276 23007 net.cpp:159] Memory required for data: 438865920 I1101 10:54:04.799288 23007 net.cpp:94] Creating Layer reorg1_relu I1101 10:54:04.799294 23007 net.cpp:428] reorg1_relu <- reorg1 I1101 10:54:04.799304 23007 net.cpp:389] reorg1_relu -> reorg1 (in-place) I1101 10:54:04.799506 23007 net.cpp:144] Setting up reorg1_relu I1101 10:54:04.799518 23007 net.cpp:151] Top shape: 1 8 384 960 (2949120) I1101 10:54:04.799523 23007 net.cpp:159] Memory required for data: 450662400 I1101 10:54:04.799533 23007 net.cpp:94] Creating Layer concat1 I1101 10:54:04.799540 23007 net.cpp:428] concat1 <- reorg1 I1101 10:54:04.799547 23007 net.cpp:428] concat1 <- deconv4_lane I1101 10:54:04.799557 23007 net.cpp:402] concat1 -> concat1 I1101 10:54:04.799583 23007 net.cpp:144] Setting up concat1 I1101 10:54:04.799597 23007 net.cpp:151] Top shape: 1 16 384 960 (5898240) I1101 10:54:04.799602 23007 net.cpp:159] Memory required for data: 474255360 I1101 10:54:04.799614 23007 net.cpp:94] Creating Layer conv_out I1101 10:54:04.799620 23007 net.cpp:428] conv_out <- concat1 I1101 10:54:04.799629 23007 net.cpp:402] conv_out -> conv_out I1101 10:54:04.800640 23007 net.cpp:144] Setting up conv_out I1101 10:54:04.800655 23007 net.cpp:151] Top shape: 1 4 384 960 (1474560) I1101 10:54:04.800662 23007 net.cpp:159] Memory required for data: 480153600 I1101 10:54:04.800673 23007 net.cpp:94] Creating Layer seg_prob I1101 10:54:04.800681 23007 net.cpp:428] seg_prob <- conv_out I1101 10:54:04.800690 23007 net.cpp:402] seg_prob -> seg_prob I1101 10:54:04.800940 23007 
net.cpp:144] Setting up seg_prob I1101 10:54:04.800954 23007 net.cpp:151] Top shape: 1 4 384 960 (1474560) I1101 10:54:04.800961 23007 net.cpp:159] Memory required for data: 486051840 I1101 10:54:04.800966 23007 net.cpp:222] seg_prob does not need backward computation. I1101 10:54:04.800971 23007 net.cpp:222] conv_out does not need backward computation. I1101 10:54:04.800976 23007 net.cpp:222] concat1 does not need backward computation. I1101 10:54:04.800982 23007 net.cpp:222] reorg1_relu does not need backward computation. I1101 10:54:04.800987 23007 net.cpp:222] reorg1 does not need backward computation. I1101 10:54:04.800992 23007 net.cpp:222] deconv4_lane_relu does not need backward computation. I1101 10:54:04.800997 23007 net.cpp:222] deconv4_lane does not need backward computation. I1101 10:54:04.801002 23007 net.cpp:222] reduce4_lane_relu does not need backward computation. I1101 10:54:04.801007 23007 net.cpp:222] reduce4_lane does not need backward computation. I1101 10:54:04.801025 23007 net.cpp:222] concat2 does not need backward computation. I1101 10:54:04.801031 23007 net.cpp:222] reorg2_relu does not need backward computation. I1101 10:54:04.801038 23007 net.cpp:222] reorg2 does not need backward computation. I1101 10:54:04.801045 23007 net.cpp:222] deconv3_lane_relu does not need backward computation. I1101 10:54:04.801051 23007 net.cpp:222] deconv3_lane does not need backward computation. I1101 10:54:04.801056 23007 net.cpp:222] reduce3_lane_relu does not need backward computation. I1101 10:54:04.801061 23007 net.cpp:222] reduce3_lane does not need backward computation. I1101 10:54:04.801066 23007 net.cpp:222] concat3 does not need backward computation. I1101 10:54:04.801072 23007 net.cpp:222] reorg3_relu does not need backward computation. I1101 10:54:04.801077 23007 net.cpp:222] reorg3 does not need backward computation. I1101 10:54:04.801082 23007 net.cpp:222] deconv2_lane_relu does not need backward computation. 
I1101 10:54:04.801087 23007 net.cpp:222] deconv2_lane does not need backward computation. I1101 10:54:04.801093 23007 net.cpp:222] reduce2_lane_relu does not need backward computation. I1101 10:54:04.801098 23007 net.cpp:222] reduce2_lane does not need backward computation. I1101 10:54:04.801105 23007 net.cpp:222] concat4 does not need backward computation. I1101 10:54:04.801111 23007 net.cpp:222] reorg4_relu does not need backward computation. I1101 10:54:04.801116 23007 net.cpp:222] reorg4 does not need backward computation. I1101 10:54:04.801122 23007 net.cpp:222] deconv1_lane_relu does not need backward computation. I1101 10:54:04.801141 23007 net.cpp:222] deconv1_lane does not need backward computation. I1101 10:54:04.801146 23007 net.cpp:222] reduce1_lane_relu does not need backward computation. I1101 10:54:04.801152 23007 net.cpp:222] reduce1_lane does not need backward computation. I1101 10:54:04.801157 23007 net.cpp:222] lor_perm does not need backward computation. I1101 10:54:04.801163 23007 net.cpp:222] lor_origin does not need backward computation. I1101 10:54:04.801168 23007 net.cpp:222] lof_perm does not need backward computation. I1101 10:54:04.801174 23007 net.cpp:222] lof_origin does not need backward computation. I1101 10:54:04.801182 23007 net.cpp:222] dim_pred does not need backward computation. I1101 10:54:04.801187 23007 net.cpp:222] dim_origin does not need backward computation. I1101 10:54:04.801193 23007 net.cpp:222] ori_pred does not need backward computation. I1101 10:54:04.801198 23007 net.cpp:222] ori_origin does not need backward computation. I1101 10:54:04.801204 23007 net.cpp:222] obj_pred does not need backward computation. I1101 10:54:04.801209 23007 net.cpp:222] cls_pred does not need backward computation. I1101 10:54:04.801215 23007 net.cpp:222] cls_pred_prob does not need backward computation. I1101 10:54:04.801234 23007 net.cpp:222] cls_reshape does not need backward computation. 
I1101 10:54:04.801239 23007 net.cpp:222] slice does not need backward computation. I1101 10:54:04.801260 23007 net.cpp:222] conv_final_permute does not need backward computation. I1101 10:54:04.801266 23007 net.cpp:222] conv_final does not need backward computation. I1101 10:54:04.801272 23007 net.cpp:222] conv9_conv9_relu_0_split does not need backward computation. I1101 10:54:04.801278 23007 net.cpp:222] conv9_relu does not need backward computation. I1101 10:54:04.801283 23007 net.cpp:222] conv9 does not need backward computation. I1101 10:54:04.801290 23007 net.cpp:222] concat8_concat8_0_split does not need backward computation. I1101 10:54:04.801295 23007 net.cpp:222] concat8 does not need backward computation. I1101 10:54:04.801301 23007 net.cpp:222] conv7_2_relu does not need backward computation. I1101 10:54:04.801319 23007 net.cpp:222] conv7_2 does not need backward computation. I1101 10:54:04.801326 23007 net.cpp:222] conv7_1_relu does not need backward computation. I1101 10:54:04.801332 23007 net.cpp:222] conv7_1 does not need backward computation. I1101 10:54:04.801337 23007 net.cpp:222] conv6_5_relu does not need backward computation. I1101 10:54:04.801343 23007 net.cpp:222] conv6_5 does not need backward computation. I1101 10:54:04.801348 23007 net.cpp:222] conv6_4_relu does not need backward computation. I1101 10:54:04.801355 23007 net.cpp:222] conv6_4 does not need backward computation. I1101 10:54:04.801360 23007 net.cpp:222] conv6_3_relu does not need backward computation. I1101 10:54:04.801367 23007 net.cpp:222] conv6_3 does not need backward computation. I1101 10:54:04.801371 23007 net.cpp:222] conv6_2_relu does not need backward computation. I1101 10:54:04.801376 23007 net.cpp:222] conv6_2 does not need backward computation. I1101 10:54:04.801381 23007 net.cpp:222] conv6_1_relu does not need backward computation. I1101 10:54:04.801388 23007 net.cpp:222] conv6_1_nodilate does not need backward computation. 
I1101 10:54:04.801393 23007 net.cpp:222] pool5 does not need backward computation. I1101 10:54:04.801398 23007 net.cpp:222] conv5_5_conv5_5_relu_0_split does not need backward computation. I1101 10:54:04.801405 23007 net.cpp:222] conv5_5_relu does not need backward computation. I1101 10:54:04.801411 23007 net.cpp:222] conv5_5 does not need backward computation. I1101 10:54:04.801416 23007 net.cpp:222] conv5_4_relu does not need backward computation. I1101 10:54:04.801422 23007 net.cpp:222] conv5_4 does not need backward computation. I1101 10:54:04.801427 23007 net.cpp:222] conv5_3_relu does not need backward computation. I1101 10:54:04.801432 23007 net.cpp:222] conv5_3 does not need backward computation. I1101 10:54:04.801439 23007 net.cpp:222] conv5_2_relu does not need backward computation. I1101 10:54:04.801443 23007 net.cpp:222] conv5_2 does not need backward computation. I1101 10:54:04.801448 23007 net.cpp:222] conv5_1_relu does not need backward computation. I1101 10:54:04.801455 23007 net.cpp:222] conv5_1 does not need backward computation. I1101 10:54:04.801460 23007 net.cpp:222] pool4 does not need backward computation. I1101 10:54:04.801465 23007 net.cpp:222] conv4_3_conv4_3_relu_0_split does not need backward computation. I1101 10:54:04.801471 23007 net.cpp:222] conv4_3_relu does not need backward computation. I1101 10:54:04.801492 23007 net.cpp:222] conv4_3 does not need backward computation. I1101 10:54:04.801497 23007 net.cpp:222] conv4_2_relu does not need backward computation. I1101 10:54:04.801517 23007 net.cpp:222] conv4_2 does not need backward computation. I1101 10:54:04.801522 23007 net.cpp:222] conv4_1_relu does not need backward computation. I1101 10:54:04.801527 23007 net.cpp:222] conv4_1 does not need backward computation. I1101 10:54:04.801533 23007 net.cpp:222] pool3 does not need backward computation. I1101 10:54:04.801538 23007 net.cpp:222] conv3_3_conv3_3_relu_0_split does not need backward computation. 
I1101 10:54:04.801546 23007 net.cpp:222] conv3_3_relu does not need backward computation. I1101 10:54:04.801553 23007 net.cpp:222] conv3_3 does not need backward computation. I1101 10:54:04.801558 23007 net.cpp:222] conv3_2_relu does not need backward computation. I1101 10:54:04.801563 23007 net.cpp:222] conv3_2 does not need backward computation. I1101 10:54:04.801568 23007 net.cpp:222] conv3_1_relu does not need backward computation. I1101 10:54:04.801573 23007 net.cpp:222] conv3_1 does not need backward computation. I1101 10:54:04.801579 23007 net.cpp:222] pool2 does not need backward computation. I1101 10:54:04.801584 23007 net.cpp:222] conv2_conv2_relu_0_split does not need backward computation. I1101 10:54:04.801590 23007 net.cpp:222] conv2_relu does not need backward computation. I1101 10:54:04.801595 23007 net.cpp:222] conv2 does not need backward computation. I1101 10:54:04.801601 23007 net.cpp:222] pool1 does not need backward computation. I1101 10:54:04.801609 23007 net.cpp:222] conv1_conv1_relu_0_split does not need backward computation. I1101 10:54:04.801614 23007 net.cpp:222] conv1_relu does not need backward computation. I1101 10:54:04.801620 23007 net.cpp:222] conv1 does not need backward computation. I1101 10:54:04.801625 23007 net.cpp:222] data_scale does not need backward computation. I1101 10:54:04.801631 23007 net.cpp:222] data_perm does not need backward computation. I1101 10:54:04.801636 23007 net.cpp:222] input does not need backward computation. 
I1101 10:54:04.801641 23007 net.cpp:264] This network produces output cls_pred I1101 10:54:04.801692 23007 net.cpp:264] This network produces output dim_pred I1101 10:54:04.801700 23007 net.cpp:264] This network produces output loc_pred I1101 10:54:04.801707 23007 net.cpp:264] This network produces output lof_pred I1101 10:54:04.801712 23007 net.cpp:264] This network produces output lor_pred I1101 10:54:04.801718 23007 net.cpp:264] This network produces output obj_pred I1101 10:54:04.801723 23007 net.cpp:264] This network produces output ori_pred I1101 10:54:04.801741 23007 net.cpp:264] This network produces output seg_prob I1101 10:54:04.801811 23007 net.cpp:277] Network initialization done. I1101 10:54:04.936751 23007 common.cpp:177] Device id: 0 I1101 10:54:04.936771 23007 common.cpp:178] Major revision number: 6 I1101 10:54:04.936776 23007 common.cpp:179] Minor revision number: 1 I1101 10:54:04.936780 23007 common.cpp:180] Name: GeForce GTX 1080 I1101 10:54:04.936800 23007 common.cpp:181] Total global memory: 8499691520 I1101 10:54:04.936803 23007 common.cpp:182] Total shared memory per block: 49152 I1101 10:54:04.936807 23007 common.cpp:183] Total registers per block: 65536 I1101 10:54:04.936811 23007 common.cpp:184] Warp size: 32 I1101 10:54:04.936815 23007 common.cpp:185] Maximum memory pitch: 2147483647 I1101 10:54:04.936820 23007 common.cpp:186] Maximum threads per block: 1024 I1101 10:54:04.936839 23007 common.cpp:187] Maximum dimension of block: 1024, 1024, 64 I1101 10:54:04.936846 23007 common.cpp:190] Maximum dimension of grid: 2147483647, 65535, 65535 I1101 10:54:04.936863 23007 common.cpp:193] Clock rate: 1809500 I1101 10:54:04.936868 23007 common.cpp:194] Total constant memory: 65536 I1101 10:54:04.936887 23007 common.cpp:195] Texture alignment: 512 I1101 10:54:04.936893 23007 common.cpp:196] Concurrent copy and execution: Yes I1101 10:54:04.936897 23007 common.cpp:198] Number of multiprocessors: 20 I1101 10:54:04.936902 23007 common.cpp:199] Kernel 
execution timeout: Yes I1101 10:54:04.942518 23007 net.cpp:52] Initializing net from parameters: name: "darknet-16c-16x-3d multitask TEST 960x384, offset L3:440, L4: 312, RM DET" state { phase: TEST } layer { name: "input" type: "Input" top: "data" input_param { shape { dim: 1 dim: 480 dim: 640 dim: 3 } } } layer { name: "data_perm" type: "Permute" bottom: "data" top: "data_perm" permute_param { order: 0 order: 3 order: 1 order: 2 } } layer { name: "scale_data_lane" type: "Power" bottom: "data_perm" top: "scale_data_lane" propagate_down: false power_param { power: 1 scale: 0.00392157 shift: 0 } } layer { name: "conv1" type: "Convolution" bottom: "scale_data_lane" top: "conv1" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 16 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv1_bn" type: "BatchNorm" bottom: "conv1" top: "conv1" batch_norm_param { eps: 1e-06 } } layer { name: "conv1_scale" type: "Scale" bottom: "conv1" top: "conv1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv1_relu" type: "ReLU" bottom: "conv1" top: "conv1" relu_param { negative_slope: 0 } } layer { name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "conv2" type: "Convolution" bottom: "pool1" top: "conv2" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 32 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv2_bn" type: "BatchNorm" bottom: "conv2" top: "conv2" batch_norm_param { eps: 1e-06 } } layer { name: "conv2_scale" type: "Scale" bottom: "conv2" top: "conv2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: 
"conv2_relu" type: "ReLU" bottom: "conv2" top: "conv2" relu_param { negative_slope: 0 } } layer { name: "pool2" type: "Pooling" bottom: "conv2" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "conv3_1" type: "Convolution" bottom: "pool2" top: "conv3_1" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 64 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv3_1_bn" type: "BatchNorm" bottom: "conv3_1" top: "conv3_1" batch_norm_param { eps: 1e-06 } } layer { name: "conv3_1_scale" type: "Scale" bottom: "conv3_1" top: "conv3_1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv3_1_relu" type: "ReLU" bottom: "conv3_1" top: "conv3_1" relu_param { negative_slope: 0 } } layer { name: "conv3_2" type: "Convolution" bottom: "conv3_1" top: "conv3_2" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 32 bias_term: false pad: 0 kernel_size: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv3_2_bn" type: "BatchNorm" bottom: "conv3_2" top: "conv3_2" batch_norm_param { eps: 1e-06 } } layer { name: "conv3_2_scale" type: "Scale" bottom: "conv3_2" top: "conv3_2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv3_2_relu" type: "ReLU" bottom: "conv3_2" top: "conv3_2" relu_param { negative_slope: 0 } } layer { name: "conv3_3" type: "Convolution" bottom: "conv3_2" top: "conv3_3" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 64 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv3_3_bn" type: "BatchNorm" bottom: "conv3_3" top: "conv3_3" batch_norm_param { eps: 1e-06 } } layer { name: 
"conv3_3_scale" type: "Scale" bottom: "conv3_3" top: "conv3_3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv3_3_relu" type: "ReLU" bottom: "conv3_3" top: "conv3_3" relu_param { negative_slope: 0 } } layer { name: "pool3" type: "Pooling" bottom: "conv3_3" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "conv4_1" type: "Convolution" bottom: "pool3" top: "conv4_1" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 128 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv4_1_bn" type: "BatchNorm" bottom: "conv4_1" top: "conv4_1" batch_norm_param { eps: 1e-06 } } layer { name: "conv4_1_scale" type: "Scale" bottom: "conv4_1" top: "conv4_1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv4_1_relu" type: "ReLU" bottom: "conv4_1" top: "conv4_1" relu_param { negative_slope: 0 } } layer { name: "conv4_2" type: "Convolution" bottom: "conv4_1" top: "conv4_2" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 64 bias_term: false pad: 0 kernel_size: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv4_2_bn" type: "BatchNorm" bottom: "conv4_2" top: "conv4_2" batch_norm_param { eps: 1e-06 } } layer { name: "conv4_2_scale" type: "Scale" bottom: "conv4_2" top: "conv4_2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv4_2_relu" type: "ReLU" bottom: "conv4_2" top: "conv4_2" relu_param { negative_slope: 0 } } layer { name: "conv4_3" type: "Convolution" bottom: "conv4_2" top: "conv4_3" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 128 bias_term: 
false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv4_3_bn" type: "BatchNorm" bottom: "conv4_3" top: "conv4_3" batch_norm_param { eps: 1e-06 } } layer { name: "conv4_3_scale" type: "Scale" bottom: "conv4_3" top: "conv4_3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv4_3_relu" type: "ReLU" bottom: "conv4_3" top: "conv4_3" relu_param { negative_slope: 0 } } layer { name: "pool4" type: "Pooling" bottom: "conv4_3" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "conv5_1" type: "Convolution" bottom: "pool4" top: "conv5_1" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 256 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv5_1_bn" type: "BatchNorm" bottom: "conv5_1" top: "conv5_1" batch_norm_param { eps: 1e-06 } } layer { name: "conv5_1_scale" type: "Scale" bottom: "conv5_1" top: "conv5_1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv5_1_relu" type: "ReLU" bottom: "conv5_1" top: "conv5_1" relu_param { negative_slope: 0 } } layer { name: "conv5_2" type: "Convolution" bottom: "conv5_1" top: "conv5_2" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 128 bias_term: false pad: 0 kernel_size: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv5_2_bn" type: "BatchNorm" bottom: "conv5_2" top: "conv5_2" batch_norm_param { eps: 1e-06 } } layer { name: "conv5_2_scale" type: "Scale" bottom: "conv5_2" top: "conv5_2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv5_2_relu" type: "ReLU" 
bottom: "conv5_2" top: "conv5_2" relu_param { negative_slope: 0 } } layer { name: "conv5_3" type: "Convolution" bottom: "conv5_2" top: "conv5_3" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 256 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv5_3_bn" type: "BatchNorm" bottom: "conv5_3" top: "conv5_3" batch_norm_param { eps: 1e-06 } } layer { name: "conv5_3_scale" type: "Scale" bottom: "conv5_3" top: "conv5_3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv5_3_relu" type: "ReLU" bottom: "conv5_3" top: "conv5_3" relu_param { negative_slope: 0 } } layer { name: "conv5_4" type: "Convolution" bottom: "conv5_3" top: "conv5_4" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 128 bias_term: false pad: 0 kernel_size: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv5_4_bn" type: "BatchNorm" bottom: "conv5_4" top: "conv5_4" batch_norm_param { eps: 1e-06 } } layer { name: "conv5_4_scale" type: "Scale" bottom: "conv5_4" top: "conv5_4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv5_4_relu" type: "ReLU" bottom: "conv5_4" top: "conv5_4" relu_param { negative_slope: 0 } } layer { name: "conv5_5" type: "Convolution" bottom: "conv5_4" top: "conv5_5" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 256 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv5_5_bn" type: "BatchNorm" bottom: "conv5_5" top: "conv5_5" batch_norm_param { eps: 1e-06 } } layer { name: "conv5_5_scale" type: "Scale" bottom: "conv5_5" top: "conv5_5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { 
filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv5_5_relu" type: "ReLU" bottom: "conv5_5" top: "conv5_5" relu_param { negative_slope: 0 } } layer { name: "pool5" type: "Pooling" bottom: "conv5_5" top: "pool5" pooling_param { pool: MAX kernel_size: 3 stride: 1 pad: 1 } } layer { name: "conv6_1" type: "Convolution" bottom: "pool5" top: "conv6_1" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 512 bias_term: false pad: 2 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 2 } } layer { name: "conv6_1_bn" type: "BatchNorm" bottom: "conv6_1" top: "conv6_1" batch_norm_param { eps: 1e-06 } } layer { name: "conv6_1_scale" type: "Scale" bottom: "conv6_1" top: "conv6_1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv6_1_relu" type: "ReLU" bottom: "conv6_1" top: "conv6_1" relu_param { negative_slope: 0 } } layer { name: "conv6_2" type: "Convolution" bottom: "conv6_1" top: "conv6_2" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 256 bias_term: false pad: 0 kernel_size: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv6_2_bn" type: "BatchNorm" bottom: "conv6_2" top: "conv6_2" batch_norm_param { eps: 1e-06 } } layer { name: "conv6_2_scale" type: "Scale" bottom: "conv6_2" top: "conv6_2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv6_2_relu" type: "ReLU" bottom: "conv6_2" top: "conv6_2" relu_param { negative_slope: 0 } } layer { name: "conv6_3" type: "Convolution" bottom: "conv6_2" top: "conv6_3" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 512 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv6_3_bn" type: "BatchNorm" 
bottom: "conv6_3" top: "conv6_3" batch_norm_param { eps: 1e-06 } } layer { name: "conv6_3_scale" type: "Scale" bottom: "conv6_3" top: "conv6_3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv6_3_relu" type: "ReLU" bottom: "conv6_3" top: "conv6_3" relu_param { negative_slope: 0 } } layer { name: "conv6_4" type: "Convolution" bottom: "conv6_3" top: "conv6_4" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 256 bias_term: false pad: 0 kernel_size: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv6_4_bn" type: "BatchNorm" bottom: "conv6_4" top: "conv6_4" batch_norm_param { eps: 1e-06 } } layer { name: "conv6_4_scale" type: "Scale" bottom: "conv6_4" top: "conv6_4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv6_4_relu" type: "ReLU" bottom: "conv6_4" top: "conv6_4" relu_param { negative_slope: 0 } } layer { name: "conv6_5" type: "Convolution" bottom: "conv6_4" top: "conv6_5" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 512 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv6_5_bn" type: "BatchNorm" bottom: "conv6_5" top: "conv6_5" batch_norm_param { eps: 1e-06 } } layer { name: "conv6_5_scale" type: "Scale" bottom: "conv6_5" top: "conv6_5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv6_5_relu" type: "ReLU" bottom: "conv6_5" top: "conv6_5" relu_param { negative_slope: 0 } } layer { name: "conv7_1" type: "Convolution" bottom: "conv6_5" top: "conv7_1" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 512 bias_term: false pad: 1 kernel_size: 3 stride: 1 
weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv7_1_bn" type: "BatchNorm" bottom: "conv7_1" top: "conv7_1" batch_norm_param { eps: 1e-06 } } layer { name: "conv7_1_scale" type: "Scale" bottom: "conv7_1" top: "conv7_1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv7_1_relu" type: "ReLU" bottom: "conv7_1" top: "conv7_1" relu_param { negative_slope: 0 } } layer { name: "conv7_2" type: "Convolution" bottom: "conv7_1" top: "conv7_2" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 512 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } dilation: 1 } } layer { name: "conv7_2_bn" type: "BatchNorm" bottom: "conv7_2" top: "conv7_2" batch_norm_param { eps: 1e-06 } } layer { name: "conv7_2_scale" type: "Scale" bottom: "conv7_2" top: "conv7_2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 1 decay_mult: 1 } scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "conv7_2_relu" type: "ReLU" bottom: "conv7_2" top: "conv7_2" relu_param { negative_slope: 0 } } layer { name: "concat8" type: "Concat" bottom: "conv5_5" bottom: "conv7_2" top: "concat8" concat_param { axis: 1 } } layer { name: "reduce1_lane" type: "Convolution" bottom: "concat8" top: "reduce1_lane" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 128 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "reduce1_lane_bn" type: "BatchNorm" bottom: "reduce1_lane" top: "reduce1_lane" batch_norm_param { eps: 1e-06 } } layer { name: "reduce1_lane_scale" type: "Scale" bottom: "reduce1_lane" top: "reduce1_lane" scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "reduce1_lane_relu" type: "ReLU" bottom: "reduce1_lane" top: "reduce1_lane" relu_param { negative_slope: 0 } } layer { 
name: "deconv1_lane" type: "Deconvolution" bottom: "reduce1_lane" top: "deconv1_lane" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 64 pad: 0 kernel_size: 2 stride: 2 weight_filler { type: "xavier" } bias_filler { type: "constant" value: 0 } } } layer { name: "deconv1_lane_bn" type: "BatchNorm" bottom: "deconv1_lane" top: "deconv1_lane" batch_norm_param { eps: 1e-06 } } layer { name: "deconv1_lane_scale" type: "Scale" bottom: "deconv1_lane" top: "deconv1_lane" scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "deconv1_lane_relu" type: "ReLU" bottom: "deconv1_lane" top: "deconv1_lane" relu_param { negative_slope: 0 } } layer { name: "reorg4" type: "Convolution" bottom: "conv4_3" top: "reorg4" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 64 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "reorg4_relu" type: "ReLU" bottom: "reorg4" top: "reorg4" relu_param { negative_slope: 0 } } layer { name: "concat4" type: "Concat" bottom: "reorg4" bottom: "deconv1_lane" top: "concat4" concat_param { axis: 1 } } layer { name: "reduce2_lane" type: "Convolution" bottom: "concat4" top: "reduce2_lane" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 64 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "reduce2_lane_bn" type: "BatchNorm" bottom: "reduce2_lane" top: "reduce2_lane" batch_norm_param { eps: 1e-06 } } layer { name: "reduce2_lane_scale" type: "Scale" bottom: "reduce2_lane" top: "reduce2_lane" scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "reduce2_lane_relu" type: "ReLU" bottom: "reduce2_lane" top: "reduce2_lane" relu_param { negative_slope: 0 } } layer { name: "deconv2_lane" type: "Deconvolution" bottom: "reduce2_lane" top: "deconv2_lane" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 32 pad: 0 
kernel_size: 2 stride: 2 weight_filler { type: "xavier" } bias_filler { type: "constant" value: 0 } } } layer { name: "deconv2_lane_bn" type: "BatchNorm" bottom: "deconv2_lane" top: "deconv2_lane" batch_norm_param { eps: 1e-06 } } layer { name: "deconv2_lane_scale" type: "Scale" bottom: "deconv2_lane" top: "deconv2_lane" scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "deconv2_lane_relu" type: "ReLU" bottom: "deconv2_lane" top: "deconv2_lane" relu_param { negative_slope: 0 } } layer { name: "reorg3" type: "Convolution" bottom: "conv3_3" top: "reorg3" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 32 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "reorg3_relu" type: "ReLU" bottom: "reorg3" top: "reorg3" relu_param { negative_slope: 0 } } layer { name: "concat3" type: "Concat" bottom: "reorg3" bottom: "deconv2_lane" top: "concat3" concat_param { axis: 1 } } layer { name: "reduce3_lane" type: "Convolution" bottom: "concat3" top: "reduce3_lane" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 32 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "reduce3_lane_bn" type: "BatchNorm" bottom: "reduce3_lane" top: "reduce3_lane" batch_norm_param { eps: 1e-06 } } layer { name: "reduce3_lane_scale" type: "Scale" bottom: "reduce3_lane" top: "reduce3_lane" scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "reduce3_lane_relu" type: "ReLU" bottom: "reduce3_lane" top: "reduce3_lane" relu_param { negative_slope: 0 } } layer { name: "deconv3_lane" type: "Deconvolution" bottom: "reduce3_lane" top: "deconv3_lane" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 16 pad: 0 kernel_size: 2 stride: 2 weight_filler { type: "xavier" } bias_filler { type: "constant" value: 0 } } } layer { name: "deconv3_lane_bn" type: "BatchNorm" bottom: 
"deconv3_lane" top: "deconv3_lane" batch_norm_param { eps: 1e-06 } } layer { name: "deconv3_lane_scale" type: "Scale" bottom: "deconv3_lane" top: "deconv3_lane" scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "deconv3_lane_relu" type: "ReLU" bottom: "deconv3_lane" top: "deconv3_lane" relu_param { negative_slope: 0 } } layer { name: "reorg2" type: "Convolution" bottom: "conv2" top: "reorg2" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 16 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "reorg2_relu" type: "ReLU" bottom: "reorg2" top: "reorg2" relu_param { negative_slope: 0 } } layer { name: "concat2" type: "Concat" bottom: "reorg2" bottom: "deconv3_lane" top: "concat2" concat_param { axis: 1 } } layer { name: "reduce4_lane" type: "Convolution" bottom: "concat2" top: "reduce4_lane" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 16 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "reduce4_lane_bn" type: "BatchNorm" bottom: "reduce4_lane" top: "reduce4_lane" batch_norm_param { eps: 1e-06 } } layer { name: "reduce4_lane_scale" type: "Scale" bottom: "reduce4_lane" top: "reduce4_lane" scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "reduce4_lane_relu" type: "ReLU" bottom: "reduce4_lane" top: "reduce4_lane" relu_param { negative_slope: 0 } } layer { name: "deconv4_lane" type: "Deconvolution" bottom: "reduce4_lane" top: "deconv4_lane" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 8 pad: 0 kernel_size: 2 stride: 2 weight_filler { type: "xavier" } bias_filler { type: "constant" value: 0 } } } layer { name: "deconv4_lane_bn" type: "BatchNorm" bottom: "deconv4_lane" top: "deconv4_lane" batch_norm_param { eps: 1e-06 } } layer { name: "deconv4_lane_scale" type: "Scale" bottom: "deconv4_lane" top: "deconv4_lane" 
scale_param { filler { type: "constant" value: 1 } bias_term: true } } layer { name: "deconv4_lane_relu" type: "ReLU" bottom: "deconv4_lane" top: "deconv4_lane" relu_param { negative_slope: 0 } } layer { name: "reorg1" type: "Convolution" bottom: "conv1" top: "reorg1" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 8 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "reorg1_relu" type: "ReLU" bottom: "reorg1" top: "reorg1" relu_param { negative_slope: 0 } } layer { name: "concat1" type: "Concat" bottom: "reorg1" bottom: "deconv4_lane" top: "concat1" concat_param { axis: 1 } } layer { name: "conv_out" type: "Convolution" bottom: "concat1" top: "conv_out" param { lr_mult: 1 decay_mult: 1 } convolution_param { num_output: 2 bias_term: false pad: 1 kernel_size: 3 stride: 1 weight_filler { type: "gaussian" std: 0.01 } } } layer { name: "softmax" type: "Softmax" bottom: "conv_out" top: "softmax" } I1101 10:54:04.947386 23007 net.cpp:94] Creating Layer input I1101 10:54:04.947398 23007 net.cpp:402] input -> data I1101 10:54:04.947448 23007 net.cpp:144] Setting up input I1101 10:54:04.947463 23007 net.cpp:151] Top shape: 1 480 640 3 (921600) I1101 10:54:04.947468 23007 net.cpp:159] Memory required for data: 3686400 I1101 10:54:04.947482 23007 net.cpp:94] Creating Layer data_perm I1101 10:54:04.947489 23007 net.cpp:428] data_perm <- data I1101 10:54:04.947499 23007 net.cpp:402] data_perm -> data_perm I1101 10:54:04.947644 23007 net.cpp:144] Setting up data_perm I1101 10:54:04.947656 23007 net.cpp:151] Top shape: 1 3 480 640 (921600) I1101 10:54:04.947661 23007 net.cpp:159] Memory required for data: 7372800 I1101 10:54:04.947685 23007 net.cpp:94] Creating Layer scale_data_lane I1101 10:54:04.947690 23007 net.cpp:428] scale_data_lane <- data_perm I1101 10:54:04.947713 23007 net.cpp:402] scale_data_lane -> scale_data_lane I1101 10:54:04.947741 23007 net.cpp:144] Setting up scale_data_lane I1101 
10:54:04.947765 23007 net.cpp:151] Top shape: 1 3 480 640 (921600) I1101 10:54:04.947783 23007 net.cpp:159] Memory required for data: 11059200 I1101 10:54:04.947798 23007 net.cpp:94] Creating Layer conv1 I1101 10:54:04.947803 23007 net.cpp:428] conv1 <- scale_data_lane I1101 10:54:04.947814 23007 net.cpp:402] conv1 -> conv1 Segmentation fault (core dumped)

YafeiWangAlice commented 6 years ago

I have already fixed this issue, so I will close it.

xinwf commented 5 years ago

@YafeiWangAlice, hi, how did you fix this problem? Where can I find the config file "nvblas.conf"? I searched for this file across the whole system but could not find it.

YafeiWangAlice commented 5 years ago

@xinwf

  1. How did I fix the Segmentation fault (core dumped) problem? -- The crash was most likely caused by a GPU/CUDA mismatch; make sure you are running on a supported GPU with the matching CUDA version.
  2. Where to find the config file "nvblas.conf"? -- You can set its path with the NVBLAS_CONFIG_FILE environment variable. Note that NVBLAS_CONFIG_FILE should point to the nvblas.conf file itself (e.g. export NVBLAS_CONFIG_FILE=/usr/local/cuda/nvblas.conf), not just the directory. If the file does not exist under /usr/local/cuda, you may need to create one first.
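A minimal sketch of creating an nvblas.conf and pointing NVBLAS at it. The file path and the location of the CPU BLAS library are assumptions; adjust them for your system. The NVBLAS_CPU_BLAS_LIB entry is the one whose absence produces the "[NVBLAS] CPU Blas library need to be provided" message.

```shell
# Hypothetical example: write a minimal nvblas.conf to /tmp (use a
# permanent location in practice) and export NVBLAS_CONFIG_FILE.
cat > /tmp/nvblas.conf <<'EOF'
# CPU BLAS fallback library; without this line NVBLAS prints
# "CPU Blas library need to be provided". Path is an assumption.
NVBLAS_CPU_BLAS_LIB /usr/lib/libopenblas.so
# Route eligible BLAS calls to all visible GPUs
NVBLAS_GPU_LIST ALL
NVBLAS_LOGFILE /tmp/nvblas.log
EOF

# Point NVBLAS at the file itself, not its directory
export NVBLAS_CONFIG_FILE=/tmp/nvblas.conf
grep NVBLAS_CPU_BLAS_LIB "$NVBLAS_CONFIG_FILE"
```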
xinwf commented 5 years ago

@YafeiWangAlice, I have fixed that problem, but another one, "CPU Blas library need to be provided", still exists. How can I solve it? (By the way, is wangyafei420 your QQ email?)
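As a hedged pointer for the "CPU Blas library need to be provided" error above: NVBLAS needs the NVBLAS_CPU_BLAS_LIB entry in nvblas.conf to name an installed CPU BLAS shared library. One way to find a candidate path (library and package names here are assumptions, not guaranteed to be present on every system):

```shell
# List shared libraries known to the dynamic linker and filter for
# common CPU BLAS implementations; fall back to a hint if none found.
ldconfig -p | grep -i -E 'blas|atlas|mkl' \
  || echo "no CPU BLAS found; install one, e.g. apt-get install libopenblas-dev"
```

Any path printed (e.g. a libopenblas.so entry) can then be used as the value of NVBLAS_CPU_BLAS_LIB.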