PaddlePaddle / Paddle

PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (PaddlePaddle core framework: high-performance single-machine and distributed training, and cross-platform deployment, for deep learning and machine learning)
http://www.paddlepaddle.org/
Apache License 2.0

Backward error for FeatureMapExpandLayer in PaddlePaddle #1220

Closed yangyi02 closed 7 years ago

yangyi02 commented 7 years ago

Hi all,

I defined a constant layer in PaddlePaddle myself and use a FeatureMapExpandLayer on top of it for a learning algorithm. The constant layer I defined does not need backpropagation.

When I run training, PaddlePaddle gives a segmentation fault.

The reason I found is that getInputGrad(0) in FeatureMapExpandLayer returns an empty pointer (because the constant layer does not need backpropagation), so the line `int imgSize = inGrad->getWidth();` dereferences a null pointer and segfaults.

Below is the backward code in FeatureMapExpandLayer.cpp:

```cpp
void FeatureMapExpandLayer::backward(const UpdateCallback& callback) {
  LOG(INFO) << "I am here";
  MatrixPtr inGrad = getInputGrad(0);
  MatrixPtr outGrad = getOutputGrad();
  size_t batchSize = getInput(0).getBatchSize();
  int imgSize = inGrad->getWidth();
  {
    AsyncGpuBlock asyncGpuBlock;
    for (size_t i = 0; i < batchSize; i++) {
      MatrixPtr outGradTmp =
          Matrix::create(outGrad->getData() + i * imgSize * numFilters_,
                         numFilters_,
                         imgSize,
                         false,
                         useGpu_);
      MatrixPtr inGradTmp = Matrix::create(
          inGrad->getData() + i * imgSize, 1, imgSize, false, useGpu_);
      inGradTmp->collectBias(*outGradTmp, 1);
    }
  }
  /* Do derivation */ {
    REGISTER_TIMER_INFO("BpAvtTimer", getName().c_str());
    backwardActivation();
  }
}
```

qingqing01 commented 7 years ago

This question may be similar to the PR; see its comments. FeatureMapExpandLayer needs to check whether getInputGrad(0) is empty. And if the constant layer is the first input layer and does not need backpropagation, FeatureMapExpandLayer should also skip the backward computation, according to the code logic.

yangyi02 commented 7 years ago

Yes, I think they have the same cause. Could you add this check to FeatureMapExpandLayer?

Thanks.

lcy-seso commented 7 years ago

Closing this issue due to inactivity. Please feel free to reopen it if more information becomes available.