Closed yangyi02 closed 7 years ago
This question may be similar to the PR; see its comments. FeatureMapExpandLayer needs to check whether getInputGrad(0) is empty: if the constant layer is the first input and does not need backpropagation, FeatureMapExpandLayer should also skip the backward computation for that input, following the same code logic.
Yes, I think they have the same root cause. Can you add this check to FeatureMapExpandLayer?
Thanks.
I am closing this issue due to inactivity. Please feel free to reopen it if more information becomes available.
Hi all,
I defined a custom constant layer in PaddlePaddle and use a FeatureMapExpandLayer on top of it for a learning algorithm. The constant layer I defined does not need backpropagation.
When I run training, PaddlePaddle crashes with a segmentation fault.
The reason I found is that getInputGrad(0) in FeatureMapExpandLayer returns an empty pointer (because the constant layer does not need backpropagation), so `int imgSize = inGrad->getWidth();` dereferences a null pointer and segfaults.
Below is the backward code in FeatureMapExpandLayer.cpp:
```cpp
void FeatureMapExpandLayer::backward(const UpdateCallback& callback) {
  LOG(INFO) << "I am here";
  MatrixPtr inGrad = getInputGrad(0);
  MatrixPtr outGrad = getOutputGrad();
  size_t batchSize = getInput(0).getBatchSize();
  int imgSize = inGrad->getWidth();
  {
    AsyncGpuBlock asyncGpuBlock;
    for (size_t i = 0; i < batchSize; i++) {
      MatrixPtr outGradTmp =
          Matrix::create(outGrad->getData() + i * imgSize * numFilters_,
                         numFilters_,
                         imgSize,
                         false,
                         useGpu_);
      MatrixPtr inGradTmp = Matrix::create(
          inGrad->getData() + i * imgSize, 1, imgSize, false, useGpu_);
      inGradTmp->collectBias(*outGradTmp, 1);
    }
  }
  /* Do derivation */ {
    REGISTER_TIMER_INFO("BpAvtTimer", getName().c_str());
    backwardActivation();
  }
}
```