Hi, I just noticed that in your convolutional layer, at https://github.com/yonsei-icsl/nebula/blob/7a46adb8cfd6aa819a140ab2f912210aa01e2f1b/models/layers/convolutional_layer.cu#L32, you call cudnnConvolutionForward with alpha = beta = 1. Other code I have seen always sets alpha = 1 and beta = 0. Why do you set beta = 1 here? Thanks a lot!
According to the cuDNN documentation, cudnnConvolutionForward blends its result with the prior contents of the output tensor as y = alpha * conv(x, w) + beta * y, so beta scales whatever is already in the output buffer. We set beta = 1 so those prior contents are kept rather than discarded.
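To illustrate, here is a minimal standalone sketch (not the Nebula code itself) of a cuDNN forward convolution call. The tensor shapes, the implicit-GEMM algorithm choice, and the zeroed buffers are illustrative assumptions; the point is only the effect of beta on the output tensor.

```cpp
// Sketch: y = alpha * conv(x, w) + beta * y.
// beta = 0 overwrites y; beta = 1 accumulates the result on top of
// whatever y already holds.
#include <cudnn.h>
#include <cuda_runtime.h>

int main() {
    cudnnHandle_t handle;
    cudnnCreate(&handle);

    // Assumed shapes: 1x1x4x4 input, 1x1x3x3 filter, pad 1, stride 1 -> 1x1x4x4 output.
    cudnnTensorDescriptor_t x_desc, y_desc;
    cudnnFilterDescriptor_t w_desc;
    cudnnConvolutionDescriptor_t conv_desc;
    cudnnCreateTensorDescriptor(&x_desc);
    cudnnCreateTensorDescriptor(&y_desc);
    cudnnCreateFilterDescriptor(&w_desc);
    cudnnCreateConvolutionDescriptor(&conv_desc);
    cudnnSetTensor4dDescriptor(x_desc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT, 1, 1, 4, 4);
    cudnnSetTensor4dDescriptor(y_desc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT, 1, 1, 4, 4);
    cudnnSetFilter4dDescriptor(w_desc, CUDNN_DATA_FLOAT, CUDNN_TENSOR_NCHW, 1, 1, 3, 3);
    cudnnSetConvolution2dDescriptor(conv_desc, 1, 1, 1, 1, 1, 1,
                                    CUDNN_CROSS_CORRELATION, CUDNN_DATA_FLOAT);

    float *x, *w, *y;
    cudaMalloc(&x, 16 * sizeof(float));
    cudaMalloc(&w, 9 * sizeof(float));
    cudaMalloc(&y, 16 * sizeof(float));
    cudaMemset(x, 0, 16 * sizeof(float));
    cudaMemset(w, 0, 9 * sizeof(float));
    cudaMemset(y, 0, 16 * sizeof(float));   // prior contents of y (e.g. a bias) would be blended in

    // Query workspace size for the assumed algorithm.
    cudnnConvolutionFwdAlgo_t algo = CUDNN_CONVOLUTION_FWD_ALGO_IMPLICIT_GEMM;
    size_t ws_size = 0;
    cudnnGetConvolutionForwardWorkspaceSize(handle, x_desc, w_desc, conv_desc,
                                            y_desc, algo, &ws_size);
    void *workspace = nullptr;
    if (ws_size > 0) cudaMalloc(&workspace, ws_size);

    // beta = 1.0f keeps (scales by 1) the prior value of y instead of zeroing it out.
    const float alpha = 1.0f, beta = 1.0f;
    cudnnConvolutionForward(handle, &alpha,
                            x_desc, x, w_desc, w, conv_desc,
                            algo, workspace, ws_size,
                            &beta, y_desc, y);

    if (workspace) cudaFree(workspace);
    cudaFree(x); cudaFree(w); cudaFree(y);
    cudnnDestroyConvolutionDescriptor(conv_desc);
    cudnnDestroyFilterDescriptor(w_desc);
    cudnnDestroyTensorDescriptor(y_desc);
    cudnnDestroyTensorDescriptor(x_desc);
    cudnnDestroy(handle);
    return 0;
}
```

With beta = 0 the same call would simply overwrite y with the convolution result, which is why that setting is common in other code.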