TimoSaemann / caffe-segnet-cudnn5

This repository is a fork of BVLC/caffe and includes the upsample, bn, dense_image_data and softmax_with_loss (with class weighting) layers of caffe-segnet (https://github.com/alexgkendall/caffe-segnet), allowing SegNet to run with cuDNN version 5.

Caffe not using GPU even though Caffe::set_mode(Caffe::GPU) is executed #8

Closed: ghost closed this issue 7 years ago

ghost commented 7 years ago

When I run my C++ program with caffe-segnet-cudnn5, the inference speed is quite slow even though Caffe::GPU is set: around 2 seconds per image. This is the same speed as when only the CPU is used (Caffe::CPU is set). The problem only happens with caffe-segnet-cudnn5; it does not happen with the original caffe-segnet built against cuDNN version 2.
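For context, the relevant part of my program looks roughly like this (a minimal sketch with placeholder file names, not my actual code):

```cpp
#include <caffe/caffe.hpp>

int main() {
  // Select GPU mode and the first CUDA device before loading the net.
  caffe::Caffe::set_mode(caffe::Caffe::GPU);
  caffe::Caffe::SetDevice(0);

  // Load the SegNet architecture and trained weights (placeholder file names).
  caffe::Net<float> net("segnet_model.prototxt", caffe::TEST);
  net.CopyTrainedLayersFrom("segnet_weights.caffemodel");

  // Run one forward pass; in the real program the input blob is filled
  // with image data beforehand.
  net.Forward();
  return 0;
}
```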

caffe-segnet-cudnn5 was compiled with CUDA 8 and cuDNN 5.1.

Your system configuration

Operating system: Ubuntu 16.04
GPU: NVIDIA GTX 1060
CUDA version (if applicable): 8.0
cuDNN version (if applicable): 5.1
BLAS: OpenBLAS

TimoSaemann commented 7 years ago

GPU mode works for me. Did you try the test_segmentation binary? There is a time measurement included. Please clone the updated caffe-segnet-cudnn5.

cd {CAFFE-SEGNET-CUDNN5_ROOT}/build/examples/SegNet_with_C++

Run ./test_segmentation with the following arguments:

./test_segmentation segnet_model_driving_webdemo.prototxt segnet_iter_30000_timo.caffemodel <path_to_example_image> camvid12.png

After that you should see the processing time and the output image. Note that the first image takes nearly twice as long as the subsequent ones because of initialization overhead, but you should still be able to tell whether GPU mode is active.
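If you want to add a similar measurement to your own program, a minimal sketch (assuming a loaded caffe::Net<float> named net whose input blob is already filled) could look like this:

```cpp
#include <caffe/caffe.hpp>
#include <chrono>
#include <iostream>

// Minimal timing sketch: `net` is assumed to be a loaded caffe::Net<float>
// whose input blob has already been filled with image data.
void TimeForward(caffe::Net<float>& net, int iterations) {
  // Warm-up pass: CUDA/cuDNN initialization makes the first run slower.
  net.Forward();

  auto start = std::chrono::steady_clock::now();
  for (int i = 0; i < iterations; ++i) {
    net.Forward();
    // Reading the output on the CPU forces the GPU work to finish, so the
    // measured time covers the complete forward pass.
    net.output_blobs()[0]->cpu_data();
  }
  auto end = std::chrono::steady_clock::now();

  double ms = std::chrono::duration<double, std::milli>(end - start).count();
  std::cout << "Average forward time: " << ms / iterations << " ms" << std::endl;
}
```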

ghost commented 7 years ago

Thanks very much for the help. I tried your example program and it indeed runs correctly on the GPU. I will check in detail what the problem is with my own program.