yhalk closed this issue 9 years ago
@yhalk Unfortunately, we do not currently support mixed-backend computing, because data transfer between the CPU and GPU is quite costly. The GPU backend also currently requires the cuDNN library even if no cuDNN convolution is used in the network. You could instead try the CPU backend with the native extension, which might be somewhat faster than the default Julia backend, depending on your network.
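For reference, switching to the CPU backend with the native extension looks roughly like the sketch below. The `MOCHA_USE_NATIVE_EXT` flag and the thread-count setting are based on my reading of the Mocha.jl documentation, so treat this as an illustrative sketch rather than verified code for your setup:

```julia
# Enable the native (C++/OpenMP) extension before loading Mocha.
ENV["MOCHA_USE_NATIVE_EXT"] = "true"   # flag name per the Mocha docs (assumption)
ENV["OMP_NUM_THREADS"] = "4"           # optional: number of OpenMP threads to use

using Mocha

backend = CPUBackend()
init(backend)

# ... build the Net and run the solver on `backend` as usual ...

shutdown(backend)
```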
Hello,
I want to use Mocha with a GPU backend, but the CUDA version installed on the system is 5.0, which does not support cuDNN. I have seen issue https://github.com/pluskid/Mocha.jl/issues/52, and the only problem with my network is that it uses sigmoid neurons and a softmax layer, which need cuDNN. Is there a workaround to make it work, even if computation with the sigmoids and softmax becomes less efficient, e.g. by using a CPU backend for those layers?
Thanks in advance for the help!