Closed: nitish11 closed this issue 8 years ago.
LuaJIT is faster than Python and Torch is generally faster than Caffe from what I've seen in benchmarks, which are a bit outdated: https://github.com/soumith/convnet-benchmarks. I hear Nvidia gives the best support to Torch, which they use for much of their own work (e.g., autonomous car demonstrations), as others like Google and Nervana/Intel compete on hardware. cuDNN 5 speeds up Torch quite a bit: https://devblogs.nvidia.com/parallelforall/optimizing-recurrent-neural-networks-cudnn-5/.
@adam-erickson : :+1: thanks.
But the large per-frame time difference is the concern. Also, that benchmark does not compare LuaJIT with Caffe.
Hi @nitish11 ,
Thanks for your interest in our work and thanks for this cool repository: https://github.com/nitish11/GenderRecognition
It's embarrassing to admit, but I have never worked with Torch, so I really can't say which one is faster (or, in this case, why you get faster run-times in Torch).
Best, Gil.
I checked the BLAS computation library that Torch and Caffe each link against.
On Ubuntu 14.04:

```
ldd /home/nitish/caffe/build/lib/libcaffe.so
ldd /home/nitish/torch/install/lib/libTH.so
```
From the output, I observed that Torch is linked against OpenBLAS while Caffe is linked against libcblas, which might be the reason Caffe is slower.
Solution: build Caffe with OpenBLAS.
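For anyone following along, switching Caffe to OpenBLAS is a build-config change. A minimal sketch for Ubuntu, assuming a standard source checkout (package names and paths may differ on your system):

```shell
# Install OpenBLAS development headers (Ubuntu/Debian).
sudo apt-get install libopenblas-dev

# In Caffe's Makefile.config, select OpenBLAS instead of ATLAS:
#   BLAS := open
sed -i 's/^BLAS := atlas/BLAS := open/' Makefile.config

# Rebuild from a clean state so the new BLAS is actually linked.
make clean
make all -j"$(nproc)"

# Verify the result: libcaffe.so should now reference libopenblas.
ldd build/lib/libcaffe.so | grep -i blas
```

After rebuilding, rerunning the `ldd` check above is the quickest way to confirm the linkage actually changed.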
Are you sure it's not simply the difference in looping speed between LuaJIT and Python? It can be quite large. Similar to Julia, LuaJIT is closer to C.
I am not sure about looping speed between LuaJIT and Python.
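The Python side of that gap is easy to measure with a micro-benchmark. A minimal sketch (the LuaJIT half would need an equivalent Lua script; LuaJIT JIT-compiles such loops to machine code, so each iteration avoids the interpreter dispatch cost Python pays):

```python
import timeit

def py_loop(n=1_000_000):
    """Pure-Python loop: every iteration pays interpreter overhead."""
    total = 0
    for i in range(n):
        total += i
    return total

# Time five runs of the loop; compare against the same loop in LuaJIT.
elapsed = timeit.timeit(py_loop, number=5)
print(f"5 runs of a 1M-iteration loop: {elapsed:.3f} s")
```

Note that this measures raw looping only; if most of the per-frame time is spent inside the BLAS calls, the interpreter gap matters less.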
Hi,
I am using the gender-detection model in both Torch and Caffe for detection from a live camera.
Running the code on CPU with the same model files, I am getting different prediction times: ~1.30 seconds per frame for Caffe versus ~0.45 seconds per frame for Torch.
What could be the possible reason for the time difference? Is Torch faster than Caffe?
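One thing worth ruling out before blaming the framework is the measurement itself: first-frame timings often include one-time setup costs. A fair harness warms up and then averages many frames. A sketch, using a hypothetical `predict` stand-in for the Caffe `net.forward(...)` or Torch `model:forward(...)` call:

```python
import time

def time_per_frame(predict, frame, warmup=3, runs=20):
    """Average wall-clock time of predict(frame) after warm-up runs.

    `predict` is a stand-in for the real framework call (hypothetical
    here); the warm-up iterations keep one-time costs such as memory
    allocation and lazy initialization out of the average.
    """
    for _ in range(warmup):
        predict(frame)
    start = time.perf_counter()
    for _ in range(runs):
        predict(frame)
    return (time.perf_counter() - start) / runs

# Example with a dummy predictor standing in for the model:
avg = time_per_frame(lambda f: sum(f), frame=list(range(1000)))
print(f"avg per-frame time: {avg * 1e6:.1f} µs")
```

Measured this way on identical inputs, a persistent ~3x gap would point at the framework or its BLAS backend rather than at measurement noise.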