lmb-freiburg / flownet2

FlowNet 2.0: Evolution of Optical Flow Estimation with Deep Networks
https://lmb.informatik.uni-freiburg.de/Publications/2017/IMKDB17/

Is it possible to run inference on GPU with 1GB DRAM? #2

Closed: alexlyzhov closed this issue 7 years ago

alexlyzhov commented 7 years ago

I can still run FlowNet2-CS, but FlowNet2-CSS and FlowNet2 fail with "Check failed: error == cudaSuccess (2 vs. 0) out of memory". When I query free memory with cudaMemGetInfo(), I see 950MB free before running run-flownet.py, and the FlowNet2 weights occupy only 650MB.
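For reference, a minimal sketch of querying that free-memory figure from Python rather than from CUDA C; it assumes the pycuda package, whose mem_get_info() wraps the same cudaMemGetInfo() call:

```python
import pycuda.autoinit  # creates a CUDA context on the default device
import pycuda.driver as cuda

# returns (free, total) in bytes for the current device
free_bytes, total_bytes = cuda.mem_get_info()
print("free: %d MB / total: %d MB"
      % (free_bytes // (1024 ** 2), total_bytes // (1024 ** 2)))
```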

Is it still possible to fit the model into memory with some tricks?

nikolausmayer commented 7 years ago

Hi nikkou,

The straightforward way to use less memory is to feed in smaller images (see the sketch below), but that is probably not what you want. It is possible to reduce the raw memory requirements of the larger networks, but it is not easy.
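A hedged sketch of the smaller-images workaround, assuming OpenCV is available: downscale the input pair before inference, then upsample the predicted flow and rescale its vectors. The helper names and the 0.5 factor are illustrative, not part of run-flownet.py; FlowNet2's Caffe nets expect dimensions divisible by 64, which the rounding below enforces.

```python
import cv2

def shrink_for_flownet(img, factor=0.5):
    """Downscale an input image, rounding each side down to a multiple of 64."""
    h, w = img.shape[:2]
    new_w = max(64, int(w * factor) // 64 * 64)
    new_h = max(64, int(h * factor) // 64 * 64)
    return cv2.resize(img, (new_w, new_h), interpolation=cv2.INTER_AREA)

def restore_flow(flow, orig_w, orig_h):
    """Upsample a (H, W, 2) float32 flow field back to the original size."""
    h, w = flow.shape[:2]
    up = cv2.resize(flow, (orig_w, orig_h), interpolation=cv2.INTER_LINEAR)
    # flow vectors are measured in pixels, so they must be rescaled too
    up[..., 0] *= orig_w / float(w)
    up[..., 1] *= orig_h / float(h)
    return up
```

Since activation memory scales roughly with the number of pixels, halving each side cuts the dominant memory cost to about a quarter, at the price of losing fine motion detail.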

Best, Nikolaus