LIAGM / LFattNet

Attention-based View Selection Networks for Light-field Disparity Estimation

Out of memory? What's the problem? #3

Closed: dreamhua82 closed this issue 4 years ago

dreamhua82 commented 4 years ago

Dear Liang,

My test PC is as follows: one NVIDIA GTX 1080 Ti GPU, an Intel i5 CPU, and 8 GB of RAM. But when I run the code, it shows: GPU out of memory.

Regards,
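For reference (an addition, not part of the original exchange): one quick way to confirm how much GPU memory is actually free right before the run, to rule out other processes already holding memory on the card. A minimal Python sketch, assuming `nvidia-smi` is on the PATH; the helper name `free_gpu_memory_mib` is hypothetical:

```python
import subprocess

def free_gpu_memory_mib():
    # Query nvidia-smi for free memory; prints one line per GPU, e.g. "10432".
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.free",
         "--format=csv,noheader,nounits"],
        encoding="utf-8",
    )
    return [int(line) for line in out.strip().splitlines()]

print("Free GPU memory (MiB) per device:", free_gpu_memory_mib())
```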

LIAGM commented 4 years ago

Hi

Could you provide more information about your test PC? If your setup matches the environment I describe, this should not be a problem. Thanks

Best Regards,

LIAGM commented 4 years ago

Hi

In this situation, the main problem may be the OS. I also ran into this problem when I used Windows 10 for training and testing my network with a 1080 Ti. Our model uses almost all of the memory on the 1080 Ti, but Windows keeps some GPU memory reserved and unusable. So my advice is to use Ubuntu as your OS and build the same environment as I described. Sorry about this problem.

Best Regards,
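A possible mitigation to try before switching OS (a sketch, not the author's fix): cap how much GPU memory TensorFlow is allowed to claim, leaving headroom for whatever Windows reserves. This assumes the TensorFlow 1.x / standalone Keras stack described in the repo's environment; the 0.9 fraction is an illustrative value:

```python
import tensorflow as tf
from keras import backend as K

config = tf.ConfigProto()
# Grow allocations on demand instead of grabbing the whole card up front.
config.gpu_options.allow_growth = True
# Or hard-cap the fraction of GPU memory TensorFlow may claim (illustrative
# value), leaving room for the memory Windows keeps reserved.
config.gpu_options.per_process_gpu_memory_fraction = 0.9

K.set_session(tf.Session(config=config))
```

If the model genuinely needs almost all 11 GB of the 1080 Ti, capping the allocation only moves the failure from the allocator into the model itself, which is consistent with the advice above to switch to Ubuntu.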

LIAGM commented 4 years ago

Hi

I did not test my code on Ubuntu 18.04, so I do not know whether the current problem is the OS difference between 16.04 and 18.04 or a lack of memory. I think the first step is to set up the environment exactly as in my settings; then you can check whether the problem comes from the OS difference or from memory. Thanks

Best Regards,
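To help separate the two hypotheses (OS difference vs. lack of memory), a small environment report can be compared between the 16.04 and 18.04 machines. A sketch, assuming a TensorFlow 1.x install; `memory_limit` shows how much of the card TensorFlow can actually claim, which is exactly what an OS-side reservation reduces:

```python
import platform
import tensorflow as tf
from tensorflow.python.client import device_lib

print("OS:", platform.platform())
print("TensorFlow:", tf.__version__)
for dev in device_lib.list_local_devices():
    if dev.device_type == "GPU":
        # memory_limit is reported in bytes; convert to MiB.
        print(dev.name, dev.physical_device_desc,
              "memory_limit=%.0f MiB" % (dev.memory_limit / 2**20))
```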