jfzhang95 / pytorch-video-recognition

PyTorch implementation of C3D, R3D, and R2Plus1D models for video activity recognition.
MIT License
1.16k stars · 250 forks

out of memory? #27

Closed goforfar closed 5 years ago

goforfar commented 5 years ago

Hi! I tried the code with a TITAN Xp (12 GB), but before training starts it reports "RuntimeError: CUDA out of memory". Has anyone met the same issue? Could anyone give me some ideas? Thanks!

cantonioupao commented 5 years ago

I had the same issue with one of my projects. Try to follow these steps (a sketch of checks 1 and 3 follows below):

  1. Check that you are actually using your GPU and not the CPU.
  2. Verify that the TITAN Xp is carrying out the training by using the "nvidia-smi" command.
  3. If everything looks fine, try reducing the batch size and the number of epochs.
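A minimal sketch of checks 1 and 3, using a stand-in Conv3d layer and a random-tensor dataset in place of this repo's actual C3D model and video loader (those stand-ins are placeholders, not this repo's API):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Check 1: confirm training will run on the GPU, not the CPU.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
print('Using device:', device)  # should print 'cuda' before training starts

# Stand-ins for the repo's C3D model and dataset: 8 random 16-frame clips.
model = nn.Conv3d(3, 64, kernel_size=3).to(device)
dataset = TensorDataset(torch.randn(8, 3, 16, 112, 112),
                        torch.zeros(8, dtype=torch.long))

# Check 3: a smaller batch size is the usual first fix for CUDA out-of-memory.
loader = DataLoader(dataset, batch_size=2, shuffle=True)

for clips, labels in loader:
    clips, labels = clips.to(device), labels.to(device)  # move each batch to the GPU
    outputs = model(clips)
```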
goforfar commented 5 years ago

I have tried your steps, but it still reported 'out of memory' even with a batch size of 5. Have you tried the code yourself? Could you give me some more detailed help?
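When small batches still overflow, one possible next step (a sketch under assumed names, again with a stand-in Conv3d in place of C3D): for 3D conv nets, activation memory scales with batch size × frames × height × width, so shorter or lower-resolution clips help as much as a smaller batch, and any validation pass should run under torch.no_grad():

```python
import torch
import torch.nn as nn

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = nn.Conv3d(3, 64, kernel_size=3).to(device)  # stand-in for the C3D model

# Shorter clips (8 frames instead of 16) shrink activations roughly in half.
clips = torch.randn(2, 3, 8, 112, 112, device=device)

# During validation, disable autograd so intermediate activations are not
# retained for a backward pass; this alone can cut peak memory substantially.
model.eval()
with torch.no_grad():
    outputs = model(clips)

torch.cuda.empty_cache()  # hand cached-but-unused blocks back to the driver
```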

goforfar commented 5 years ago

And I have no idea which variable causes the memory explosion. The picture in my next comment is a screenshot of my attempt with a small dataset and a batch size of 5.
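To locate which step allocates the memory, one option (not from this repo; assumes a CUDA device is available) is to print torch.cuda.memory_allocated() after each stage of a training step:

```python
import torch
import torch.nn as nn

def report(tag):
    # memory_allocated(): bytes held by live tensors; max_...: peak so far
    print(f'{tag}: {torch.cuda.memory_allocated() / 1e6:.1f} MB '
          f'(peak {torch.cuda.max_memory_allocated() / 1e6:.1f} MB)')

device = torch.device('cuda')
model = nn.Conv3d(3, 64, kernel_size=3).to(device)  # stand-in for the C3D model
report('after moving model to GPU')

clips = torch.randn(4, 3, 16, 112, 112, device=device)
report('after loading one batch')

outputs = model(clips)
report('after forward pass')   # activations usually dominate for 3D conv nets

outputs.sum().backward()
report('after backward pass')  # gradients add roughly one model-sized copy
```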

goforfar commented 5 years ago

[Screenshot: IMG_20190506_231625]

goforfar commented 5 years ago

It was something wrong with my libraries. I failed to run it on Windows 10, but it runs on Ubuntu.

Thanks @cantonioupao, you gave me a lot of helpful advice.