Closed · Pololinger closed this issue 3 years ago
This neural-style does not use PyTorch at all; it uses the original Torch, which is written in Lua, not Python. PyTorch did not exist back in 2015.
Anyway, you could check with nvidia-smi which other processes are using CUDA and how much memory they use.
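For reference, nvidia-smi can list per-process memory usage directly, e.g. with `nvidia-smi --query-compute-apps=pid,process_name,used_memory --format=csv,noheader`. A minimal sketch of parsing that CSV output in Python follows; the sample lines are illustrative, not captured from a real machine.

```python
# Parse rows from:
#   nvidia-smi --query-compute-apps=pid,process_name,used_memory --format=csv,noheader
# Each row looks like "1337, python, 1042 MiB".
def parse_compute_apps(csv_text: str) -> list:
    """Turn nvidia-smi CSV rows into dicts with used memory in MiB."""
    rows = []
    for line in csv_text.strip().splitlines():
        pid, name, mem = [field.strip() for field in line.split(",")]
        rows.append({
            "pid": int(pid),
            "name": name,
            "used_mib": int(mem.split()[0]),  # strip the trailing "MiB" unit
        })
    return rows

# Illustrative sample output (hypothetical PIDs and sizes)
sample = "1337, python, 1042 MiB\n2001, Xorg, 256 MiB"
for proc in parse_compute_apps(sample):
    print(proc["pid"], proc["name"], proc["used_mib"])
```

This makes it easy to see whether something other than your training process is holding GPU memory.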
Hello there,
I'm new to PyTorch and noticed that I keep running out of memory. My GPU has 8 GB total capacity, but PyTorch is only reserving about 6 GB. According to
torch.cuda.is_available()
8 GB should also be available. But the problem persists even though I run torch.cuda.empty_cache()
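As a side note, torch.cuda.is_available() only returns a bool; to see the device's total capacity and how much of it PyTorch actually holds, you would query the memory counters instead. A hedged sketch (guarded so it is a no-op where PyTorch or a GPU is absent):

```python
def format_mib(n_bytes: int) -> str:
    """Render a byte count as whole MiB, e.g. 2**20 -> '1 MiB'."""
    return f"{n_bytes / 2**20:.0f} MiB"

try:  # PyTorch and a CUDA device may not be present on every machine
    import torch
    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)
        print("total    :", format_mib(props.total_memory))
        # memory PyTorch tensors currently occupy
        print("allocated:", format_mib(torch.cuda.memory_allocated(0)))
        # memory PyTorch's caching allocator has claimed from the driver
        print("reserved :", format_mib(torch.cuda.memory_reserved(0)))
except ImportError:
    pass
```

The gap between reserved and total is what other processes (or the display driver) are using, which nvidia-smi can confirm.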
Could this be solved by setting
.cuda(non_blocking=True)
? Or by changing the number of workers for DataLoader
? I was trying to make those changes but couldn't really figure out where. Help is very much appreciated.
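For what it's worth, a minimal sketch of where those settings go: num_workers and pin_memory are DataLoader constructor arguments, and non_blocking=True only gives an asynchronous copy when the source batch is in pinned host memory. The batch size and worker count below are illustrative, and the torch part is guarded so the sketch degrades gracefully without PyTorch installed.

```python
def loader_kwargs(use_cuda: bool, num_workers: int = 2) -> dict:
    """Build illustrative DataLoader keyword arguments.

    pin_memory=True places batches in page-locked host memory, which is
    what makes .cuda(non_blocking=True) an actual asynchronous copy.
    """
    return dict(batch_size=32, shuffle=True,
                num_workers=num_workers, pin_memory=use_cuda)

try:  # PyTorch may not be installed where this sketch runs
    import torch
    from torch.utils.data import DataLoader, TensorDataset

    use_cuda = torch.cuda.is_available()
    ds = TensorDataset(torch.zeros(8, 3), torch.zeros(8))
    loader = DataLoader(ds, **loader_kwargs(use_cuda))
    for x, y in loader:
        if use_cuda:
            # non_blocking only helps because the batch is pinned
            x = x.cuda(non_blocking=True)
            y = y.cuda(non_blocking=True)
        break
except ImportError:
    pass
```

Note that neither setting increases how much GPU memory is available; they affect host-to-device transfer speed and data-loading parallelism, not capacity.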