Closed: kulkarnikeerti closed this issue 2 years ago
Also, just FYI: if I run the same model on a server that has one GPU, it runs without any issues. Do you have any idea whether this has something to do with the multiple GPUs on the server?
Okay, so I was able to resolve this issue by updating the PyTorch version.
However, I have one more situation related to the GPU. When the model runs evaluation after `eval_epochs`, it does not use the GPU and falls back to the CPU. Is there any particular reason not to use the GPU for the evaluation part?
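Not knowing this repo's eval loop, a minimal sketch of how evaluation can stay on the GPU: the model is already on the device after training, so the usual cause of CPU-only evaluation is that the validation batches are never moved with `.to(device)`. The `nn.Linear` model and random batch below are placeholders, not this project's actual network or data.

```python
import torch
import torch.nn as nn

# Placeholder model/device; falls back to CPU when no GPU is present.
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model = nn.Linear(8, 2).to(device)

model.eval()                      # switch off dropout/batchnorm updates
with torch.no_grad():             # no autograd bookkeeping during eval
    val_x = torch.randn(4, 8).to(device)  # move each eval batch too
    preds = model(val_x)

# preds lives on the same device as the model.
print(preds.device.type)
```

If the eval loop builds tensors without `.to(device)`, PyTorch runs that forward pass on CPU even though training used the GPU.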
I was able to run the model on the server. The issue was with the PyTorch version.
Hi @yjh0410
It's actually not an issue; I just wanted some information. I am training this model on a server that has two GPUs. I want to use one of them, so I set it to `cuda:0`, but when I train, the model runs on CPU. Do you know anything I could try? Thanks
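Without seeing this repo's training script, a minimal sketch of the usual fix: creating a `cuda:0` device string alone does nothing; the model's parameters and every input batch must be moved to it explicitly, otherwise the forward pass silently runs on CPU. The model and batch below are hypothetical stand-ins.

```python
import torch
import torch.nn as nn

# Request the first GPU; guard with is_available() so the sketch
# also runs on CPU-only machines.
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

model = nn.Linear(8, 2).to(device)  # moves the parameters onto cuda:0
x = torch.randn(4, 8).to(device)    # each batch must be moved as well
out = model(x)

# The parameters and the output now share the chosen device.
print(next(model.parameters()).device.type, out.device.type)
```

On a multi-GPU server, restricting visibility with `CUDA_VISIBLE_DEVICES=0 python train.py` is another common way to pin training to a single GPU without touching the code.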