Closed sajjadafridi closed 4 years ago
I'm afraid I can't offer any ideas, since I never tried running it in a Windows environment.
Did you check the value of torch.cuda.is_available() here?
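The check being suggested can be sketched as below. This is a minimal illustration, not the project's actual code: `pick_device` is a hypothetical helper, and the key point is that even when CUDA is available, tensors and models only use the GPU if they are moved there explicitly.

```python
import torch

def pick_device():
    """Return a CUDA device when one is visible to this process, else CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
# Tensors and models are NOT placed on the GPU automatically;
# they must be moved explicitly with .to(device) or .cuda():
x = torch.randn(2, 3).to(device)
```

If `torch.cuda.is_available()` returns False on Windows despite a working GPU, the usual cause is a CPU-only PyTorch build or a driver/CUDA-toolkit mismatch.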
Yes, I debugged that line of code. It returns true, but the GPU still isn't being utilized. Is there any other way to handle this?
What GPU are you using? And how long did it take to process this sample video?
I'm using an Nvidia Tesla V100 to process it, which takes 101.25 seconds. If you weren't utilizing any GPU, it would be impossible to finish in less than 20 minutes.
Well, I am using an Nvidia GeForce 1080 Ti 11 GB. On Ubuntu it works like a charm, but on Windows it doesn't show any GPU utilization. I am running it on a live webcam feed.
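One thing worth ruling out before concluding the GPU is unused: confirm where the model's parameters actually live at runtime. The snippet below is a sketch with a hypothetical stand-in model, not the project's network; the same one-liner works on any `nn.Module`.

```python
import torch
import torch.nn as nn

# Tiny stand-in model (hypothetical -- substitute the project's own network)
model = nn.Linear(4, 2)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

# The device the parameters actually live on. If this prints "cpu" while
# torch.cuda.is_available() is True, some code path never called .to(device).
param_device = next(model.parameters()).device
print(param_device)
```

Also note that on Windows, Task Manager's default GPU graphs show engines like "3D" and "Copy", so CUDA compute activity can be invisible there unless you switch a graph to the "Cuda" engine; `nvidia-smi` in a terminal gives a more direct utilization reading.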
╮(╯_╰)╭ No idea for that.
Hi,
I tried running the code on Windows. Everything was fine, but the main issue I am facing is that the code doesn't utilize the GPU. I have tried every possible solution without success. Any ideas?