I have a video I want to run inference on: it's a 50-second video at 60 fps, so 3,000 frames in total.
When I run inference on my RTX 3090, I can't get past 100 frames before hitting a CUDA OOM error.
Is there anything I can do to run inference over the full video on an RTX 3090, or am I limited to processing it in 100-frame batches?
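In case it's useful, here is a simplified sketch of the chunked approach I'm describing, assuming a PyTorch model and the frames already decoded into a tensor (`infer_in_chunks`, `model`, and `frames` are placeholder names, not my actual code):

```python
import torch

def infer_in_chunks(model, frames, chunk_size=100, device="cpu"):
    """Run inference chunk by chunk so only `chunk_size` frames
    are ever resident on the GPU at once."""
    model = model.to(device).eval()
    outputs = []
    with torch.no_grad():  # no autograd graph -> much lower memory use
        for start in range(0, len(frames), chunk_size):
            chunk = frames[start:start + chunk_size].to(device)
            outputs.append(model(chunk).cpu())  # move results off the GPU
            del chunk
            if device == "cuda":
                torch.cuda.empty_cache()  # release cached blocks between chunks
    return torch.cat(outputs)

# Toy example on CPU; the same pattern applies on CUDA:
model = torch.nn.Conv2d(3, 8, 3, padding=1)
frames = torch.randn(300, 3, 64, 64)  # stand-in for decoded video frames
out = infer_in_chunks(model, frames, chunk_size=100, device="cpu")
print(out.shape)  # torch.Size([300, 8, 64, 64])
```

Note this only works if the model treats frames independently; if it needs temporal context across the whole clip, chunking like this changes the result.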
Thanks