Open LeeBY68 opened 4 hours ago
I cannot run inference on a video on either 4×4090 GPUs (24 GB) or 4×L40S GPUs (48 GB). Could the authors provide the GPU requirements for inference as well as training?
Hi, we use an A100 80G for inference. Some techniques (#9) can be used to reduce memory usage.
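As a generic illustration (not this repo's actual API, and not necessarily what #9 describes), one common way to cut peak inference memory is to process the video in small chunks of frames instead of all at once. The function and names below are hypothetical:

```python
# Hypothetical sketch: lower peak memory by running inference on a video
# in small chunks of frames rather than the whole clip at once.
# `run_chunked` and `process_chunk` are illustrative names, not the
# repo's actual API.

def run_chunked(frames, process_chunk, chunk_size=8):
    """Process `frames` in chunks of `chunk_size` and concatenate results.

    Peak memory scales with one chunk rather than the full video.
    """
    outputs = []
    for start in range(0, len(frames), chunk_size):
        chunk = frames[start:start + chunk_size]
        outputs.extend(process_chunk(chunk))
    return outputs

# Example with a dummy per-chunk operation standing in for the model:
frames = list(range(32))
result = run_chunked(frames, lambda c: [f * 2 for f in c], chunk_size=8)
```

In a real PyTorch pipeline you would also typically wrap the per-chunk call in `torch.no_grad()` and use half precision to reduce memory further.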