simanw304 opened 1 year ago:
I tried to run inference for video captioning on 1 A100 but ran into OOM errors. Does inference need to run on 8 A100s, or can it run on one A100? Thanks in advance.
Hello @Roleone123 and @MAGAer13, is it possible to run inference of the mPLUG-2 model for video captioning on multiple GPUs?
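In case it helps while waiting for the maintainers, below is a minimal sketch of two ways to get under the memory limit: running fp16 inference on a single A100, or sharding the model across several GPUs with `accelerate`. The names `build_mplug2_model`, `load_video`, and `model.generate` are hypothetical placeholders for whatever entry points the mPLUG-2 repo actually exposes; the memory caps in the device map are illustrative, not measured.

```python
import torch
from accelerate import dispatch_model, infer_auto_device_map

# Hypothetical stand-ins for the repo's real entry points:
#   build_mplug2_model(...) -> torch.nn.Module with a generate() method
#   load_video(path)        -> preprocessed video tensor
model = build_mplug2_model(checkpoint="mplug2_videocap.pth")  # hypothetical
model.eval()

# Option 1: single A100 -- fp16 weights plus no autograd state
# roughly halves the memory footprint versus fp32 training mode.
model = model.half().cuda()

# Option 2: shard the module across several GPUs with accelerate
# (works on any nn.Module; per-GPU caps below are only examples).
# device_map = infer_auto_device_map(model, max_memory={0: "70GiB", 1: "70GiB"})
# model = dispatch_model(model, device_map=device_map)

video = load_video("example.mp4").half().cuda()  # hypothetical preprocessing

with torch.no_grad(), torch.autocast("cuda", dtype=torch.float16):
    caption = model.generate(video)              # assumed generation API
print(caption)
```

Note that `torch.nn.DataParallel` would not help here, since it replicates the full model on every GPU; only sharding the weights (option 2) or lowering precision (option 1) reduces per-GPU memory.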