Closed wangzhanwei666 closed 2 years ago
It depends on the input size and the length of the sequence. I also have a GPU with 32 GB of memory, and it is enough for the VideoLQ dataset. For other videos, you may need to reduce max_seq_len if you encounter OOM.
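A minimal sketch of why lowering max_seq_len helps: if inference is run on the whole sequence at once, peak memory grows with the number of frames, whereas processing the video in chunks of at most max_seq_len frames caps the peak. The `run_model` callable and helper name below are hypothetical stand-ins, not the actual inference script's API.

```python
def chunked_inference(frames, max_seq_len, run_model):
    """Process a long frame sequence in chunks to bound peak memory.

    frames: list of input frames (any per-frame representation).
    max_seq_len: maximum number of frames passed to the model at once.
    run_model: hypothetical callable mapping a list of frames to outputs.
    """
    outputs = []
    for start in range(0, len(frames), max_seq_len):
        chunk = frames[start:start + max_seq_len]
        # Only `max_seq_len` frames are resident in the model at a time,
        # so peak memory no longer scales with total video length.
        outputs.extend(run_model(chunk))
    return outputs
```

With a smaller max_seq_len, memory use drops at the cost of shorter temporal context per chunk, which is the usual trade-off when tuning it for long or high-resolution videos.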
Closing this issue due to inactivity. Please reopen it if the problem persists.
My PC has 32 GB of memory. When I run inference, the memory is exhausted and the process is killed. How much memory is needed?