Closed Jambalaya11 closed 5 years ago
12 GB of memory should be enough to run the test on the Vid4 or SPMCS dataset for the F7 model. However, if it is not, you can set --chop_forward True to run with lower GPU memory consumption.
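For context, chop-forward options like this typically work by splitting each frame into overlapping spatial tiles, running the model on one tile at a time, and stitching the outputs back together, so peak GPU memory scales with the tile size rather than the full frame. Below is a minimal NumPy sketch of that idea; the function name, tile size, and overlap are illustrative assumptions, not the repository's actual implementation:

```python
import numpy as np

def chop_forward(x, model, tile=128, overlap=8):
    """Run `model` on overlapping tiles of x (H, W, C) and stitch the
    results, trading a little extra compute for lower peak memory.
    NOTE: illustrative sketch only, not the repo's chop_forward."""
    h, w, c = x.shape
    out = np.zeros_like(x)
    for top in range(0, h, tile):
        for left in range(0, w, tile):
            # Expand the tile by `overlap` pixels to reduce seam artifacts,
            # clamping at the image borders.
            t0, l0 = max(top - overlap, 0), max(left - overlap, 0)
            t1 = min(top + tile + overlap, h)
            l1 = min(left + tile + overlap, w)
            patch = model(x[t0:t1, l0:l1])
            # Copy back only the central (non-overlap) region of the patch.
            rows = min(tile, h - top)
            cols = min(tile, w - left)
            out[top:top + rows, left:left + cols] = \
                patch[top - t0:top - t0 + rows, left - l0:left - l0 + cols]
    return out

# Usage: with an identity "model", the stitched output matches the input.
img = np.random.rand(300, 200, 3).astype(np.float32)
result = chop_forward(img, model=lambda p: p)
```

For a super-resolution model the tile coordinates would also need to be scaled by the upscaling factor on the output side, which is omitted here for brevity.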
Thanks a lot
My GPU has 8-12 GB of memory. When training, the patch_size is 64*64 and GPU memory is enough. However, the test datasets Vid4 and SPMCS have larger frames than Vimeo-90K, so when I test the model, even with testBatchsize=1, I still get CUDA out of memory. If I want to reproduce the scores in the paper, how should I test on Vid4 or SPMCS? Thanks~