Zheng222 / IMDN

Lightweight Image Super-Resolution with Information Multi-distillation Network (ACM MM 2019, Winner Award of ICCVW AIM 2019 Constrained SR Track 1 & Track 2)

The running time of model #4

Closed minghongli233 closed 4 years ago

minghongli233 commented 4 years ago

In your paper, Figure 9 shows that IDN is slower than CARN and IMDN on Set5 for x4 SR. However, I used the official code of IDN, CARN, and IMDN to evaluate the average inference time on the Set5 x4 dataset. The average running times of these methods are 0.007 s, 0.028 s, and 0.029 s, respectively. I find that IDN is faster than CARN and IMDN, and that the running time of IMDN is close to CARN's. I am confused about this result. Could you tell me the reasons? My operating environment is as follows:

GPU: GTX 1080Ti
OS: Ubuntu 18.04 LTS
CUDA: 10.0
CUDNN: 7.4
Python: 3.6
PyTorch: 1.0

Thank you!

Zheng222 commented 4 years ago

@minghongli233 Hello. In Figure 9, we evaluated the testing time using each method's official code, as stated in the caption of Figure 9. Note that IDN uses the Caffe package. Did you use a warm-up when measuring inference time? If you use time.time() in Python to measure it, this step is very important.

Zheng222 commented 4 years ago

@minghongli233 The inference times of EDSR-baseline, CARN, and IMDN were measured with time.time() after a warm-up. I recommend testing them with test_IMDN.py, which uses a more accurate timing method. Please see the pressure test in README.md.
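For reference, the idea behind such a timing harness can be sketched with the standard library alone. This is a minimal stand-in, not the actual test_IMDN.py: `model_fn` is a placeholder callable, and for a real CUDA model you would additionally call torch.cuda.synchronize() before each clock read, since GPU kernel launches are asynchronous.

```python
import statistics
import time

def time_inference(model_fn, x, warmup=5, runs=50):
    """Time model_fn(x) over `runs` calls, discarding `warmup` initial calls.

    The warm-up runs absorb one-time costs (context creation, cudnn
    autotuning, memory allocation) so they do not pollute the statistics.
    """
    for _ in range(warmup):
        model_fn(x)
    times = []
    for _ in range(runs):
        t0 = time.perf_counter()
        model_fn(x)
        times.append(time.perf_counter() - t0)
    return statistics.mean(times), statistics.stdev(times)

# Stand-in "model": any callable works for illustration.
mean_t, std_t = time_inference(lambda n: sum(i * i for i in range(n)), 10_000)
print(f"mean {mean_t * 1e3:.3f} ms +/- {std_t * 1e3:.3f} ms")
```

Reporting the standard deviation alongside the mean also makes it easy to spot unstable measurements.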

minghongli233 commented 4 years ago

@Zheng222 Thank you for your reply. I evaluated the running time of IDN using the Caffe package. If the warm-up means that the first test takes extra time, then I have done what you suggested, and the results are close to the previous ones. So I suspect that the operating environment may affect the speed. Could you tell me the environment you used for your tests?

Zheng222 commented 4 years ago

@minghongli233

OS: Ubuntu 16.04
Pytorch: 1.0
GPU: GTX 1080Ti
CUDA: 9.0
CUDNN: 7.4

You can print the testing time of each image, and you will find that the first image takes much longer than the others. The warm-up simply adds an extra image to the test dataset and calculates the mean running time excluding the first value.
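The discard-the-first-measurement procedure described above can be sketched as follows. This is an illustrative stdlib-only sketch, with a dummy workload standing in for the model; on a real GPU model the first call is typically dominated by one-time initialization costs.

```python
import time

def per_image_times(model_fn, images):
    """Return the wall-clock time of model_fn on each image."""
    times = []
    for img in images:
        t0 = time.perf_counter()
        model_fn(img)
        times.append(time.perf_counter() - t0)
    return times

# Dummy workload in place of SR inference; in the real test the first
# image absorbs the start-up cost, so it is excluded from the mean.
images = [200_000] * 6
times = per_image_times(lambda n: sum(range(n)), images)
mean_excl_first = sum(times[1:]) / len(times[1:])
for i, t in enumerate(times):
    print(f"image {i}: {t * 1e3:.3f} ms")
print(f"mean (excluding first image): {mean_excl_first * 1e3:.3f} ms")
```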

minghongli233 commented 4 years ago

@Zheng222 I tried another operating environment, as follows:

OS: Ubuntu 18.04
Pytorch: 0.4
GPU: GTX 1080Ti
CUDA: 8.0
CUDNN: 5.1

The average running time of CARN is now 0.006 s, which is close to the number reported in your paper! The operating environment does affect the performance. Thank you for your answer!