Closed: xiaolong-217 closed this issue 1 year ago
Thank you for pointing out the issue. We have updated the code to use a dedicated data-loading function for testing, which iterates the same number of times as there are images to be enhanced and so avoids the redundant iterations.
I will temporarily close this issue; you are welcome to reopen it if you have any further questions!
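A rough sketch of what such a test-time loader can look like (the class name and fields here are illustrative assumptions, not the exact code added to the repository):

```python
from torch.utils.data import Dataset

class SingleImageTestDataset(Dataset):
    """Illustrative test-time dataset whose length equals the number of
    images to be enhanced, so no image is processed more than once."""

    def __init__(self, image_paths):
        self.image_paths = image_paths  # paths of the images to enhance

    def __getitem__(self, index):
        # Each index maps to exactly one input image.
        return {'A_path': self.image_paths[index]}

    def __len__(self):
        # One iteration per image to be enhanced; no dependence on the
        # size of a second (reference) dataset.
        return len(self.image_paths)
```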
I found a problem while testing: if the number of images in dataset B is larger than the number of images in dataset A, the model runs as many times as there are images in dataset B. I checked the code and traced this to line 85 of unaligned_dataset.py, which returns the dataset length as the maximum of the sizes of dataset A and dataset B. Shouldn't the size of dataset A be returned during testing, to avoid repeating the computation when dataset B contains more images than dataset A? Lastly, thanks for sharing the code!
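For reference, here is a minimal sketch of the behavior described above together with the suggested change; `A_size`, `B_size`, and the `opt.phase` check follow common CycleGAN-style `UnalignedDataset` conventions but are assumptions about this repository's exact code:

```python
from torch.utils.data import Dataset

class UnalignedDataset(Dataset):
    """Sketch of the relevant parts of unaligned_dataset.py."""

    def __init__(self, opt, A_paths, B_paths):
        self.opt = opt                 # assumed to carry opt.phase
        self.A_paths = A_paths         # images to be enhanced
        self.B_paths = B_paths         # reference images
        self.A_size = len(A_paths)
        self.B_size = len(B_paths)

    def __getitem__(self, index):
        # Wrapping indices lets the two sets have different sizes.
        A_path = self.A_paths[index % self.A_size]
        B_path = self.B_paths[index % self.B_size]
        return {'A_path': A_path, 'B_path': B_path}

    def __len__(self):
        # Current behavior (around line 85): the larger set wins, so a
        # test run iterates |B| times when |B| > |A|.
        # Suggested change: during testing, iterate only over dataset A.
        if self.opt.phase == 'test':   # assumed flag; illustrative only
            return self.A_size
        return max(self.A_size, self.B_size)
```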