Open Lewis0427 opened 2 years ago
Hi, you can convert the images into grayscale and use that for your experiments/metrics.
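In case it helps, here is a minimal sketch of what that could look like in Python with OpenCV; the file names and the PSNR helper below are illustrative only, not the evaluation code from this repo.

```python
# Minimal sketch (not the repo's evaluation code): convert 3-channel images to
# grayscale and compute PSNR on the single channel. Paths are placeholders.
import cv2
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio between two same-sized images."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10((max_val ** 2) / mse)

hr = cv2.cvtColor(cv2.imread("FLIR_09861_hr.png"), cv2.COLOR_BGR2GRAY)
sr = cv2.cvtColor(cv2.imread("FLIR_09861_sr.png"), cv2.COLOR_BGR2GRAY)
print("PSNR (grayscale):", psnr(hr, sr))
```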
Thanks for your reply. In your UGSR work, I noticed a difference between the x8 HR (640x512) images you provided and the original images of the dataset. Is this difference caused by the alignment? Can you tell me how the HR images are generated? Thanks a lot!
@Ziyang6 Can you specify the dataset and figure you are referring to?
Sure. For example, in UGSR, there are differences in detail between your x8 HR image (FLIR_09861) and the original FLIR dataset image (FLIR_09861): your HR images are smoother and blurrier. We would like to know whether this difference is due to alignment, and could you tell us how these HR images are made? We would like to follow your procedure for building aligned datasets so that we can make a fair comparison of training results. This is very important to our work and we hope you will support us. Thanks again!
The difference is shown in the attached figure.
Yes, the difference is due to (1) interpolation after performing alignment and (2) smoothing to reduce sensor noise in the thermal image. As mentioned in the paper, we performed a one-time calibration and manually aligned a set of the thermal and visible images. This changes the field of view (a slight zoom effect), leading to the smooth appearance. In addition, we used a blur kernel of σ = 0.5 to reduce sensor noise. The blurring operation is the same as in the blur-downsample model used to simulate the low-resolution images.
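As a rough illustration of that smoothing step (a sketch only, assuming OpenCV's GaussianBlur; the actual preprocessing code is not shown here):

```python
# Sketch of the noise-smoothing step described above, assuming OpenCV.
# ksize=(0, 0) lets OpenCV derive the kernel size from sigma; the exact
# kernel size is not stated at this point in the thread.
import cv2

aligned_hr = cv2.imread("aligned_thermal_hr.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
smoothed_hr = cv2.GaussianBlur(aligned_hr, ksize=(0, 0), sigmaX=0.5)
cv2.imwrite("smoothed_thermal_hr.png", smoothed_hr)
```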
Thanks for your answer! For the simulated low-resolution thermal images, are they generated from the HR images produced as you described above? When generating the x4 HR (320x256) images, do you only use nearest-neighbor downsampling of the x8 HR (640x512) images, without the blurring operation? Thank you for your patience!
And what is the size of the blur kernel you just mentioned (σ = 0.5)?
It would be a Gaussian kernel of size 5x5.
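Putting the two replies together, the blur-downsample simulation of the LR input might look roughly like the sketch below; the interpolation mode for the downsampling step is an assumption (the thread does not confirm nearest vs. bicubic), and the file names are placeholders.

```python
# Rough sketch of the blur-downsample model with the parameters mentioned in
# this thread: a 5x5 Gaussian kernel with sigma = 0.5, followed by x8
# downsampling. The interpolation mode (INTER_NEAREST here) is an assumption,
# not confirmed in the thread.
import cv2

hr = cv2.imread("smoothed_thermal_hr.png", cv2.IMREAD_GRAYSCALE)  # e.g. 640x512
blurred = cv2.GaussianBlur(hr, ksize=(5, 5), sigmaX=0.5)
h, w = blurred.shape[:2]
lr = cv2.resize(blurred, (w // 8, h // 8), interpolation=cv2.INTER_NEAREST)
cv2.imwrite("thermal_lr_x8.png", lr)
```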
We have done the relevant processing according to the parameter settings you gave us, but our results still differ from yours. Can you provide the code for the alignment and blurring operations used to generate the HR and LR images?
Hello, author. We are interested in your UGSR work and noticed that you use a three-channel thermal infrared image dataset (FLIR), but the super-resolution results are single-channel. Can you tell us how the PSNR is calculated?