jnjaby / DISCNet

Code for DISCNet.

Does the PSF remain the same for all images with a certain OLED display? #7

Closed Jian-danai closed 3 years ago

Jian-danai commented 3 years ago

Hi, I have some questions related to the PSF.

For a certain OLED display, does the PSF remain the same for all images?

Will the PSF change if the OLED screen emits light? Or is the PSF only affected by the pixel layout of the OLED display?

And, could you provide the derivation or the source of the light propagation model (I mean the Fresnel propagation)?

Thanks.

jnjaby commented 3 years ago

Hi, the shape of the PSF is mainly determined by the pixel layout of the OLED. However, the assembly of the UDC system will also affect the PSF, e.g., rotation around the optical axis, tilt, and shift. Therefore, a more precise statement is that the PSF remains the same for a certain UDC imaging system (if we don't consider a spatially-variant kernel). The light propagation model can be found in Introduction to Fourier Optics, a book by Joseph W. Goodman.
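As a rough illustration of the Fresnel propagation model referenced here (the transfer-function form follows Goodman's book), a single-wavelength PSF can be simulated by propagating a plane wave through the display's transmission function. This is only a minimal sketch under assumed parameters; the function name, sampling pitch, and distances are hypothetical, not the authors' implementation:

```python
import numpy as np

def fresnel_psf(t, wavelength, z, dx):
    """Sketch: simulate a single-wavelength PSF by Fresnel-propagating a
    unit plane wave through a display aperture.

    t          : 2-D array, display transmission function t(p, q)
    wavelength : wavelength in metres (hypothetical value below)
    z          : propagation distance to the sensor plane, in metres
    dx         : sample pitch of the transmission grid, in metres
    """
    k = 2 * np.pi / wavelength
    # Spatial-frequency grids matching the FFT sample ordering.
    fy = np.fft.fftfreq(t.shape[0], d=dx)
    fx = np.fft.fftfreq(t.shape[1], d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # Fresnel transfer function:
    # H(fx, fy) = exp(i k z) * exp(-i * pi * lambda * z * (fx^2 + fy^2))
    H = np.exp(1j * k * z) * np.exp(
        -1j * np.pi * wavelength * z * (FX**2 + FY**2)
    )
    # Field at the sensor plane, then intensity PSF.
    U = np.fft.ifft2(np.fft.fft2(t) * H)
    psf = np.abs(U) ** 2
    return psf / psf.sum()  # normalise to unit energy

# Example with a toy square opening in an otherwise opaque display.
t = np.zeros((64, 64))
t[28:36, 28:36] = 1.0
psf = fresnel_psf(t, wavelength=550e-9, z=2e-3, dx=2e-6)
```

In practice the real transmission function of the OLED pixel layout, the wavelength sampling, and the system geometry would all need to be measured or calibrated, which is where the simulated/real gap discussed below comes from.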

Jian-danai commented 3 years ago

Thanks so much!! But I was also wondering whether the PSF changes when the OLED screen emits light. (Maybe the light emitted by the OLED affects the light that propagates through the display?) In your UDC system (when you measured the real PSF of the ZTE display), did you consider this factor?

And, do you have a code implementation to simulate the PSF (if the transmission function t(p,q) is available)? For example, how did you produce Figure 1 (comparison of simulated and real-measured PSF) in your supplementary material? Did you release the code? Thanks!!

jnjaby commented 3 years ago

For the first question, an active OLED screen could affect the light modulation. However, in real scenarios, the display can be set to non-active and turned off while the camera is in operation, since blacking out a small region of the screen when taking selfies does not affect the user experience. Figure 1 in the supplement illustrates the gap between the simulated and real-captured PSF of an optical system. I am afraid we cannot release the PSF simulation code due to the policy of our collaborator.

Jian-danai commented 3 years ago

Thanks a lot!!

Jian-danai commented 3 years ago

Hi, I find that with a fixed pixel layout (a fixed display transmission function), a fixed d, z2, and focal length f, the PSF is still at least a function of (1) the wavelength (lambda), (2) the distance between the object and the OLED display (z1), and (3) the spatial position (p, q). These three factors seem hard to control during capture.

So, how did you control these factors and apply the real-measured PSF to the HDRI Haven panoramas? Or is it just an approximation?

jnjaby commented 3 years ago

Basically, it's an approximation of PSF given a particular light source at a specific location.

  1. Wavelength. The equation expresses the PSF for light of a single wavelength, while the real PSF is measured by an RGB sensor and captures the full visible spectrum. So, in terms of wavelength, the single-wavelength and RGB (full-spectrum) PSFs can be connected by a spectral response function (SRF).
  2. Distance between the object and the OLED display. This concerns the depth dependence of the PSF. Commonly, the depth of an object has no significant impact when it is in focus. Hence we use the PSF of a light source at a particular distance (depth).
  3. Spatial variance. We assume a spatially-invariant kernel in this work to simplify the problem. In real images, the spatial variation of the PSF is noticeable and should be taken into account. We leave it for future work.

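The SRF connection described in point 1 amounts to a weighted sum of monochromatic PSFs over sampled wavelengths, one weighting per colour channel. A minimal sketch of that idea, assuming hypothetical array shapes (this is not the authors' implementation):

```python
import numpy as np

def rgb_psf(mono_psfs, srf):
    """Sketch: combine single-wavelength PSFs into an RGB PSF via a
    camera spectral response function (SRF).

    mono_psfs : array of shape (n_wavelengths, H, W),
                monochromatic PSFs at sampled wavelengths
    srf       : array of shape (3, n_wavelengths),
                rows are the R, G, B sensor responses at those wavelengths
    Returns a (3, H, W) RGB PSF, each channel normalised to unit energy.
    """
    # Weighted sum over the wavelength axis for each colour channel.
    psf = np.tensordot(srf, mono_psfs, axes=(1, 0))  # (3, H, W)
    # Re-normalise each channel so it integrates to one.
    return psf / psf.sum(axis=(1, 2), keepdims=True)
```

Here the monochromatic PSFs would come from the single-wavelength propagation model, and the SRF would be measured or taken from the sensor's datasheet; both are assumptions in this sketch.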
Jian-danai commented 3 years ago

Thanks!