isetbio / isetbio

Tools for modeling image systems engineering in the human visual system front end
MIT License

Artifacts in the default (60 c/deg Nyquist) Marimont-Wandell OTF #384

Closed by npcottaris 5 years ago

npcottaris commented 6 years ago

In response to a reviewer comment, I am computing a CSF estimate using the Marimont-Wandell OTF. Inspecting the resulting PSF, depicted in the left plot in the figure below, I noticed a weird diagonal, blocky structure that looked like an undersampling artifact. Digging through the code, I found that the M-W OTF is computed with a Nyquist frequency of only 60 c/deg, which is the default in humanOTF() when the fSupport argument is passed as an empty array, as it is by the calling chain opticsCreate() -> opticsHuman().

I passed a custom fSupport that extends to 120 c/deg and got the PSF depicted in the right plot, which is less blocky and actually a bit narrower than the default PSF. I think we need to modify the code so that the default M-W OTF is computed up to 120 c/deg. Any objections?
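For reference, here is a sketch of how a custom 120 c/deg frequency support can be built and passed to humanOTF() directly. The exact humanOTF() signature and the unitFrequencyList() helper are assumptions based on the isetbio codebase; check the source before copying this.

```matlab
% Sketch (assumed interface): extend the M-W OTF frequency support
% from the 60 c/deg default to 120 c/deg.
maxF     = 120;                    % desired Nyquist frequency, c/deg
nSamples = 60;                     % illustrative sample count

% unitFrequencyList() is assumed to return samples spanning [-1, 1].
fList = unitFrequencyList(nSamples) * maxF;
[fx, fy] = meshgrid(fList, fList);
fSupport = cat(3, fx, fy);         % frequency support, c/deg

wave  = 400:10:700;                % wavelengths, nm
% Empty pupil radius / dioptric power fall back to the defaults.
OTF2D = humanOTF([], [], fSupport, wave);
```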

[Figure: PSF computed with maxF = 60 c/deg (left) vs. maxF = 120 c/deg (right)]

And below are all the PSFs compared. A: Banks, B: Marimont-Wandell, C-G Thibos subjects.

[Figure: PSF comparison]

Contours are always drawn at 5% intervals.

npcottaris commented 6 years ago

I introduced a flag, legacyFrequencySupport, in opticsCreate() -> opticsHuman(). It defaults to true for backwards compatibility, in which case the M-W OTF is computed with frequency support up to 60 c/deg. If set to false, we use a 120 c/deg frequency support and get the more localized PSF shown above.
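The flag's effect amounts to selecting the maximum frequency of the support; a minimal sketch of the logic (the variable name matches the flag above, but the surrounding code is illustrative, not the actual opticsHuman() source):

```matlab
% Illustrative only: how legacyFrequencySupport selects the Nyquist
% frequency used to build the M-W OTF frequency support.
legacyFrequencySupport = false;   % false -> extended support
if legacyFrequencySupport
    maxF = 60;    % c/deg, backwards-compatible default
else
    maxF = 120;   % c/deg, better sampled, more localized PSF
end
```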

wandell commented 6 years ago

Thanks, and sorry for the delay. Your analysis seems very good; I will think it over tonight and tomorrow. The 60 c/deg limit was mostly because, back in the day, the computation (31 wavelengths and all those samples) was slow. (Today was our Annual Symposium, which is kind of a big deal.)

npcottaris commented 6 years ago

Yes, I figured that must have been the reason. By the way, we have a similar issue with wvf-based OTFs: there, the default number of spatial samples is too low, I think 101. In our CSF simulations I am using something like 261 samples, which yields a frequency resolution of 1 c/deg and a significantly better-sampled OTF.
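For anyone hitting the same wvf issue, a sketch of raising the sample count; the exact 'spatial samples' parameter name in wvfSet() is an assumption, so verify against the wvf code:

```matlab
% Sketch: increase the wavefront spatial sampling from the default
% (~101 samples) to the 261 used in the CSF simulations.
wvf = wvfCreate;
wvf = wvfSet(wvf, 'spatial samples', 261);  % parameter name assumed
wvf = wvfComputePSF(wvf);
```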

DavidBrainard commented 6 years ago

There is a deeper question here, which is what the right principle for defaults is. Should we aim for something that is first-order right and runs fast, or for something we would use in a careful calculation? I am fine with upping the resolution as long as it doesn't bring any of our tutorials and validations to a grinding halt. We might write down our stance on this and make it clear somewhere on the wiki, or in a tutorial that shows how to adjust the settings.

npcottaris commented 6 years ago

Compared to the time it takes to generate a single CSF curve, this is a drop in the ocean. And for the paper, even if this might end up as a reviewer-only figure, I think we need the best accuracy, since the effects of different PSFs are largest at the highest spatial frequencies, where the signal is weakest. Even for the three tutorials affected by this change, execution time is not brought to a halt.

In general, as you may have noticed, I always opt for quality over speed, especially since we are trying to get the details right, as opposed to other, more approximate approaches.

But to address your point: this is an issue we will run into continually, so we should probably address it via a preferences option (accessible by all functions, no matter how deeply nested), something like 'computeApproach' with two valid values, 'optimizeForSpeed' and 'optimizeForAccuracy'.
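MATLAB's built-in setpref/getpref would give exactly that any-nesting-depth access. A sketch of the proposal; the preference group and key names are hypothetical, and the sample counts are just the values discussed above:

```matlab
% Set once, e.g. at the top of a script (names are hypothetical):
setpref('isetbio', 'computeApproach', 'optimizeForAccuracy');

% Deep inside any computation, regardless of nesting; the third
% argument to getpref supplies a default if the preference is unset.
approach = getpref('isetbio', 'computeApproach', 'optimizeForSpeed');
switch approach
    case 'optimizeForSpeed'
        nSamples = 101;   % coarse, fast default
    case 'optimizeForAccuracy'
        nSamples = 261;   % finer sampling for careful calculations
end
```

One design note: because preferences persist across MATLAB sessions, validation scripts would probably want to set the preference explicitly rather than rely on whatever a previous session left behind.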

DavidBrainard commented 6 years ago

Closing this issue because I think we have done something.