habemus-papadum closed this issue 6 years ago
Hi Nehal,
Sorry for the late answer, I have been smashed at work lately.
Seems like you are headed into very deep stuff that touches on light fields and spectral rendering.
> If the VR headset could recreate this precise distribution of light energy, then it seems reasonable to assume it could create a nearly perfect visual recreation of the scene for the user of the headset.
Yes, but the "could" is highly hypothetical, because you would need to produce a display able to emit the complete light field arriving at the eye.
> Now, in reality, the VR headset does not have enough degrees of freedom to recreate the energy profile at all wavelengths
Indeed! Most displays use LCD or OLED panels, which are simply not able to reproduce the spectral composition of daylight. They can generate a metamer, but nothing like the real spectrum. Not only that, but AFAIK no consumer-available VR headset is able to model light angular direction faithfully, i.e. the light field. From what I have read here and there, the Vive and Rift are collimated, which makes sense because that is usually what Fresnel lenses are used for, and thus they are not able to faithfully reproduce the light incoming from any object closer than 10 metres or so.
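To illustrate the metamer point with `colour` itself, here is a minimal sketch (assuming a recent version where the illuminant data is exposed as `colour.SDS_ILLUMINANTS`; scaling conventions may differ slightly between releases): the sRGB triple matches the tristimulus values of D65, while the spectrum the display actually emits is three primary peaks, nothing like the smooth daylight curve.

```python
import colour

# Tristimulus values of the CIE D65 daylight illuminant, obtained by
# integrating its spectral distribution against the CIE 1931 observer.
sd_D65 = colour.SDS_ILLUMINANTS["D65"]
XYZ = colour.sd_to_XYZ(sd_D65) / 100  # scale to [0, 1] domain

# The sRGB triple a display would use to reproduce that stimulus: it is
# a metamer of daylight, i.e. same XYZ, completely different spectrum.
RGB = colour.XYZ_to_sRGB(XYZ)
print(XYZ, RGB)
```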
> and may not have enough power to recreate the intensity of a bright sunny day.
And it never will, for safety reasons! :)
> This, it seems, is where colour models come into play
Absolutely, but to be fair, the Colour Appearance Modeling research for Virtual Reality is thin; there is not much to read, e.g. [1], [2], and whatever cites them on Google Scholar. Not only that, but Colour Appearance Modeling for HDR is in the same sorry state.
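For a feel of how absolute levels enter those models, here is a minimal sketch using the CIECAM02 implementation in `colour` (the numbers are the classic textbook example values, nothing VR-specific): the adapting luminance `L_A` is an absolute quantity in cd/m², so the same stimulus is predicted to appear differently under dim VR viewing vs. bright daylight.

```python
import numpy as np
import colour

XYZ = np.array([19.01, 20.00, 21.78])      # stimulus (example values)
XYZ_w = np.array([95.05, 100.00, 108.88])  # D65-ish adapting white
L_A = 318.31  # adapting field luminance in cd/m² (absolute!)
Y_b = 20.0    # relative luminance of the background

# Predicted appearance correlates: lightness J, chroma C, hue h, etc.
print(colour.XYZ_to_CIECAM02(XYZ, XYZ_w, L_A, Y_b))
```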
> What is the absolute intensity of the light that is emitted (as a function of wavelength and direction)?
Impossible to say without accounting for all the colour transformations happening in your display chain and knowing the characteristics of the display itself. You can make educated guesses, but usually, if you want to know what is happening, you put a spectrometer against the display and measure the light it emits.
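As a rough sketch of the conceptual workflow with `colour` (the measurement values below are entirely made up): once you have a spectral measurement from the spectrometer, you can integrate it into tristimulus values, and if the input is absolute spectral radiance, Y comes out as luminance.

```python
import colour

# Hypothetical spectrometer measurement of a white patch: spectral
# radiance samples (W·sr⁻¹·m⁻²·nm⁻¹) keyed by wavelength in nm.
data = {400: 0.012, 450: 0.047, 500: 0.062, 550: 0.071,
        600: 0.068, 650: 0.041, 700: 0.019}
sd = colour.SpectralDistribution(data, name="Display White Patch")

# Integrate against the CIE 1931 2° observer; with k=683 lm/W and
# absolute spectral radiance as input, Y is luminance in cd/m².
XYZ = colour.sd_to_XYZ(sd, k=683, method="Integration")
print(XYZ)
```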
Hope this helps :)
One great feature of `colour` is the wealth of reference data that it provides. I was wondering if it would be possible to provide a model of absolute intensity values (as a function of wavelength and direction) for the pixels found in common consumer devices (e.g. iPhone 8), or even baseline models for a "typical" computer monitor pixel, "typical" phone, etc. My background is mostly in the sciences, and I find colour both beautiful and confusing, so I will clarify that by "intensity" I mean: when the model is integrated over a range of frequencies and a concrete solid angle, the result would have units of Joules (or even photons/sec).
I realize this request is under-specified/unclear/impractical in many respects, but hopefully I can clarify my objectives and eventually make my request more precise.
I am interested in exploring colour models in the context of VR headsets. In a stylized model of a VR headset, the developer has control of every photon that enters the viewer's eye.
Imagine a model of a virtual scene depicting a day at the beach: one could model the number of photons of a given wavelength impinging on a patch of the virtual viewer's retina using only "physical" attributes of the virtual scene (e.g. the water is H2O, the sand is silicon oxide, the sun has a temperature of 6000 K, etc.) as opposed to using concepts of colour (the water is blue, the sand is white, sunlight is yellow, etc.). If the VR headset could recreate this precise distribution of light energy, then it seems reasonable to assume it could create a nearly perfect visual recreation of the scene for the user of the headset.
Now, in reality, the VR headset does not have enough degrees of freedom to recreate the energy profile at all wavelengths, and may not have enough power to recreate the intensity of a bright sunny day. The discrepancy between the energy profile created by the VR headset and the one implied by the virtual scene should be minimized according to some perceptual measure. This, it seems, is where colour models come into play, except that, at first blush, it is not completely clear how to convert a model of "colour perception" into a perceptual model that uses absolute units (Joules); it could be easy or hard, I just don't know enough about the precise structure of colour models and human perception.
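To make "perceptual measure" a little more concrete, I imagine something like a colour difference metric; a minimal sketch with `colour`, where the XYZ values are purely hypothetical placeholders:

```python
import numpy as np
import colour

# Hypothetical tristimulus values: what the virtual scene implies vs.
# what the headset can actually emit.
XYZ_scene = np.array([0.4360, 0.4556, 0.0606])
XYZ_headset = np.array([0.4265, 0.4491, 0.0772])

# One candidate perceptual measure: CIE Delta E 2000 between the two
# stimuli, computed in CIE L*a*b* (D65 white point by default).
Lab_scene = colour.XYZ_to_Lab(XYZ_scene)
Lab_headset = colour.XYZ_to_Lab(XYZ_headset)
print(colour.delta_E(Lab_scene, Lab_headset, method="CIE 2000"))
```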
But before tackling that issue, I would also need a model of pixels. This seems easier to describe. Imagine my device has a brightness setting for the display that ranges from 0 to 1, and I can disable any automatic colour calibration the operating system provides. If I then set the display brightness to 1 and all pixels to (0, 0, 0) save for one which is set to (1, 0, 0), what is the absolute intensity of the light that is emitted (as a function of wavelength and direction)? How does this intensity vary as I scale the pixel value from (1, 0, 0) to (0, 0, 0) (linearly, quadratically, some other power, logarithmically, etc.), and ditto for how the intensity varies with the brightness dial? I imagine this varies greatly from model to model and perhaps even from unit to unit, in which case I would be interested in rough orders of magnitude for typical devices. I would also be interested in how I could make these measurements on my own (even at a conceptual level).
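For instance, if (big if) the panel happened to follow the sRGB standard, my understanding is that the mapping from code value to linear light is a piecewise curve close to a 2.2 to 2.4 power law; a sketch of what I mean, assuming `colour.cctf_decoding` is available in the installed version and an entirely hypothetical peak luminance:

```python
import numpy as np
import colour

# If the panel follows the sRGB EOTF, intensity scales with the code
# value as a small linear toe followed by roughly a 2.4-power segment.
code_values = np.linspace(0, 1, 5)
linear_light = colour.cctf_decoding(code_values, function="sRGB")

# Scale by an assumed peak luminance: 80 cd/m² is the sRGB reference,
# though real phone panels are often several hundred cd/m².
print(linear_light * 80)  # approximate luminance in cd/m²
```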
Hope this is all reasonably clear...
Cheers, nehal