DJSlane opened this issue 2 years ago (status: Open)
I don't know what this is supposed to achieve. The focal point is already correctly calculated and used in VRPerfkit. Why would you need a manual adjustment?
I happen to see this comment so let me reply too.
I am using the same calculation as you do (like I literally looked at your code and credited it in my project) - and they work great so far.
What I did was add an offset (https://github.com/mbucchia/OpenXR-Toolkit/commit/83daab805acaa691d9ef5b9ae99427529c1f22a3), which I really added just in case there's an HMD for which the calculations won't work for some reason. So far, no known case of that, and what you did is super solid 👍 (hats off).
I think what the OP is missing is that without knowing which eye is being rendered (which was the case he claimed for DCS), the offset is not useful, because the vertical offset needs to be different for each eye. By the way, figuring out which eye is which is a pretty fragile process so far for deferred rendering, where I have to track all the CopyResource()/CopySubresourceRegion() calls and assume a pattern of "begin frame -> render left eye -> CopyResource to left swapchain -> render right eye -> [ CopyResource to right swapchain ] -> end frame". This way I sort of keep track of which eye is being rendered.
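The heuristic described above could be sketched roughly like this. This is purely illustrative, the class and method names are made up and this is not the actual OpenXR-Toolkit code, just the "count copies since frame start" idea:

```python
# Hypothetical sketch of the eye-inference heuristic: assume the pattern
# "begin frame -> render left -> copy -> render right -> copy -> end frame"
# and infer the current eye from how many copy calls we've seen this frame.
class EyeTracker:
    def __init__(self):
        self.copies_this_frame = 0

    def begin_frame(self):
        # Reset at the start of every frame.
        self.copies_this_frame = 0

    def on_copy_resource(self):
        # Called for every CopyResource()/CopySubresourceRegion() observed.
        self.copies_this_frame += 1

    def current_eye(self):
        # Before the first copy we assume the left eye is being rendered,
        # between the first and second copy the right eye. Anything after
        # that (e.g. 2D menus) is unknown -- which is exactly where this
        # kind of heuristic starts to break down.
        if self.copies_this_frame == 0:
            return "left"
        elif self.copies_this_frame == 1:
            return "right"
        return "unknown"

tracker = EyeTracker()
tracker.begin_frame()
print(tracker.current_eye())   # left
tracker.on_copy_resource()
print(tracker.current_eye())   # right
tracker.on_copy_resource()
print(tracker.current_eye())   # unknown
```

Any game that deviates from the assumed copy pattern (extra blits, overlays, menus) silently breaks the inference, which is why this only realistically holds up for one well-studied title.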
It works OK with FS2020 (except for some stuff like the 2D menus). But I expect that as more OpenXR games are released, this logic will suffer. Unlike vrperfkit, I am only realistically supporting one title so far... so my job is easier than yours ;)
Cheers!
Well, this is tough to elucidate without showing the view in a canted display such as the Pimax. I want to respect your efforts for the mainstream, which Pimax is not. It goes back to our earlier discussion about Pimax in the other "issue" related to DCS rendering. Not because your software's calculations are wrong, I don't think, but because Pimax hacks around it with "parallel projection" mode, which I believe is a transform applied after the fact.
For clarity, here is my understanding, which may be incomplete: many games such as DCS support non-parallel-projection mode, and the rendering is more efficient. Others, like Elite Dangerous and FS2020, stubbornly don't allow it, forcing you to enable "parallel projection" mode before starting the game, which costs about a 30%-50% drop in performance and/or clarity. (To show the same clarity, a game has to be rendered at, say, 3900 lines horizontally to offer the same as 3160 lines. The reasons, as far as I can determine, are related to the pincushion-distortion/barrel correction.)
My perception is that the focal point drifts to the outside of each eye: the left eye's to the left, the right eye's to the right. Visually the regions don't overlap exactly, which is fine at larger radii, but not at smaller radii. For example, at a radius of 0.7 it looks like the shape of the Mastercard symbol on your credit card. I may have to get through-the-lens footage, but effectively it means the centers of the overall circles are not lining up fully.
This next part is purely speculative and I may be wrong: perhaps the focal points ARE perfectly aligned, but the result of the transform is not circular, producing somewhat pear-shaped regions instead of circles, in which case they can never overlap fully, but would benefit from a shift.
Again, I think this is specific to Pimax and canted displays, and not due to the software calculations per se. In games that allow parallel projection to be turned off, such as DCS, the center of the frame worked for the regions: they are indeed circular in the headset, and the center of the frame is indeed close to the focal center.
If I haven't already shared it: https://risa2000.github.io/hmdgdb/ shows the differences between "PP" mode and non-PP mode and may help with calculating where the focal overlap/center is, but again I strongly doubt this has to do with the code.
That's why I thought about a manual horizontal offset. I gave up previously because, as you explained, many games don't reveal which-eye information, in which case we wouldn't be able to determine which way to move the offset anyway. When I discussed this with @mbucchia in regards to FS2020 and OpenXR Toolkit, he very graciously entertained my idea and included manual offsets in an experimental portion of the software, but alas I've yet to try it as it has not yet been released.
He credited your work fully by the way. Please feel free to put holes in my theories. I don't pretend to know/understand this from anything but a layman/consumer perspective and am using language that is very new to me.
The only reason I opened this as a new issue is that the topic was continuing to be discussed under a different one, and I felt a separate issue would help anyone else seeing the same thing.
Perhaps my perception is related to something more complex such as IPD. Optics and rendering in VR are plagued with so many transforms and layers that by the time I see anything it's a guess as to what's not aligned. Seeing is believing however, hahah.
whoa. that was serendipity.
Parallel projection should have no effect on the correctness of the focal point calculations, unless Pimax is providing incorrect projection and view matrices to the runtime. The difference boils down to the view matrices pointing straight ahead with parallel projection and a wider FOV, whereas with canted displays they are pointing slightly outwards and have a reduced fov. The calculations take both into account, and it is working fine on my Index in either mode.
You are right, though, that the circles I draw in the textures will not end up being perfectly circular the higher the FOV is, and there is little I can do about it. Correcting the shape of the circles would be rather complicated (unless there's a simple math trick I'm missing), so that's not likely to happen.
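The non-circular-at-high-FOV effect can be demonstrated numerically. This is an illustrative sketch (not vrperfkit code; the 30° off-axis focal direction and 15° angular radius are made-up example values): project a cone of directions with a fixed angular radius around an off-center focal axis onto the flat image plane, and the outline comes out wider than it is tall.

```python
import math

# Illustrative: a "circle" of fixed 15 deg angular radius around a focal
# axis that sits 30 deg off the optical axis, projected onto the plane z=1.
# The projected outline is not a circle -- it stretches toward the edge.
theta = math.radians(30.0)   # assumed off-axis angle of the focal point
phi = math.radians(15.0)     # assumed angular radius of the "circle"

# Orthonormal basis around the focal axis f.
f  = (math.sin(theta), 0.0, math.cos(theta))
u1 = (math.cos(theta), 0.0, -math.sin(theta))
u2 = (0.0, 1.0, 0.0)

xs, ys = [], []
for i in range(360):
    a = math.radians(i)
    # Direction on the cone: cos(phi)*f + sin(phi)*(cos(a)*u1 + sin(a)*u2)
    v = tuple(math.cos(phi) * f[k]
              + math.sin(phi) * (math.cos(a) * u1[k] + math.sin(a) * u2[k])
              for k in range(3))
    xs.append(v[0] / v[2])   # perspective projection onto z = 1
    ys.append(v[1] / v[2])

width = max(xs) - min(xs)
height = max(ys) - min(ys)
print(f"projected width {width:.3f} vs height {height:.3f}")
# width comes out noticeably larger than height, so a circle drawn as an
# actual circle in the texture cannot match the same angular region in
# both eyes -- hence the "Mastercard" overlap appearance.
```

The farther the focal axis sits from the optical axis (i.e. the higher the FOV and cant), the more pronounced the stretching, which is consistent with it being barely visible on an Index but obvious on a Pimax.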
No problem. I'm enjoying it very much as is. I enjoy the technical side too, however poorly prepared I am to understand it. I work in the medical field by trade, so I'm way out of my league and took only basic programming in the days of Fortran. If I were rich I'd sincerely buy you each a Pimax 8KX, as it is an experience to behold, and I can't help but delve into the technicals.
If by some miracle the offsets work for OpenXR+FS2020, would you entertain them at all?
They do work... and a horizontally stretched oval works too!
Oddly, the new 0.3 version doesn't appear to affect DCS. I've tried LRLRLR, RLRLRL, SSSSSS, RLSRLS and something random; it always gives the "Mastercard overlap": the right eye is slightly to the right, the left eye slightly to the left. Keep in mind this is affecting the sharpening radius and the FFR in the same areas at the same radius.
Perhaps Pimax is submitting the wrong values? They worked correctly in OpenXR, so maybe it's their SteamVR data. They goofed up the Oculus implementation, so it wouldn't surprise me if there's more to the Pimax side. Shame on Pimax? @mbucchia was right. He was kind enough to include the offset, and I didn't end up needing it in OpenXR; the circles were perfectly aligned. This was, however, in a parallel-projection game (they're working on a non-PP fix!). In any case, I feel that at high radius values the lack of overlap is less noticeable, and the greatest impact vs. cost is at high radii. If eye tracking or dynamic foveation were desired, then you would need more precision in the eyes for Pimax, but since Pimax doesn't really have ET....
Could it be that there's a rounding error? 0.17 radians as reported in my log = 9.74028 degrees, but the cant is -10.0° ? Not sure it'd be significant enough for the MasterCard effect....
It still appears as if it is placing the target to the right of where it should be in the right eye, and to the left of where it should be in the left eye. As if the center of the frame is defaulted somehow. Log reads:

14:57:53 [612] Raw projection for eye 0: l -1.7, r 1.3, t -1.3, b 1.3
14:57:53 [612] Display is canted by 0.17 RAD
14:57:53 [612] Projection center for eye 0: 0.62, 0.5
14:57:53 [612] Raw projection for eye 1: l -1.3, r 1.7, t -1.3, b 1.3
14:57:53 [612] Display is canted by -0.17 RAD
14:57:53 [612] Projection center for eye 1: 0.38, 0.5
Here is what OpenVR is reporting:
Left eye HAM mesh:
    original vertices: 120, triangles: 40
    optimized vertices: 48, n-gons: 4
    mesh area: 8.73 %

Left eye to head transformation matrix:
    [[ 0.984808,  0.      ,  0.173648, -0.030015],
     [ 0.      ,  1.      ,  0.      ,  0.      ],
     [-0.173648,  0.      ,  0.984808,  0.      ]]

Left eye raw LRBT values:
    left: -1.742203, right: 1.255990, bottom: -1.269841, top: 1.269841

Left eye raw FOV:
    left: -60.14 deg, right: 51.47 deg, bottom: -51.78 deg, top: 51.78 deg
    horiz.: 111.62 deg, vert.: 103.56 deg

Left eye head FOV:
    left: -70.14 deg, right: 41.47 deg, bottom: -51.35 deg, top: 51.35 deg
    horiz.: 111.62 deg, vert.: 102.70 deg

Right eye HAM mesh:
    original vertices: 120, triangles: 40
    optimized vertices: 48, n-gons: 4
    mesh area: 8.73 %

Right eye to head transformation matrix:
    [[ 0.984808, -0.      , -0.173648,  0.030015],
     [ 0.      ,  1.      , -0.      ,  0.      ],
     [ 0.173648,  0.      ,  0.984808,  0.      ]]

Right eye raw LRBT values:
    left: -1.255990, right: 1.742203, bottom: -1.269841, top: 1.269841

Right eye raw FOV:
    left: -51.47 deg, right: 60.14 deg, bottom: -51.78 deg, top: 51.78 deg
    horiz.: 111.62 deg, vert.: 103.56 deg

Right eye head FOV:
    left: -41.47 deg, right: 70.14 deg, bottom: -51.35 deg, top: 51.35 deg
    horiz.: 111.62 deg, vert.: 102.70 deg

Total FOV:
    horizontal: 140.29 deg, vertical: 102.70 deg, diagonal: 134.74 deg
    overlap: 82.95 deg

View geometry:
    left view rotation: -10.0 deg, right view rotation: 10.0 deg
    reported IPD: 60.0 mm
I have to agree, it all seems correct.
10 degrees is 0.1745... radians. The value in the log is rounded to two digits, so it looks correct. The only thing I'm not sure about is that I would have expected the first (left) eye to have the negative value for the cant, not the right eye. I'll have to check my math to see whether that's expected or actually the wrong way round.
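As a quick sanity check on the rounding question (plain stdlib arithmetic, no assumptions about vrperfkit internals):

```python
import math

# 10 degrees in radians -- shown as "0.17 RAD" when rounded to two digits.
cant = math.radians(10.0)
print(round(cant, 4))                 # 0.1745

# Going the other way: the truncated two-digit value back in degrees gives
# the 9.74 deg figure mentioned earlier -- a display artifact, not a real
# error in the calculation.
print(round(math.degrees(0.17), 2))   # 9.74
```

So the 9.74° vs 10.0° discrepancy is entirely an artifact of the two-digit log output; the underlying value is the full 0.1745... radians.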
Dang, what layer am I missing, hahah! I dragged my IPD around to make sure it wasn't that. What other factor could there be? And why doesn't the 6-letter sequence seem to affect it at all?
I don't follow the raw vs. actual projected "Projection center for eye 0: 0.62, 0.5" and "Projection center for eye 1: 0.38, 0.5". If you simply had the eyes backwards, I would think it would look worse than what I'm seeing. I'll post a screen capture after I figure out how to convert .dds to .jpg.
Not necessarily. The center is a combination of the FOV and the cant angle. If the cant angle is backwards, it could quite possibly still lean in the right direction, just not far enough. That would match what you said: that the centers aren't far enough to the sides.
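To illustrate the "leans the right way, just not far enough" possibility with numbers: assume (this formula is my own guess, not verified against the vrperfkit source) that the projection center is where the head's straight-ahead direction lands in the canted eye frustum, u = (tan(cant) − l) / (r − l), and plug in the logged raw tangents for eye 0 (l = −1.7, r = 1.3):

```python
import math

# Hypothetical center formula (an assumption, not the confirmed vrperfkit
# code): the focal point is where the straight-ahead head-space direction
# projects in the canted eye's asymmetric frustum.
def center_u(l, r, cant_rad):
    return (math.tan(cant_rad) - l) / (r - l)

right_sign = center_u(-1.7, 1.3, math.radians(10.0))    # cant outward
wrong_sign = center_u(-1.7, 1.3, math.radians(-10.0))   # cant sign flipped

print(round(right_sign, 3), round(wrong_sign, 3))
# Both values land to the right of 0.5: even with the sign flipped, the
# asymmetric frustum alone keeps the center leaning toward the correct
# side -- just noticeably less far out.
```

Under this (assumed) formula, a sign error in the cant would shrink the outward shift rather than reverse it, which is consistent with circles that sit "not far enough to the sides" instead of obviously swapped.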
Ok so this is my left eye. My right eye looks like the circle is just touching the edge to the right of the menu
I switched to large FOV to see if that changed the offset. But then I realized something more....
so the debug doesn't really show where the foveation is applied?
Debug shows where the sharpening/upscaling is applied, which isn't necessarily the same thing. VRS patterns work a little differently.
I'm not sure I can properly debug this without access to a Pimax headset. My Index is canted, as well, but to a smaller extent and with lower FOV, so any imprecision in the calculations is not readily visible.
Yeah. There seems to be a shortage of Pimax owners with your knowledge and expertise. I pushed on the /Pimax reddit for Pimax to consider @mbucchia and yourself for test units. Your contributions would be directly helpful to the community. Sending each of you a test unit would be easier than teaching me to code, hahah! Thanks for taking an interest.
Can I resurrect this thread by asking about a vertical offset? From reading the above, I appreciate that the focal point and moving the VRS centre horizontally are tricky, but in my headset (Varjo Aero) fixed foveation appears too far "down". This is across all games and seems to be down to the physical geometry of the headset.
I.e. the top edge of the first radius appears closer to the centre of the display than the bottom edge.
I would like to be able to nudge the VRS centre up the image so those radii slide up too, bringing them more in line with the centre of my displays.
This is true for the DLSS plugin for Fallout 4 / Skyrim too, and they provide the means to offset that centre.
I created this issue only to separate it from the comment section in which it started, which was becoming off-topic, and I wanted to expand the discussion on the original topic. With regard to allowing selection of the foveation position and its adjustability: mbucchia was very responsive; it was within 2 hours of a comment that it was implemented and demonstrated. Scary fast. Very cool! He did mention it is VERY different from OpenVR, however, so I don't think it can be done the same way for OpenVR since, again, in some cases it couldn't even be determined which eye was being rendered. mbucchia said the way it was determined with FS2020 was with heuristics (an educated guess). An offset would be cool, but experimental and potentially breaking. It would be helpful for Pimax users, as the center of the frame is simply way too far from the focal projection.