Open JacExe opened 3 years ago
Thanks! Stereo quad-buffer - you mean the Direct3D 11.1 native S3D that is implemented in Unity as "Non HMD"? I couldn't find it via the new XR Plugin Management system, but I was able to enable it via an editor script in Unity 2020. I have already made a version of my S3D system with "Non HMD" and got its S3D settings working correctly using my system. But D3D 11.1 S3D still has problems: the GUI shows in only one eye (I solved this), and it doesn't work correctly with URP, HDRP, or Linear color space, which is also mentioned here - https://docs.unity3d.com/560/Documentation/Manual/StereoscopicRendering.html The advantage is that native Direct3D 11.1 S3D works out of the box on Windows 8.1-10 and should work with any monitor that supports S3D, but for passive S3D the render resolution is not optimized, so performance and anti-aliasing don't look as good as in my S3D system. So it is useful only for the active S3D method. I don't like the blinking and the per-frame eye separation (lag); the interlaced method is much simpler, more performant, healthier, uses the same cable bandwidth as mono, shows both eye images at the same time (no lag), and needs light, simple glasses, so I decided not to publish it by default. But if you want it, I can make a branch of the S3D system with D3D 11.1 native S3D.
Thanks for your quick reply. I would be curious to understand how to restore S3D Direct3D 11.1 support via an editor script. I agree with you on the advantages of your system, but for certain types of installations it makes sense to use active stereo, so if you could create a branch with this functionality, I would be grateful.
Ok, I will add the branch. Put this file in the "Assets\Editor" folder and you can enable it via the Unity Editor menu https://drive.google.com/file/d/1gtxmpmPIy0JviKGBKDusbVGvIP7kqMOY/view?usp=sharing
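For reference, an enabling script of this kind can be sketched as follows. This is only a guess at what the linked file contains (the file itself isn't reproduced in this thread); the menu path comes from later comments, while the class name and the "stereo" device identifier are assumptions based on the legacy built-in VR editor API, which has no UI checkbox in Unity 2020 but can still be driven from code:

```csharp
// Hypothetical reconstruction of an editor script that re-enables the
// legacy built-in "Stereo Display (non head-mounted)" support.
// Place the file under Assets/Editor. Menu path, class name, and the
// "stereo" device name are assumptions, not the actual linked file.
using UnityEditor;
using UnityEditorInternal.VR;

public static class VR_SDK_Enable_EditorMenu
{
    const BuildTargetGroup Target = BuildTargetGroup.Standalone;

    [MenuItem("Vr SDK/Build with Stereo3D")]
    static void EnableStereo3D()
    {
        VREditor.SetVREnabledOnTargetGroup(Target, true);
        // "stereo" is believed to be the internal name of the
        // non-HMD D3D11.1 stereo device.
        VREditor.SetVREnabledDevicesOnTargetGroup(Target, new[] { "stereo" });
    }

    [MenuItem("Vr SDK/Build without Stereo3D")]
    static void DisableStereo3D()
    {
        VREditor.SetVREnabledOnTargetGroup(Target, false);
    }
}
```

As discussed later in the thread, the setting only has a visible effect in a built player on Windows 8.1+, not in the Editor.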
Easier than I thought, thanks !
I built the demo with D3D11.1 - https://drive.google.com/file/d/1GcSBspZIISI0UMH745li9e0Xp2iZbxyQ/view?usp=sharing Don't forget to enable the S3D driver in Windows, and I recommend setting the S3D On/Off hotkey to ScrollLock to match the same function in the demo, so D3D11.1 S3D can be correctly disabled with one hotkey when required. In Linear color space I get a white-out; try both Linear and Gamma. Is it working with your active S3D setup?
Hi Vital, thank you for the excellent support. Yesterday I was able to test the two demos you sent me on two different systems.
System A: NVIDIA Quadro RTX 4000 - DirectX 12 - Stereo - Display Mode: nView Clone mode
System B: NVIDIA Quadro M5000 - DirectX 12 - Stereo - Display Mode: generic active stereo
The two demos work correctly on both systems, and I confirm that the linear version has the white-out.
After that, I created a tiny demo project from scratch to check how it behaves with the native stereo support in Unity 2020.3.0f1, simply by re-enabling 3D stereo support with the script you kindly passed me and nothing else.
Result > The build goes in stereo mode only on System B.
I suspect the problem may be the Nvidia Clone Mode, but then I wonder why your build is OK on that system. Maybe I'm missing something?
N.B. Unity 2019 with Stereo (Non HMD) still works on both systems
Hi XR-Jaco, I think it's because I use 2 cameras. With one camera, by default the GUI is not visible in the left view and shadows stay mono in Unity 2018. Also, the default separation/convergence settings are incorrect and always symmetric, so you can't make an asymmetric camera shift (eye priority is important for sight aiming) with a one-camera S3D setup. I can easily make correct asymmetric settings even with one camera by setting a stereo projection matrix, but "Post Processing Stack V2" resets the matrices, so I found a formula that produces the same correct settings while handled via separation/convergence. I made 2 additional branches; this one contains only D3D11.1 - https://github.com/Vital-Volkov/Stereoscopic-3D-system-for-Unity-2019-/tree/Direct3D11.1 Check it.
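For background, the asymmetric camera shift mentioned above corresponds to an off-axis (asymmetric-frustum) projection. Below is a generic, language-agnostic illustration in Python, not Vital's actual formula (which isn't shown in the thread); the function and parameter names are invented for the example. Shifting the eye horizontally while keeping the frustum aimed at a fixed screen rectangle makes the left/right frustum bounds asymmetric:

```python
def off_axis_projection(eye_offset, screen_width, screen_height,
                        screen_distance, near, far):
    """Build an off-axis OpenGL-style projection matrix (4x4, row-major
    nested lists) for an eye shifted horizontally by eye_offset,
    looking at a screen plane at screen_distance."""
    # Frustum bounds on the near plane, shifted opposite to the eye offset
    # so both eyes converge on the same physical screen rectangle.
    scale = near / screen_distance
    left   = (-screen_width / 2 - eye_offset) * scale
    right  = ( screen_width / 2 - eye_offset) * scale
    bottom = -screen_height / 2 * scale
    top    =  screen_height / 2 * scale

    return [
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ]

# Zero offset yields a symmetric frustum; a nonzero offset skews it,
# which is exactly the asymmetry a per-eye shift requires.
center = off_axis_projection(0.0, 0.6, 0.34, 0.7, 0.1, 100.0)
left_eye = off_axis_projection(-0.032, 0.6, 0.34, 0.7, 0.1, 100.0)
```

With a one-camera setup, a stack that resets the projection matrix (as "Post Processing Stack V2" does) would discard exactly the skew term this produces, which is why falling back to separation/convergence parameters is needed.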
Hi Vital-Volkov, thanks for sharing the D3D11.1 branch.
I know this issue is not related to your library, but I am reporting it for completeness. I confirm that Unity 2020.3 with the VR SDK does not work with Nvidia clone mode, while it works in Unity 2019.3.
Hi XR-Jaco, so you confirm that Nvidia clone mode works only with two cameras in Unity 2020 (like in my demo)? Yes, the VR SDK has bugs: in Unity 2018 shaders work correctly with 2 cameras, but in Unity 2019 HDRP it's vice versa. But anyway, it should be returned to the Unity menu as native S3D.
I wonder if switching between the d3d stereo and None HMD modes at the runtime is possible?
I wonder if switching between the d3d stereo and None HMD modes at the runtime is possible? "None HMD" is Unity's support for D3D11 native Stereo3D, and it can't be completely disabled at runtime.
Hi there, I am new to Stereo3D, so apologies if this is a repeat question. I'm currently using an LTS version of Unity 2020 (2020.3.16f1). The editor script enables the Stereo3D, as there is no option in the XR plugin management in player settings like for 2019 and earlier. Would the Stereo3D script then be attached to the camera? Are there any other steps required for the setup for active Stereo? Thanks in advance.
Hi, yes, the Stereo3D script must be attached to the camera for the S3D correct settings system to work. If you build your project with the None HMD XR plugin enabled, then the Unity engine will use native D3D11 Stereo3D on Windows 8.1+, and if you have an active Stereo3D setup and Stereo3D enabled in Windows (video driver settings), it should work in the Unity Player (not in the Editor).
Great! Thanks for the fast response and for producing this very useful repo!
Hi Vital,
First of all, thank you for publishing this great repo. I am a Robotics Ph.D. student, and I have 2 questions for you: 1) Can I use your work for a scientific publication? How do I credit your work? Do you have preferences on how I should cite it? 2) I have tested Stereo3D on Windows with a GeForce GTX 980 Ti, Unity 2019 and a Sony 3D TV with active glasses. However, I need to run it with the same hardware on Unity 5.6.2 on Ubuntu 16.04 Linux; the reason is that I have a robot that interfaces with Unity 5.6.2 on Linux, and I cannot run it on Windows. Linux uses OpenGL and not DirectX. The "Quad" shader you wrote has problems with Unity 5.6.2 (both on Windows and Linux), and I'm not sure if it can support OpenGL. I do not know much about shaders. Can you help me with making your shader compatible with Unity 5.6.2 and OpenGL?
Thanks
Hi naveed1366, 1) Feel free to use my work; I'll be glad if it yields some dividends ;) 2) Why only Unity 5.6.2 - is it expensive to upgrade the project? How did you get active S3D on Linux? Yes, I can look into 5.6.2 and try to help you. Where are you from, what institute, and where can I see your project?
Dear Vital, thank you so much for this very useful repo. It worked perfectly for an academic app I developed some time ago. My current project involves Unity's Cinemachine - an asset that manages multiple cameras and eases working with them. The problem is that it does not use conventional cameras, but "virtual" ones. Thus, if I attach your script to all virtual cams it does not work, and if I attach it to the CinemachineBrain (the object that manages all virtual cameras), it works for a single cam but does not apply any transition to the other virtual cams. Have you tried your script with Cinemachine, or do you know some workaround? All the best!
Hi gui-vasconcelos, I'm glad that my repo is useful, thanks. I haven't used Cinemachine, but I can look at how it works and try to find a workaround.
P.S. I found that Cinemachine works if you change the camera culling mask from Nothing (I set this for a 7% FPS boost in S3D mode) to Everything or Default.
Hi Vital. I've managed to 'bypass' using Cinemachine and now it is working fine. But thank you anyway. I'd like to ask you another question. When I close the Stereo3d window (pressing Tab) I lose my mouse pointer. Is there a way to avoid this?
Thanks!
Guilherme Nunes de Vasconcelos
I'd like to ask you another question. When I close the Stereo3d window (pressing Tab) I lose my mouse pointer. Is there a way to avoid this? Thanks!
Find "Cursor.visible = false;" and comment out (or remove) that line. Do the same with "Cursor.lockState = CursorLockMode.Locked;" to avoid locking the cursor.
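As an alternative to deleting the lines, they can be gated behind an Inspector flag. This is only an illustrative sketch; the class and field names are invented here, not taken from the repo:

```csharp
using UnityEngine;

// Hypothetical wrapper: keeps cursor hiding/locking available, but lets
// you turn it off from the Inspector instead of editing the script code.
public class CursorVisibilityToggle : MonoBehaviour
{
    [SerializeField] bool hideAndLockCursor = false; // untick to keep the pointer

    void OnEnable()
    {
        if (hideAndLockCursor)
        {
            Cursor.visible = false;                   // hide the pointer
            Cursor.lockState = CursorLockMode.Locked; // lock it to the window
        }
        else
        {
            Cursor.visible = true;
            Cursor.lockState = CursorLockMode.None;
        }
    }
}
```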
Thank you so much. Done! ;)
Dear @Vital-Volkov
I'm using Unity 2021.14 LTS (locked to this version). I have confirmed that quad-buffer stereo is working on my system with older applications; however, following your steps, I cannot seem to get my project to activate stereo QBS when loading.
The enable-stereo boolean seems to be false in this version of Unity, and overriding it still doesn't render in quad buffer. I have the XR Plug-in Management kit on with no plug-in providers selected; is this correct?
Your demos above work perfectly well on the system (AMD quad buffer with 2x 4K displays). Would it be possible to grab the whole project so I can test a build out fully on my end?
I don't even have the XR kit installed. Unity's built-in DirectX 11 S3D (you call it QBS, I guess) plugin is activated via the editor script file "VR_SDK_Enable_EditorMenu" in the "Editor" folder (as a result, a new "Vr SDK" menu must appear in the Unity Editor) by selecting "Vr SDK->Build with Stereo3D". And you can see DirectX 11 S3D working only in the built project (not in the Unity Editor) with the plugin, on Windows 8.1+ and with S3D enabled in the Windows video driver settings.
Do you see the "Vr SDK->Build with Stereo3D" menu in Unity 2021.14 LTS while the "VR_SDK_Enable_EditorMenu" file is in the "Editor" folder?
Hey,
I do, yes. I select the Build With Stereo option, then do a normal build and run it, but stereo doesn't seem to turn on. I have Windows S3D enabled, and other S3D applications like the demo work too.
I can confirm that in Unity 2019 this works perfectly fine and builds for stereo; however, it doesn't work in Unity 2021, nor does it work in Unity 2020 (we need C# 8).
It seems it was removed entirely in 2020.2, which runs C# 8; 2020.1 still supports it but runs C# 7.3. This would explain it.
Hi VRS3DGuru2!
No problem, I'll make it ;)
I've already made my S3D system for UE5 and am now making the UI as a 3D widget attached to the camera, so I have an S3D UI with adjustable depth. After the release for UE5, I'll make the same update for Unity.
When I tested the single-pass S3D method implemented in Unity, it did not work with the required camera separation (view difference) and only engaged at a specific low separation (too small a view difference), and I didn't see a performance boost anyway, so it needs to be tested.
If you need a sequential method I can add it too.
Best Regards
Hi there, please reach out to me if someone would like to discuss contract work to make this happen.
Our company has built novel hardware to bring back active 3D to displays and TVs. Please visit www.athanos.com to see what we are doing (there are plenty of videos that go over the details in the MEDIA section of the page).
I have started looking over the Unity XR SDK and I believe it's possible to get things working again, but would prefer to first validate and then possibly contract out the implementation to an expert in the field.
You can contact me through the Contact page or reach out to me directly at peter@athanos.com
Thank you! Peter
Hi everyone,
Thanks for all the interesting discussions and thanks @Vital-Volkov for this great repository ! Happy to see S3D is not dead, and there are still a bunch of passionate users like us 💪
If I had to enable a frame-sequential S3D method again, with the aim of getting shutter glasses synced directly from Unity, this is where I would start my investigations: this plugin is for Nvidia 3D Vision, but it still gives us an overview of how to implement S3D using a native plug-in. I would try to implement a native plugin using D3D code, and then I would try to implement a working Unity XR SDK version.
What do you think ?
Emmanuel
Hi Emmanuel,
This makes sense to me. I need to look deeper into the implementation of the Nvidia3D vision plug-in.
My 'ideal' XR plug-in would do the following: it would integrate perfectly into any Unity project pipeline, and when turned on, it would handle the buffer presentations at the frequency of the display. Much like the time warp requirements in VR, there must be intervention at the display frequency to make a decision: do I show the previous L/R pair because the new L/R pair hasn't come in yet due to a CPU/GPU stall?
As well, it would require some extra data attached to submitted buffers that can be passed down from Unity - mainly whether it is a Left or Right frame, as well as the frame number (so something like L1, R1, L2, R2, etc.).
Having a solid framework where experimentation can take place at display frequency, along with optimized blits of buffers (if required; it would be better to do the target buffer flips in GPU memory IF possible), would be a really good start. I'm assuming full-frame submission here at this time, where I am rendering Left on one frame and Right on the following frame, at full resolution. Down the road, using FFSBS or TAB modes and converting them to the proper output (for active, passive or glasses-free) should be considered.
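The pacing rule sketched above (always feed the display one eye per refresh, falling back to the previous complete L/R pair on a stall) can be modeled abstractly. This toy Python model is only an illustration of the logic; the class and method names are invented and are not part of any real XR SDK:

```python
# Toy model of an active-stereo presenter that must output one eye image
# per vblank. If the renderer stalls, it re-presents the last *complete*
# L/R pair rather than mixing eyes from different frame numbers.
class StereoPresenter:
    def __init__(self):
        self.complete_pair = None  # (left, right) currently safe to show
        self.pending = {}          # eye -> (frame_number, image) being rendered
        self.parity = 0            # 0 = left field, 1 = right field

    def submit(self, eye, frame_number, image):
        # Renderer thread hands over finished eye buffers, tagged L/R + number.
        self.pending[eye] = (frame_number, image)
        if "L" in self.pending and "R" in self.pending:
            # Promote the pair only if both eyes carry the same frame number.
            if self.pending["L"][0] == self.pending["R"][0]:
                self.complete_pair = (self.pending["L"][1],
                                      self.pending["R"][1])
            self.pending = {}

    def on_vblank(self):
        # Called at display frequency: always emit something for this field.
        out = None if self.complete_pair is None else self.complete_pair[self.parity]
        self.parity ^= 1  # alternate eyes every refresh, no matter what
        return out
```

The key property is that `on_vblank` never blocks on the renderer: a dropped frame repeats the old pair instead of de-syncing the glasses.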
Best, Peter
Hi Peter!
I explored your videos. I also made tracking for my S3D system in 2015 with TrackIR5, to be able to freely tilt the head (circularly polarized glasses allow this, but give maximum purple ghosting at 90deg; at 180deg it's clear again) and move relative to the screen - https://youtu.be/zYCOV7fKqrI
TrackIR5 is not precise and has lag, but now, with HMD tracking quality, such a function is a must-have)) So you can play non-VR games in S3D on a virtual screen in VR and see the 3D world like the real world through a window, with respect to movement relative to the window.
As I understand from your media, regular OLED displays are good for sequential S3D, but to minimize crosstalk ghosting, black frame insertion is required.
But what is required from the software, except switching the left/right image each frame? OK, ideally the active glasses should receive info about which frame is left, but why do you need the frame number?
Hi ebadier, thanks!
Yes, desktop S3D must live, and it has absolutely precise geometry, unlike the VR lens distortions I can see in the Quest 2)). I also have some experience with Unity's native plugins, so I think it's no problem to extend the required functionality of the engine).
Hi Vital, thank you for the message.
I will admit that I don't know the best approach to implementing a quad buffer in a software-only solution. My guess is that a brute force method of tagging buffers as they come in can help differentiate which ones should be used at any time. If there is major lag from rendering and the frame-rate drops from say, 60 to 15, there may be logic that indicates that the previous (back buffer) frames are 2 frames behind.
I do see value in adding extra data for each buffer that will allow for passing of info from main thread and graphics thread. This would help in debugging as well.
Quick question to everyone: using the Unity XR SDK, does the graphics thread get a chance to run code at the refresh rate of the monitor, or is there a chance that a processing frame can be skipped due to a GPU stall? When I was at Oculus (early days), we were just starting to explore and implement time warp for dropped frames, but an active stereo display has the requirement of the display needing to be fed a L or R frame on every single frame. A stall on a monolithic L/R frame in VR is subjectively OK since it won't cause the eyes to cross, but it will cause judder.
Finally, I don't know if the 'compositor' concept in VR is a separate app (like a Spout receiver, for example), or if it's a separate dll that communicates within the app alongside the XR plugin, OR if it's really just an XR plugin itself. I would hope that 'compositor' like functionality can be built into an XR plugin, provided the GPU thread is keeping up with the display frequency.
Thanks Peter! Exactly, I forgot about stalls)) Just switching L/R images each frame will work correctly only with performance headroom, when the FPS potential is not lower than the screen Hz; but if the render is not ready for the next screen refresh, then repeating the previous image is required. So I need to get an active S3D setup for testing all this before adding it to my plugin.
Thanks for the info, VRS3DGuru2! Yes, Nvidia blocked all the S3D options that were in the old driver, restricted everything, and removed S3D from the driver, so forget about their S3D)) They bring problems for the community and many hours of work to solve them)) And their S3D performance is not even close to native built-in-project S3D and can't be, because of the overhead of recalculating (stereoizing) each shader on the fly. Anyway, their S3D settings system is incorrect, ruined by zoom, and requires replacement or another overhead calculation to fix it))
So we need to make a universal S3D plugin for developers to build into projects, as independent as possible from 3D driver software; this is the best and highest-performance way to do S3D.
Hi VRS3DGuru and Vital,
Thank you both for the detailed responses! Currently, I encode the information about which frame is left and which is right directly into the display buffer (in Unity, I render it on the GUI of the main camera to ensure that it's the last thing written into the buffer). I use 2 cameras, one for left and one for right; every other frame I shut down the previous camera, and I also ensure that I step the main code logic through only once for the pair of cameras.
I cannot guarantee that anything time-related which should run for both left and right at the same time is synced 100% (i.e. physics, animation etc.), but I have enough checks and failsafes in place with my demos to ensure camera movement is perfectly synced.
The SYNC device reads directly off the display and keeps the active glasses in sync, even if there are frame drops (the glasses eventually self-correct). However, the issue of frame dropping is the first problem I want to tackle, so I propose a quick solution to ensure that we can feed the SYNC device a steady signal and mitigate the frame drops at the expense of temporal fidelity.
Instead of fully developing a quad buffer, I would like to ensure I have a 'previous' left or right buffer in place that can be switched to, even if it's not up to date. The outcome of this is that there should never be a frame drop that will de-sync the glasses, at the expense of a temporal shift due to the fact that both eyes are potentially not seeing in sync (i.e. the left eye is on frame 2 while the right eye is on frame 3, for example). However, utilizing the reprojection capabilities of the XR SDK and building a simple solution like this should validate whether true quad buffering can be done. Also, this solution is built upon an agnostic hardware synchronization platform being developed at Athanos, and should stay as detached as possible from any caveats that may come up from using anything tied to a specific API or GPU architecture. I am compiling the Unity demos with DX11 at this time, which I feel is a stable and generic enough Windows-based graphics API, and would like to continue using it.
What do you think?
Hi Adrian, Of course I am interested. :) What are the components and where are you from?
So I need to get an active S3D setup for testing all this before adding it to my plugin. Hello Vital, because I am very interested in this development, I could provide free components for an active stereo setup. Are you interested?
Hi Peter, thank you too for the info! I think I need to get an active S3D hardware rig for testing while developing anyway, and I think I'll add active S3D to my system for the different 3D APIs. Of course, we should be detached from any specific API or software if possible.
Hi, I am also interested in finding a solution for quad-buffer rendering with newer Unity versions, as we are using Unity in a CAVE scenario with multiple projectors and shutter glasses. Although I am more versed in high-level Unity C# code than in graphics APIs, I would be glad to help with developing it.
I would have intuitively chosen OpenGL, as it was the working solution until recently and it could also work on Linux. But @VRS3DGuru2's post made me question that.
If we fall back to using the re-added OpenGL Core that is available again in Unity 2022, then it breaks URP/HDRP
I could not find any indication of URP/HDRP being incompatible with OpenGL; or are you referring to DX-specific raytracing features?
DX12 does not natively expose stereoscopy out of the box; although Microsoft added true quad-buffer support to the API, Nvidia has not yet leveraged it as they have for OpenGL quad buffering.
@VRS3DGuru2 do you have a source for the DX12 stereo support? I could not find anything about it. And what do you mean by "Nvidia has not leveraged it"? Does it mean you can force Nvidia cards to work with it somehow, or does it mean it does not work at all?
I searched Google, Bing and even Bing's AI for DX12 quad buffer, and the only thing any of the search engines finds is this exact thread. Do you have a link to the docs, please? The only thing I could find was this DX11 stereo rendering example (which is 4 years old): https://github.com/microsoftarchive/msdn-code-gallery-microsoft/tree/master/Official%20Windows%20Platform%20Sample/Direct3D%20stereoscopic%203D%20sample
There is an opportunity to solve this in a stable, non-drifting way (no rolling sync) using the DX12 API and the game engine directly; no GPU programming is needed other than accessing timing and genlock.
What kind of genlock would you use for this? Are you talking about G-Sync / FreeSync? And would that work with multiple monitors / projectors?
Does anyone have any other information about how stereo quad buffers work with DX12, or with Unity? I'm trying to investigate the viability of accomplishing something similar with renderer features for URP/HDRP but would vastly prefer to hook into a lower level method if possible.
There seems to be very little public knowledge on this topic.
Hi! As I understand it, the quad buffer is supported only on certain devices, like Quadro GPUs and certain monitors. We should avoid any restrictions if possible, and now I am thinking about how to manually draw render textures (a custom quad buffer) on each refresh of the monitor, independent of the GPU FPS. If we find a solution, then active shutter stereoscopy will work on any 3D API, any GPU and any monitor. I am now searching the net for code examples of shutter output methods, to see how it was implemented before NVIDIA 3D Vision and the quad buffer existed at all.
The first 3D shutter glasses only worked with a graphics card driver suitable for stereo output. We used Elsa shutter glasses back in 1999; however, these glasses already ran with NVidia Quadro drivers. Have a look at this page: http://www.stereo3d.com/revelator.htm
I also have edimensional 3D glasses somewhere :) https://www.guru3d.com/page/edimensional-e-d-3d-glasses/
The SYNC device reads directly off the display and keeps the active glasses in sync, even if there are frame drops (the glasses eventually self-correct). However, the issue of frame dropping is the first problem I want to tackle, so I propose a quick solution to ensure that we can feed the SYNC device a steady signal and mitigate the frame drops at the expense of temporal fidelity.
Hi Peter! Have you already solved the task of showing the left/right image on each screen refresh when the render FPS is low?
I found in Unity offscreen rendering to texture using coroutine even when the main camera is disabled, FPS of black screen(render nothing) in Unity player drops to low due to offscreen heavy rendering to texture. Problem is the render process in Unity can't be parallel or detached from the main camera to show every screen refresh, only when all cameras in scene finish rendering then something can be shown on screen, even just clear screen with color waiting for all cameras rendering to complete. I even looked into native plugin and made a D3D11 test app but IDXGISwapChain::Present has the same behaviour and causes the main thread to wait for a render.
So I see the solution as lightweight rendering on the main thread of only a screen quad with an S3D shader, which just composes the output method from already-rendered textures. This lightest of operations guarantees showing the left or right image from the last completed textures (2 buffers of 4) on every screen refresh, independent of the rendering process writing to the current textures (the other 2 buffers of 4) on a separate off-screen thread. When rendering completes, just switch the pointer to the 2 newly finished textures in memory and begin rendering to the old 2, or even to an additional 2, so rendering and presenting never wait for each other. So you have a custom-made quad (or six) buffers like this. :)
Hello Vital, thank you for the informative message.
I have not yet solved this task and am still wanting to. Your solution sounds very promising. Is it possible to code this up quickly to verify that it works? I can test on my end and if it does, I would be happy to send you a SYNC device so you could get it working on your end.
I'm currently eyeing the following display monitor:
This monitor apparently has black frame insertion in firmware for 120 Hz refresh. Since it's a 240 Hz monitor, it basically inserts the black frame on its side, which is great news since it will deterministically show a black frame even if the frame rate slows down on the compute side. It's supposed to go on sale tomorrow (fingers crossed); I am going to purchase it to test it out.
Between the SYNC device, firmware BFI, and finally a quad-buffer solution in Unity, we may see S3D become a thing again soon!
Please let me know if it's possible to test the solution you have outlined above. Thank you!!
Hello Peter, thank you too! I am now studying how to implement this in code, and as soon as I get it working I'll upload a test app to try.
Hi! I modified this MultithreadedRenderingD3D11 example to render to a texture and load the GPU below 60 FPS (the monitor refresh rate). Unfortunately, it looks like any rendering process, even on a different device context, affects the front/back screen buffer flip frequency of an empty scene (just a color-clear screen) presented with IDXGISwapChain::Present. I also made a multi-CPU-thread D3D11 test program and it has the same behavior. So I think low-level direct control of the swap buffers, independent of GPU workload, is required, and that is not possible with IDXGISwapChain::Present.
I found this: "To continue using quad-buffered stereo, developers must switch to the Microsoft native (DXGI) stereo APIs," and I'll look into it. And also this: IsWindowedStereoEnabled.
Hi! I made a Remake of the Direct3D11.1 native stereoscopic sample. I added my precision S3D settings system and forced stereo 3D on/off independent of the "3D display" mode in Display settings -> Advanced display settings. Tested and working on my LG D2342P monitor with the NVIDIA 452.06 driver and the 3D Vision driver installed separately by 3D Fix Manager 1.85. It should also work with active S3D hardware and shutter glasses, so give me feedback on how it goes.
I am now trying to get active-shutter S3D working on my passive monitor using an EDID override from the active LG W2363D monitor, for testing and further development.
Hi @Vital-Volkov and thanks for the amazing plugin!
We are trying to port our HMD app to also work with stereo 3D displays (such as powerwalls). We have confirmed that your build (as seen here) works:
Here is my problem: whenever we try to make even a simple app with your system enabled, the checkbox for native stereo ("DirectX 11.1 S3D") can't be checked, just as if the driver didn't support it.
I am sure this is just a configuration problem in our project, but the settings have changed compared to when you wrote the readme and I can't find what I'm missing. We are currently on Unity 2021.3 and can't move backwards. Here's what we have checked:
Can you nudge me in the right direction? Thanks again.
Hi Boris, thanks!
I just built 2 old branches with the D3D11 S3D Unity native plugin and confirmed that D3D11 S3D is not working. I think this is because I am on the 452.06 NVIDIA driver, where D3D11 S3D is not enabled correctly (the "3D display" mode does not exist in Windows display settings). It should work with the last official S3D NVIDIA driver, 425.31, but I can't check this because my RTX 2060 12GB version is newer than the common RTX 2060 and does not work with the 425.31 driver, even when force-installing it like I do with 452.06 via an installation *.inf file mod.
I just added the default App Package to https://github.com/Vital-Volkov/Remake-of-Direct3D11-native-stereoscopic-sample so you can install it (first uninstall the remake) and run it. I think you'll get the message "Stereo 3D is not enabled on your system" in the top left corner of the app, but it should work with 425.31. My remake works with 452.06 because it ignores the buggy checks of D3D11 S3D availability and just forces use of the S3D functionality of the NV driver.
Anyway, I'll soon add D3D11 S3D directly to my Unity S3D system as a native plugin, and it will work the same as the UWP app remake.
I'm hesitant to chime in here because I don't fully understand the conversation, but I think I can possibly add some info that may help you all. Our modding group has used NVidia 3D Vision for years, and we continue to use it today, even though it's a canceled project. (HelixModBlog) Apologies in advance if this is not relevant.
The 3D Vision Driver can still run on current drivers, but is not normally installed now because the product is defunct. However, we know that the DX9 code path still works in modern drivers if we force install the 3D Vision Driver. We can still play DX9 3D games, and use stereo photo viewers on 3D Vision hardware. The DX11 code path was destroyed when they released the 3xxx series, and the last working driver for that path is 452.06.
I don't understand the relationship of QuadBuffered OpenGL to the drivers. I know that still exists for their professional series Quadro cards, but I don't know how that translates into consumer drivers. We know that some old time games like Doom still work, which strongly suggests QuadBuffered is still available. Also, I believe that they want to maintain HDMI compatibility for 3D.
It's possible to force install the 3D Vision driver itself, and I wrote a tool to do it fully automatically after driver updates. See: https://github.com/bo3b/3DV_Installer. You can also use the 3DFM or HelixVision as suggested above, but I thought the source code can potentially be helpful.
That tool changes the versions and calls the 3DV driver to install on any driver. It will then show up in the NVidia control panel as well, and allow you to enable/disable 3D Vision directly. As I note above, this may not be helpful, because you may not care about 3D Vision specifically.
Once installed, the 3D Vision driver will show 3D TV Play if it finds hardware on its whitelist of old 3D TVs or projectors. Spoofing the EDID can allow you to have it see what you want. Without any EDID, you can always get CRT mode, which allows red/blue 3D Vision Discover.
I added the D3D11 output method as a native rendering plugin for quick testing. The editor script will uncheck "Use DXGI flip model swapchain for D3D11" in Unity Project Settings -> Player -> Resolution and Presentation to make the rendering plugin work.
This package is fantastic; it would be great if it could support stereo quad buffering for active stereoscopic systems (CAVE, virtual wall, etc.). Unfortunately, Unity removed support for Stereo (Non HMD) systems in 2020+, and it appears that it is required to go through the XR plugin management system in order to rebuild that functionality. What do you think about the matter?