dragonkhoi / suvstar


Stereo Rendering is not working #2

Closed: Zumbalamambo closed this issue 5 years ago

Zumbalamambo commented 5 years ago

I have set e < 2 instead of e < 1 in SUVSTARPostRender. I have also updated the aspect ratio for my device properly and removed the canvas, but the display still remains monoscopic. How can I render the display for both the left and right eyes?

dragonkhoi commented 5 years ago

Hello @Zumbalamambo! We have implemented this functionality for you in the latest push. You will just have to adjust the aspect ratio of the render textures to match your phone's resolution, then check the "TwoEyesSingleDevice" bool in SUVSTARPostRender.cs. I'd also love to hear how you found our repo! Please close this issue if the fix works for you.
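
For anyone following along, here is a minimal sketch of the idea, not the actual SUVSTARPostRender implementation: each eye's render texture gets half the screen width so the per-eye aspect ratio matches the phone. All names except the TwoEyesSingleDevice flag mentioned above are assumptions for illustration.

```csharp
using UnityEngine;

// A minimal sketch of per-eye render texture sizing, NOT the actual
// SUVSTARPostRender implementation. Each eye gets half of a landscape
// screen, so the per-eye aspect ratio is (Screen.width / 2) / Screen.height.
public class StereoEyeTextures : MonoBehaviour
{
    public bool twoEyesSingleDevice = true;
    private RenderTexture leftEye;
    private RenderTexture rightEye;

    void Start()
    {
        if (!twoEyesSingleDevice) return;

        int eyeWidth = Screen.width / 2;   // half the phone width per eye
        int eyeHeight = Screen.height;
        leftEye = new RenderTexture(eyeWidth, eyeHeight, 24);
        rightEye = new RenderTexture(eyeWidth, eyeHeight, 24);
        // Assign these to the left/right eye cameras' targetTexture fields.
    }
}
```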

Zumbalamambo commented 5 years ago

Thank you very much! Let me try it now and I will update you.

Zumbalamambo commented 5 years ago

It works like a charm.

I removed the ARCoreBackgroundRenderer script from the FirstPersonCamera to get a see-through effect, but then the objects no longer appear to be tracked. Is this expected, or is there a fix you can suggest? Also, is it possible to brighten the view so that the VR view is legible even outdoors?

Thanks again for the wonderful code!

dragonkhoi commented 5 years ago

I'm not sure what you're asking; can you please provide a specific use case that you're having trouble with? The ARCoreBackgroundRenderer should be displaying the camera feed, so the see-through effect should work with it on. I have not tried tracking without it, but I assume it would not look correct, since I'm pretty sure ARCore anchors are based on the UV of the ARCoreBackground. We have used the AR video see-through outdoors; you can read more about outdoor use in our report (report_le_qian.pdf, attached) or see our video: https://www.youtube.com/watch?v=aOOyccH9_Rw

Zumbalamambo commented 5 years ago

@dragonkhoi Thank you very much for sharing the report. It really helps me understand the core principles behind this wonderful implementation.

I tried tracking without the ARCoreBackgroundRenderer, but the objects seem to drift. I think it's because the field of view of the eye differs from the field of view of the phone camera, which makes it look like the object is drifting. The drift happens in two cases:

  1. When we remove the ARCoreBackgroundRenderer (which I assume is due to the field-of-view mismatch in the VR device that I'm using). The device that I use is here.
  2. When we update the position of an Andy object in world space with reference to the FirstPersonCamera, the Andy object drifts or moves with the camera. Have you experienced this bug too? If you could kindly suggest a fix, it would be really helpful.

Zumbalamambo commented 5 years ago

@dragonkhoi I have fixed the second bug. It happens when the coordinates of the first person camera are not fixed properly. Do you have any idea about the first one?

Also, one last question: may I know which units the calibration parameters are in, especially the screen width and screen height in SUVStarProfile? My screen-to-lens distance is 29 mm; how do I convert it for use with SUVStarProfile? I have enclosed the VR parameters of my device generated from VRCalibration below:

Screenshot from 2019-06-17 16-43-01

dragonkhoi commented 5 years ago

Hello @Zumbalamambo! I'm not super familiar with ARCore, and we were only able to get it to do what we needed for the report. I don't know why the objects drift when the ARCoreBackgroundRenderer is off, but I have noticed that as well. If you want to keep it on while showing a different background, you can put a quad that takes up the whole screen at the far clipping plane of the FirstPersonCamera. As for the calibration parameters, they can be set in SUVSTARProfile on the ARCore device; the units for screen width/height and distance are meters, so 29 mm would be 0.029.
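
A rough sketch of that full-screen backdrop quad idea follows, assuming a Unity Quad tracked to the FirstPersonCamera; the class and field names are hypothetical, and the sizing is just the standard frustum relation height = 2 * d * tan(fov / 2).

```csharp
using UnityEngine;

// Rough sketch of the backdrop quad described above: stretch a Unity Quad
// so it fills the camera frustum just inside the far clipping plane of the
// FirstPersonCamera, acting as a solid background while the
// ARCoreBackgroundRenderer stays enabled for tracking.
public class FarPlaneBackdrop : MonoBehaviour
{
    public Camera firstPersonCamera;  // assign the FirstPersonCamera
    public Transform backdropQuad;    // a Unity Quad primitive

    void LateUpdate()
    {
        // Sit slightly inside the far plane to avoid being clipped.
        float distance = firstPersonCamera.farClipPlane * 0.95f;
        backdropQuad.position = firstPersonCamera.transform.position
                              + firstPersonCamera.transform.forward * distance;
        backdropQuad.rotation = firstPersonCamera.transform.rotation;

        // Frustum height at that distance: 2 * d * tan(vertical FOV / 2).
        float height = 2f * distance *
            Mathf.Tan(firstPersonCamera.fieldOfView * 0.5f * Mathf.Deg2Rad);
        float width = height * firstPersonCamera.aspect;
        backdropQuad.localScale = new Vector3(width, height, 1f);
    }
}
```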

Zumbalamambo commented 5 years ago

@dragonkhoi I'm trying to do that as well; if there is any progress, I will share it with you. May I know which VR device you are using? Also, may I know why the vertical lens offset is negative (-0.00025)?

dragonkhoi commented 5 years ago

It's just a generic VR headset we bought at Fry's; it looks like it was made for Cardboard 1.0. These parameters are just values that I pulled from the standard Cardboard SDK device profile for Cardboard 1.0. I would assume the vertical lens offset is negative because the center of the lens is 0.25 mm below the center of the screen. Really, just play with the parameters until the barrel-distorted image doesn't give you a headache, haha.

Zumbalamambo commented 5 years ago

Let me try the same sort of VR device. Do you have any update on the see-through camera? I tried placing the quad to render such an effect, but unfortunately the object still appears to drift. :( I'm working on it as well. I also have one more enhancement tip for you: have you considered turning off MatchFrameRate in SUVSessionConfig? This gives a higher frame rate of about 60 fps.
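
If it helps anyone reading, here is a small companion sketch for that tip, assuming frame-rate matching has been unchecked in SUVSessionConfig via the inspector; nothing below is part of the repo itself.

```csharp
using UnityEngine;

// With rendering no longer locked to the ~30 fps camera feed, you can ask
// Unity for 60 fps explicitly. This is an illustrative assumption, not a
// setting from SUVSessionConfig.
public class FrameRateSettings : MonoBehaviour
{
    void Start()
    {
        QualitySettings.vSyncCount = 0;    // let targetFrameRate take effect
        Application.targetFrameRate = 60;  // request 60 fps rendering
    }
}
```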

dragonkhoi commented 5 years ago

Are you trying to have ARCore track objects but NOT show the camera feed? Why do you not want to show the camera feed, and how are you implementing the tracked objects? You are not supposed to give an object a reference to the FirstPersonCamera; just anchor it to an ARCorePlane. You can work through these issues with the Google ARCore docs: https://developers.google.com/ar/develop/unity/tutorials/hello-ar-sample

Also, the reason we match frame rate is that our project used two separate phones whose videos needed to be synced, so the frame rate had to be locked. If you are using one device, MatchFrameRate can be disabled; good call for performance.

If you have questions about getting the AR feed into stereo barrel distortion, those are appropriate here, but the tracking issues should be solvable via the Google ARCore forums :) Thanks for checking out our repo!
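
As an illustration of that anchoring advice, here is a rough sketch in the style of the linked HelloAR sample: raycast against a detected plane on a tap and parent the spawned object to an Anchor created on the hit trackable, rather than referencing the FirstPersonCamera. The "andyPrefab" field name is just an example.

```csharp
using GoogleARCore;
using UnityEngine;

// Rough sketch of the anchoring pattern from the HelloAR sample linked
// above: spawn the object at a plane hit and parent it to an Anchor on
// the hit trackable, instead of giving it a reference to the camera.
public class PlaceOnPlane : MonoBehaviour
{
    public GameObject andyPrefab;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        TrackableHit hit;
        TrackableHitFlags filter = TrackableHitFlags.PlaneWithinPolygon;
        if (Frame.Raycast(touch.position.x, touch.position.y, filter, out hit))
        {
            // The anchor follows the real-world plane as tracking updates,
            // so the object stays put instead of drifting with the camera.
            Anchor anchor = hit.Trackable.CreateAnchor(hit.Pose);
            GameObject andy = Instantiate(andyPrefab, hit.Pose.position, hit.Pose.rotation);
            andy.transform.parent = anchor.transform;
        }
    }
}
```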