Yellow-Dog-Man / Resonite-Issues

Issue repository for Resonite.
https://resonite.com

Full access to the visual manager #2033

Closed Mr-CostlessClaw closed 1 month ago

Mr-CostlessClaw commented 1 month ago

Is your feature request related to a problem? Please describe.

The visual eye stream manager cannot be modified. The current ways of adjusting the eye streams can be janky and introduce visual instability.

Describe the solution you'd like

I'd like the option to inspect the visual eye manager and to decouple the placement of the two streams.

Describe alternatives you've considered

none

Additional Context

No response

Requesters

Mr. Claw ErrorJan Techno

Frooxius commented 1 month ago

I don't understand what you are asking for.

What is "Visual Eye Manager"? What options are you looking to have exposed?

In Resonite, generally everything that's already there is exposed automatically, so I'm not sure what options you are missing.

What is the goal of what you're trying to achieve?

Mr-CostlessClaw commented 1 month ago

I want to move the individual left and right eye streams independently of each other and be able to rotate them in opposing directions. This would let me place the individual eyes further apart or closer together to match the eyes of an avatar and align my view with it, like adjusting the default digital IPD that is set when an avatar is made.

Frooxius commented 1 month ago

What do you mean by "eye stream"?

I still do not understand what "Visual Eye Manager" is; there's no component with that name.

From the sound of it, you just want to tune where the eyes are on the avatar? You can already do that with the EyeManager component - you can place an arbitrary number of eyes on the avatar and position them wherever you want.

ErrorJan commented 1 month ago

Moving each of a user's eyes separately could be useful. It would allow for some additional fun interactivity, like moving one of a user's eyes to a different place, so one eye stays with the actual avatar and the other is on the ground somewhere else (attached to an avatar eye mesh). It could also be used for other interesting purposes, like adjusting the rendered IPD to match an avatar's IPD.

This does seem like a strange feature request, but you can already kind of do it with the AvatarUserRootOverrideAssigner: set the Node field to View and scale it until the rendered view matches the avatar's IPD - it's actually kind of a fun experience. The problem with that component is that the view lags a lot when it's placed onto the avatar's head, and you can no longer interact with the local userspace as easily, because the laser no longer appears in the same place as in the local user space; it also wouldn't be as flexible as moving each eye individually. While moving each eye would probably still not fix the problem of the laser being offset in userspace, it would be an interesting additional way to have more flexibility in game. It would probably also allow partially being inside a portal, since one eye could be teleported to another place while the other stays where the rest of the body is.

Frooxius commented 1 month ago

I just don't understand what exactly it is that you're asking for.

I'm not sure what the terminology you're using here refers to.

Are you able to answer my questions above?

ErrorJan commented 1 month ago

I thought I did answer them. I didn't mean to ignore your questions.

What do you mean by "eye stream"?

What I mean is the rendered view sent to the HMD: being able to customize the location it's rendered from, so that each eye's view can be moved to a different place.

ErrorJan commented 1 month ago

What is the goal of what you're trying to achieve?

The goal is to place the user's eyes (the individual views of the left and right eye) at slots matching the avatar's "real" eye positions, and then maybe even be able to grab one of the avatar's eyes, so that when the left eye gets moved the right eye stays put: the right side of the stereoscopic stream to the HMD stays at the avatar, while the left side is somewhere else in the world and no longer at the avatar. Does that explain it well? (I'm guessing that what is being sent to the HMD is a single image stream containing a stereoscopic image.)

Mr-CostlessClaw commented 1 month ago

Sorry, I really didn't have terminology for what I wanted to ask; I was looking for it and didn't have a good way of describing this. As ErrorJan said, it's the parts of the head that render the vision for each eye on the HMD - we want to move them around. ErrorJan discovered you can scale the headset (the piece of the avatar which shows you where your vision is placed) so it matches up with the in-game eyes that send the visual stream to the HMD for the left and right eye. He did this to increase the IPD you perceive in game, but that method makes the vision quite unstable. We are hoping that more access would allow an easier way of moving these parts, as well as rotating them, so the vision you see in your HMD would simulate the vision of your avatar if its eyes are further apart, closer together, or on the sides of the head like an Avali's.

Frooxius commented 1 month ago

I see, so you're looking for a way to independently position where each eye renders from?

We're not too likely to introduce this. It isn't something you can really "access" with how VR SDKs work - the positioning of the eyes is driven by the VR software, based on the IPD configured in your own headset. The rendering pipeline is also optimized around this, using single-pass rendering to render things more efficiently. Making each eye render independently would require a significant amount of work and would require disabling those optimizations.

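To illustrate the point about how VR SDKs work, here is a minimal sketch in C against the OpenXR API (used only as a representative runtime API, not necessarily what Resonite uses), assuming a session, reference space, and predicted display time have already been set up. The per-eye render poses come back from the runtime, already reflecting the IPD configured in the headset, and the application is expected to render from exactly those poses:

```c
// Minimal sketch: querying per-eye view poses from an OpenXR runtime.
// Assumes `session`, `space` and `displayTime` were created/obtained earlier.
#include <stdio.h>
#include <openxr/openxr.h>

void locate_eye_views(XrSession session, XrSpace space, XrTime displayTime)
{
    XrView views[2] = { { XR_TYPE_VIEW }, { XR_TYPE_VIEW } };

    XrViewLocateInfo locateInfo = { XR_TYPE_VIEW_LOCATE_INFO };
    locateInfo.viewConfigurationType = XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO;
    locateInfo.displayTime = displayTime;
    locateInfo.space = space;

    XrViewState viewState = { XR_TYPE_VIEW_STATE };
    uint32_t viewCount = 0;

    // The runtime fills in one pose and field of view per eye; the app does
    // not choose its own eye offsets, it renders from what it is given.
    if (XR_SUCCEEDED(xrLocateViews(session, &locateInfo, &viewState,
                                   2, &viewCount, views))) {
        for (uint32_t i = 0; i < viewCount; ++i) {
            printf("eye %u position: (%.3f, %.3f, %.3f)\n", i,
                   views[i].pose.position.x,
                   views[i].pose.position.y,
                   views[i].pose.position.z);
        }
    }
}
```

Because the eye transforms originate in the runtime rather than in the application, "moving" one eye independently would mean overriding these poses after the fact, which is the kind of change that would require giving up the single-pass stereo optimizations mentioned above.
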
Generally, positioning the eyes like this isn't something you should ever really do, as it would result in a distorted image that doesn't match your real eyes and cause significant eye strain. As a rule, rendering should match your physical eyes as close to 1:1 as possible.

Given that, I don't think this is something we'd invest time into implementing.

ErrorJan commented 1 month ago

Ah, I understand. While I think it would be a neat idea, since it would require a lot of effort it makes sense not to implement a feature that many players probably won't use... The jitteriness of the AvatarUserRootOverrideAssigner could be brought up in a separate issue, though. I might create that later if I don't forget. Anyway, this can probably be closed.

Mr-CostlessClaw commented 1 month ago

Well, thanks for the explanation and sorry for the confusion. I tried to find a proper way of explaining this idea, and thanks for your time in hearing it out. I chase after visually stimulating things since I am not affected by them as much as most others are. Thanks again.

Frooxius commented 1 month ago

Yes, I'm happy to explain! And no worries, it happens. Generally I recommend explaining things in very plain language and focusing on the end goal you want to achieve - in this case, splitting what each eye sees.

The jitteriness is definitely a good issue to bring up separately - I think there might be an issue for that already?

ErrorJan commented 1 month ago

If there isn't, I'll create a bug report tomorrow. Anyway, thanks for your time.

lxw404 commented 1 month ago

https://github.com/Yellow-Dog-Man/Resonite-Issues/issues/1520