flavourous opened this issue 5 years ago
I do not use "look to activate" on the panels. Personal taste; usually I use 1-2-3-4 for the panels, so I've tried adding holo buttons for that, which is a good start.
It's a shame ED doesn't have a single "focus the panel I'm roughly looking at" button you could just bind to one of the POV switches. Center press is probably not used in combat and would work nicely for that.
I wonder how ED's contextual menus behave if you bind center press to one and then put the list of panels in it.
- Add holo buttons for directional navigation AND panel selection? (could be a bit slow?)
Master does have holo buttons for these, but they're really only there for testing/debugging. They're completely impractical because they take up 6 overlays from your limited pool of overlays. If I were serious about this it would have to be a single UI panel instead.
However, I've always questioned this method, because there are panels all around you and you can't leave the panel hovering on one side of the cockpit while looking at a panel on the other side. And moving it around, or having two, just feels like trying to force something to work with a shoehorn.
- The virtual thruster doesn't seem to set the hats, so I can't use that for panel navigation.
I've left POVs off the throttle for now on purpose. I need input from more people using the overlay about what they might want to do with it before I can figure out what the best thing to do with it would be. There are various ways it could be used, and I'm also not sure whether I actually want to instead reserve something like pressing the bottom of the trackpad for something like a reverse lock (#47).
Also, I'm not sure that it would help much for UI navigation. The throttle resists rotational movement, unlike the joystick, but it doesn't resist translational movement. Because of that, unlike with the joystick, you're stuck floating your hand in mid-air holding the throttle while trying to use the POV to navigate the UI.
☹️Box areas somewhere in the panel vicinity which make the controller switch to UI navigation mode (and maybe focus the panel) when you hold your controller in them? ...but I can't really represent that kind of 3d zone in 2d overlays to show where they are in edit mode.
Laser pointers would be the ideal. But that can't be done without game changes.
Yeah, I agree. The debug buttons are helpful but not the solution.
Regarding the overlay limit, have you considered using button panels to group buttons and reduce the count? That would also be helpful for aligning and manipulating multiple buttons.
ED already does tell us when you focus the internal/external/chat panels in the cockpit, even whether you are using button focus or look to focus. So we could already do that for panels. However that's not going to work out well. There is a bit of lag between when you focus something in the cockpit and when the overlay picks up on the GuiFocus state change. This would have the result of going to your targeting panel to select something and then having to wait a few moments before you are able to fire your secondary weapons or use your POV controls.
You can get a feel for how bad this delay will be if you open the galaxy map and pay attention to how long the cockpit controls stick around in the galaxy map before they disappear. The overlay immediately disables them as soon as it reads the GuiFocus change in Status.json. So this is the same delay that panel based context switching would have, except it would be in a scenario where you're trying to quickly switch between focuses.
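For context, pulling GuiFocus out of Status.json is roughly this (a minimal sketch, not the overlay's actual code; the GuiFocus field name comes from Frontier's Status file docs, everything else here is made up):

```csharp
// Minimal sketch: parse GuiFocus from a Status.json snapshot.
// Uses Unity's JsonUtility for simplicity; the overlay may do this differently.
using System;
using UnityEngine;

[Serializable]
class StatusSnapshot
{
    public string timestamp;
    public long Flags;
    public int GuiFocus; // 0 = no focus; non-zero = a panel/map has focus
}

static class GuiFocusReader
{
    public static int ReadGuiFocus(string statusJson)
    {
        var status = JsonUtility.FromJson<StatusSnapshot>(statusJson);
        return status != null ? status.GuiFocus : 0;
    }
}
```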
Regarding the overlay limit, have you considered using button panels to group buttons and reduce the count? That would also be helpful for aligning and manipulating multiple buttons.
Yes, I did early on. But frankly it's hard enough to deal with a bunch of small overlays and non-dynamic UI based overlays like the 6DOF controller and edit panel. Trying to implement any sort of resizable panel or snap and merge behaviour would be a nightmare.
Edit: I would love it if Valve made it possible to create 3D HMD-based overlays, i.e. instead of overlaying a bunch of 2d panels, there would be a single camera from the HMD's perspective that renders one overlay over the top of the game. That kind of overlay would be more expensive for multiple overlay applications to use, but it would be much cheaper than a single overlay application with dozens of 2d overlays.
The pie in the sky dream of course would be if Frontier allowed us to actually inject (shaded) 3d textured models into the game's own rendering path. If not dynamically, perhaps as a mod that I could make a bridge to the overlay to control. That would be a bit cheaper to render; that would eliminate the overlay limit; that would allow for dynamic 3d objects; that would eliminate the moving offset between in-cockpit elements and overlay elements; and that would also as a bonus mean that holo buttons can be "behind" parts of the cockpit instead of messing with your depth perception when they intersect.
I was looking at GuiFocus just now. Is the update speed related to `WaitForSecondsRealtime(1f)`, or just the update frequency from Elite? Maybe `FileShare.ReadWrite` helps, unless Elite deletes rather than truncating the file. A hybrid approach combining known mode switches (e.g. pressing a button bound to switch panel, grabbing a stick) with status updates might help too.
Wouldn't an HMD perspective overlay mean you lose the ability to render things relative to the game world? Not sure I understand anyway; I'm new to this stuff. Actually, I was messing around with adding a quad to the scene (under ShipCockpitUI, with the interaction/UI layers and a mesh renderer with some random texture), but I couldn't see it in VR when running the game. Can you help?
I was looking at GuiFocus just now. Is the update speed related to `WaitForSecondsRealtime(1f)`, or just the update frequency from Elite? Maybe `FileShare.ReadWrite` helps, unless Elite deletes rather than truncating the file. A hybrid approach combining known mode switches (e.g. pressing a button bound to switch panel, grabbing a stick) with status updates might help too.
Currently the delay may be related to the 1s delay between reads of the Status.json file. We might be able to reduce this part of the delay by using a `FileSystemWatcher` so we can read the file the instant it is updated. However, there is no guarantee that ED writes to Status.json as soon as the GuiFocus changes; they may also run an internal polling loop that checks for changes to write at its own interval.
`FileShare.ReadWrite` won't help this issue, though I'll keep it in mind since it looks like something that could avoid a rare race condition where ED tries to write to the file the moment we're reading it and causes the file open to fail.
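Sketched together, the `FileSystemWatcher` idea plus a `FileShare.ReadWrite` read would look something like this (illustrative only; the class and names are invented, not from the repo):

```csharp
// Sketch: react to writes to Status.json instead of polling on a fixed 1s interval,
// and open the file with ReadWrite sharing so a read that overlaps one of ED's
// writes doesn't throw.
using System.IO;

class StatusWatcher
{
    readonly FileSystemWatcher watcher;

    public StatusWatcher(string journalDir)
    {
        watcher = new FileSystemWatcher(journalDir, "Status.json");
        watcher.NotifyFilter = NotifyFilters.LastWrite;
        watcher.Changed += (sender, e) => ReadStatus(e.FullPath);
        watcher.EnableRaisingEvents = true;
    }

    static void ReadStatus(string path)
    {
        // FileShare.ReadWrite lets us read while ED still has the file open for
        // writing; a retry would still be wise in case we catch a partial write.
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (var reader = new StreamReader(fs))
        {
            string json = reader.ReadToEnd();
            // ...parse GuiFocus as in the earlier sketch...
        }
    }
}
```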
Wouldn't an HMD perspective overlay mean you lose the ability to render things relative to the game world? Not sure I understand anyway; I'm new to this stuff.
The overlay would be fixed. The game view is rendered using an in-game camera that follows the HMD's viewpoint. And SteamVR handles overlays by doing a similar 3d-space render, from the HMD's viewpoint, of the 2d overlays you declare in 3d space, except this rendered image has a transparent background so it can be laid over the top of the game view, making them appear to be part of the same 3d space.
My suggestion is basically allowing me to skip the "declare 2d overlays and their positions for SteamVR to render from the HMD's perspective" and instead use a single camera in Unity that tracks the HMD (just like the game does) to render a transparent overlay directly.
Actually, I was messing around with adding a quad to the scene (under ShipCockpitUI, with the interaction/UI layers and a mesh renderer with some random texture), but I couldn't see it in VR when running the game. Can you help?
Absolutely nothing in the scene is visible in VR. The scene and the tracking going on in there are purely there to help visualize things for development and to make use of Unity's Transforms so the complex matrix math stays easy.
Anything visible in VR is actually a custom OpenVR overlay. Some of the game objects represent those overlays (like the buttons and the "{...}Overlay" objects in some of the UI based panels). In general these objects create an OpenVR overlay, take a 2d texture (a flat image) and set the overlay's texture to it, set the overlay color (which tints the texture's white areas), and tell OpenVR where in 3d space the overlay is placed. The holo buttons bypass Unity content entirely and just set the overlay's texture from the image file that represents the button. The dynamic Unity UI based stuff you see in the scene uses an orthographic Unity camera to render a 2d image of the UI, which is then set as the overlay's texture.
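As a rough illustration of that flow (not the actual holo button code; the key/name strings, width, and pose are placeholders, and it assumes OpenVR has already been initialised):

```csharp
// Illustrative holo button setup via the Valve.VR C# bindings.
using Valve.VR;

static class HoloButtonOverlayExample
{
    public static ulong CreateButtonOverlay(string imagePath, float r, float g, float b, HmdMatrix34_t pose)
    {
        ulong handle = OpenVR.k_ulOverlayHandleInvalid;
        var err = OpenVR.Overlay.CreateOverlay("example.holoButton", "Example Holo Button", ref handle);
        if (err != EVROverlayError.None) return OpenVR.k_ulOverlayHandleInvalid;

        // Texture comes straight from the button's image file on disk.
        OpenVR.Overlay.SetOverlayFromFile(handle, imagePath);
        // The colour is multiplied into the texture, tinting its white areas.
        OpenVR.Overlay.SetOverlayColor(handle, r, g, b);
        OpenVR.Overlay.SetOverlayWidthInMeters(handle, 0.05f);
        // Place the overlay at a fixed pose in the standing tracking space.
        OpenVR.Overlay.SetOverlayTransformAbsolute(handle, ETrackingUniverseOrigin.TrackingUniverseStanding, ref pose);
        OpenVR.Overlay.ShowOverlay(handle);
        return handle;
    }
}
```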
You won't be able to get any 3d object from Unity to render in the overlay. OpenVR overlays only support static 3d object models you set using an absolute path to the file on disk. And I haven't even experimented to see how well this works yet.
Ok I get it - give SteamVR a whole scene to render over what you're seeing, rather than individual smaller overlays.
Thanks, I see you're adding those Holographic{...} components to stuff you want to push to OpenVR. Makes sense now :)
As you say, a lot depends on how often Status.json is updated. I'll see if a more frequent `File.GetLastWriteTime` check plus `FileShare.ReadWrite` can help me get at them panels.
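Something along these lines, I'm thinking (just a sketch with invented names):

```csharp
// Sketch: poll Status.json's last write time on a short interval and only
// re-read the file when it has actually changed.
using System;
using System.Collections;
using System.IO;
using UnityEngine;

public class StatusPollLoop : MonoBehaviour
{
    public string statusPath;
    DateTime lastWrite;

    IEnumerator Start()
    {
        while (true)
        {
            var writeTime = File.GetLastWriteTimeUtc(statusPath);
            if (writeTime != lastWrite)
            {
                lastWrite = writeTime;
                // Re-read Status.json here, opening it with FileShare.ReadWrite.
            }
            // Much shorter than a 1s wait, but still cheap since we only stat the file.
            yield return new WaitForSecondsRealtime(0.1f);
        }
    }
}
```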
So I'm running some "ungrabbed outputs" to a second vJoy device, which are active when the respective controller interaction point isn't grabbing anything. Other virtual control surfaces or better GuiFocus detection would probably be better, but I'll be using this while that stuff is figured out. +1 for that ActionManager, v. helpful :)
https://github.com/flavourous/elite-vr-cockpit/tree/feature/ungrabbedVjoyOutputs
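Very roughly, the idea looks like this (names invented; this is not the code in that branch):

```csharp
// Sketch of "ungrabbed outputs": when a hand isn't grabbing a virtual control,
// forward its input to a second vJoy device instead.
using vJoyInterfaceWrap;

class UngrabbedOutputs
{
    readonly vJoy joystick = new vJoy();
    readonly uint deviceId;

    public UngrabbedOutputs(uint secondDeviceId)
    {
        deviceId = secondDeviceId;
        if (joystick.vJoyEnabled())
            joystick.AcquireVJD(deviceId);
    }

    // Called every frame with the hand's current state.
    public void Update(bool isGrabbing, bool triggerPressed, float trackpadX)
    {
        if (isGrabbing) return; // grabbed input keeps going to the normal virtual controls

        joystick.SetBtn(triggerPressed, deviceId, 1);
        // vJoy axes run roughly 0..32768; re-centre a -1..1 trackpad value.
        joystick.SetAxis(16384 + (int)(trackpadX * 16384f), deviceId, HID_USAGES.HID_USAGE_X);
    }
}
```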
This is where I use VoiceAttack or VoiceMacro to lock those menus. And I have a voice command to back out of them too
Though more recently I've gotten used to pointing my nose just right to keep the menus up.
Yeah, that's just so I can play while the other stuff is figured out. Another control surface (e.g. a holo keyboard that you just grab) could be cool. It's fun to lose control of your ship in order to work with panels; it makes sense, since you have to grab different things to interact with them. Dunno though.
☹️Box areas somewhere in the panel vicinity which make the controller switch to UI navigation mode (and maybe focus the panel) when you hold your controller in them? ...but I can't really represent that kind of 3d zone in 2d overlays to show where they are in edit mode.
Actually I suppose I could just represent the area with a 2d rectangle and not precisely represent the vertical volume visually.
Still a little unsure about adding boxes that would essentially disable your ship controls whenever you put your hand in them; you could do that accidentally, and it wouldn't be undone when your hand leaves.
I may also need to think about adding some sort of tab switching area to the box, because UI control via box regions like this would be a one-handed operation and 2-axis input isn't always available.
Edit: Although this idea may not work as well for the chat panel, as lifting your hand up every time you need to view mission messages may be annoying.
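The containment check itself would be simple enough; a purely illustrative sketch with invented names:

```csharp
// Sketch: a configurable box near a panel; if the tracked controller's position
// is inside it, that hand would switch to UI navigation mode.
using UnityEngine;

public class PanelNavZone : MonoBehaviour
{
    // The footprint could be shown as a 2d rectangle in edit mode,
    // with the vertical extent left implicit. Axis-aligned for simplicity.
    public Vector3 size = new Vector3(0.3f, 0.2f, 0.3f);

    public bool Contains(Vector3 controllerWorldPos)
    {
        var bounds = new Bounds(transform.position, size);
        return bounds.Contains(controllerWorldPos);
    }
}
```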
I'm having trouble navigating the panels:
- I do not use "look to activate" on the panels. Personal taste; usually I use 1-2-3-4 for the panels, so I've tried adding holo buttons for that, which is a good start.
- Navigating panels can only be done by binding the hats to UI navigation, however:
- Having to change panel and then navigate it is clumsy: ungrab, panel button, grab, navigate...