aardvarkxr / hackathon-sep20

A framework to add intelligent occlusion and context awareness to Aardvark (Aperture) #9

Open Adil3tr opened 3 years ago

Adil3tr commented 3 years ago

What would this gadget do?

This gadget would allow Aardvark gadgets to have occlusion and basic contextual awareness in a roomscale or AR scene, through either a “simple” or an “advanced” mode depending on the user’s needs.

In the simple mode, the user places occlusion boxes by drawing simple shapes (spheres, rectangles, triangles) in VR and positioning them over objects in the roomscale environment. Using multiple shapes that clip through each other for a single object allows for higher fidelity. Other Aardvark apps would then use these boxes to know to occlude anything rendering behind the occlusion box/object. This is a roomscale function, however, meant for testing or for when the user stops for a short period and wants to add occlusion to an object in a program. It would also define the floor and prevent gadgets from rendering beneath it, perhaps even being relied on to prevent gadgets from moving below the floor at all, and the same with the ceiling if necessary. A laser with an interaction point, extended and retracted with the joystick, would allow the user to create a zone from a distance or mark walls outside the roomscale boundaries. These boxes would be shown at half opacity until finished, then activated in order to take effect and become invisible, occluding Aardvark gadgets while passing through to the VR layer.
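
To make the simple mode concrete, here is a minimal TypeScript sketch of what a user-placed occlusion zone’s data might look like. Every name and field here is invented for discussion; none of it comes from an existing Aardvark API.

```typescript
// Hypothetical data model for a user-placed occlusion zone ("simple mode").
// These names are illustrative only, not part of the Aardvark API.

type ZoneShape = "sphere" | "box" | "triangle";

interface Pose {
  position: [number, number, number];          // meters, room space
  rotation: [number, number, number, number];  // quaternion (x, y, z, w)
}

interface OcclusionZone {
  id: string;
  shape: ZoneShape;
  pose: Pose;
  scale: [number, number, number]; // extent of the shape on each axis
  active: boolean;                 // half opacity while editing, invisible once active
}

// Multiple overlapping shapes can approximate a single real object
// at higher fidelity, e.g. a desk built from two boxes.
const desk: OcclusionZone[] = [
  {
    id: "desk-top",
    shape: "box",
    pose: { position: [0, 0.75, -1], rotation: [0, 0, 0, 1] },
    scale: [1.2, 0.05, 0.6],
    active: true,
  },
  {
    id: "desk-base",
    shape: "box",
    pose: { position: [0, 0.37, -1], rotation: [0, 0, 0, 1] },
    scale: [1.1, 0.72, 0.5],
    active: true,
  },
];
```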

The advanced mode would let the user go beyond placing shaped occlusion boxes and recreate the scene they are in within Aardvark, “tracing” the environment by placing walls, a floor, and a ceiling, as well as objects such as windows, picture frames, chairs, and more complex shapes or free-drawn occlusion zones. This would allow for more advanced occlusion.

Advanced mode would also allow occlusion boxes to be marked with a context such as TV, window, door, couch, or chair. With this, the gadget can feed other gadgets information they could use to provide advanced AR functionality, including skyboxes visible out a window, things walking in through the door, movies playing on televisions, pictures in the frames, awareness of where the chair in the room is, and so on. Developers would be able to add new contexts in their gadgets, allowing users to label an occlusion zone with that context and expanding the scope of this use case. Other gadgets could also use these occlusion zones as collision geometry for their objects.
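
A rough sketch of how context labels and gadget-registered contexts might be represented follows; every name here is hypothetical.

```typescript
// Hypothetical sketch of context labels on occlusion zones. Built-in
// contexts cover common room features; gadgets could register more.

type BuiltInContext =
  | "tv" | "window" | "door" | "couch" | "chair"
  | "wall" | "floor" | "ceiling";

interface ContextualZone {
  zoneId: string;
  context: string;        // a BuiltInContext or a gadget-registered label
  registeredBy?: string;  // gadget that contributed a custom context, if any
}

// Registry a gadget could use to extend the vocabulary, e.g. a fish-tank
// gadget registering "aquarium" so users can label their real tank.
class ContextRegistry {
  private contexts = new Set<string>([
    "tv", "window", "door", "couch", "chair", "wall", "floor", "ceiling",
  ]);

  register(gadgetId: string, context: string): void {
    this.contexts.add(context.toLowerCase());
    console.log(`${gadgetId} registered context "${context}"`);
  }

  isKnown(context: string): boolean {
    return this.contexts.has(context.toLowerCase());
  }
}
```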

This gadget thus gives other gadgets a foundation for handling occlusion and environmental awareness, essential components of AR, without machine vision.

Both of these modes assume a way to reliably center the player’s “tracing” in an environment, so the gadget should allow an easy marker to be placed that anchors and aligns a set of occlusion zones.
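
One way to picture that anchoring (a simplified, yaw-only sketch with entirely hypothetical names): zones are stored relative to the marker and converted back into room space from the marker’s current pose.

```typescript
// Hypothetical sketch: zones are saved in anchor-local coordinates so a
// traced room layout can be re-aligned from the marker's current pose.
// Rotation is reduced to yaw about the vertical axis to keep it short.

interface AnchorPose {
  position: [number, number, number]; // room-space position of the marker
  yawRadians: number;                 // heading of the marker
}

// Convert a zone position stored relative to the anchor back into room space.
function anchorToRoom(
  anchor: AnchorPose,
  local: [number, number, number]
): [number, number, number] {
  const c = Math.cos(anchor.yawRadians);
  const s = Math.sin(anchor.yawRadians);
  return [
    anchor.position[0] + local[0] * c - local[2] * s,
    anchor.position[1] + local[1],
    anchor.position[2] + local[0] * s + local[2] * c,
  ];
}
```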

This gadget would also work over 3D passthrough in order to add occlusion and context to the user’s room, which is already obviously roomscale. Occlusion zones could also be used for a related function by another gadget: defining a spatial zone of 3D camera passthrough in order to bring a real-world object into VR, or to do the reverse and add a virtual object into the AR passthrough. This would integrate with "Chells," a gadget with that purpose.

The gadget could communicate with someone else’s gadget in order to render for them an approximation of the scene/room the primary user is in. Occlusion zones could also serve as a base for a game to let a player “trace in” a table or other surface to use in the game. Pre-rendered photogrammetry to create a fuller telepresence is outside the scope of this gadget.

After some testing in both AR and VR, Aardvark gadgets appear to always render over VR programs. When it comes to the user’s in-game hands, this is a major problem: it can make interacting with gadgets harder and even headache-inducing. Occlusion zones built around the player’s hands may be an important part of using Aardvark comfortably. This may require custom profiles for individual games, covering the range of motion and orientation of the player’s hands in that software, as well as a generic profile.
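
A per-game hand profile might carry something like the following; the fields and numbers are illustrative guesses, not measurements from any real title.

```typescript
// Hypothetical per-game hand occlusion profile: a capsule roughly matching
// the in-game hand model writes depth in front of Aardvark gadgets so the
// VR application's hands read as being in front of them.

interface HandOcclusionProfile {
  appKey: string;               // which game this applies to, or "generic"
  capsuleRadiusMeters: number;  // thickness of the occluding capsule
  capsuleLengthMeters: number;  // how far it extends along the fingers
  offsetFromController: [number, number, number]; // hand position relative to the tracked controller
}

const genericProfile: HandOcclusionProfile = {
  appKey: "generic",
  capsuleRadiusMeters: 0.05,
  capsuleLengthMeters: 0.18,
  offsetFromController: [0, -0.02, -0.08],
};
```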

Who would use this gadget?

This gadget adds to Aardvark advanced AR functionality that is usually the product of machine vision. This closes much of the gap Aardvark has from not having access to machine vision in either VR or AR. Developers designing for AR principles and the most likely AR technology, who expect to build around the capabilities machine vision normally provides, can now use occlusion and context awareness in their designs and prototypes as they would expect. Especially when used with 3D passthrough, this makes Aardvark an excellent choice and puts PCVR well beyond most of the competition.

It also allows consumers to make use of these same tools for AR games or utilities, or in fully roomscale VR games. They could use it in social functions or games with others, though for the other participants it would appear as a standard VR object.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding; I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Creating the foundation that lets this gadget feed other gadgets this information in a form they can understand, so that they can be occluded properly or gain awareness of context, seems to be the most difficult aspect.
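
As one possible framing of that interface (purely hypothetical, reusing the OcclusionZone shape from the earlier sketch), consuming gadgets might only need a small query surface:

```typescript
// Hypothetical query surface a consuming gadget might see. The point is
// that occlusion and context sit behind one small interface, so other
// gadgets never need to know how the zones were authored.

interface SceneQuery {
  // All currently active occlusion zones, for gadgets that handle
  // occlusion or collision themselves.
  listZones(): OcclusionZone[];

  // Zones carrying a particular context label, e.g. "window" for a
  // skybox gadget or "chair" for a seating-aware game.
  zonesWithContext(context: string): OcclusionZone[];

  // Notification when the user edits or retraces the room.
  onZonesChanged(callback: (zones: OcclusionZone[]) => void): void;
}
```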

JoeLudwig commented 3 years ago

The Aardvark way of building this would be to allow any gadget to contribute "occlusion meshes" to the system that write to the z-buffer in a pre-pass before any of the opaque geometry is drawn. That would prevent anything else from drawing inside of those meshes without the other gadgets needing to be aware of the occlusion meshes at all. This part of things is mainly a graphics programming task down in the rendering code (and a bit of plumbing.) That's not particularly difficult work.
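
For readers unfamiliar with the technique, here is a minimal WebGL-flavored TypeScript sketch of a depth-only pre-pass. This is not Aardvark's actual rendering code; it only illustrates how writing occluders to the z-buffer first keeps later geometry from appearing behind them.

```typescript
// Placeholder types for whatever mesh abstraction the renderer uses.
interface Mesh { vertexCount: number; /* buffers, shader bindings, ... */ }
declare function drawMesh(gl: WebGL2RenderingContext, mesh: Mesh): void;

function renderFrame(gl: WebGL2RenderingContext, occluders: Mesh[], gadgets: Mesh[]): void {
  gl.enable(gl.DEPTH_TEST);
  gl.depthFunc(gl.LESS);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

  // Pre-pass: occlusion meshes write depth only, no color.
  gl.colorMask(false, false, false, false);
  for (const mesh of occluders) {
    drawMesh(gl, mesh);
  }

  // Main pass: opaque gadget geometry. Fragments behind an occlusion
  // mesh fail the depth test, so nothing draws "inside" the real object.
  gl.colorMask(true, true, true, true);
  for (const mesh of gadgets) {
    drawMesh(gl, mesh);
  }
}
```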

That leaves the not-trivial part of this, which a gadget should be well suited for: editing the meshes live inside VR. That would be a pretty neat project to knock out next weekend.

Adil3tr commented 3 years ago

@JoeLudwig Do you envision Aardvark as feeding information to games as well? For example, I spoke to a developer who showed interest in enabling his existing game to work over the 3D pass-through (it's a static puzzle game similar to the puzzles in Alyx). Assuming he can get that working over the pass-through, could his, or anyone else's, game be aware of these meshes as well? Or would it only be something that could work on other Aardvark gadgets?