Autodesk / maya-usd

A common USD (Universal Scene Description) plugin for Autodesk Maya

Making a ProxyShape a live surface is broken in VP2.0 #263

Open dj-mcg opened 4 years ago

dj-mcg commented 4 years ago

Describe the bug
No geometry is created when trying to use a create tool with a ProxyShape that has been made live in Viewport 2.0.

Steps to reproduce

  1. Run the testProxyShapeLiveSurface test and observe the failed assert.

Expected behavior
The test passes with no errors.

Additional context
This test is part of a suite of tests that are run internally at Pixar. They are not run as part of the external Pixar usdMaya plugin tests, since they make assumptions about the runtime environment that may not be valid for all users. However, moving forward, these tests should be run internally within Autodesk to validate correctness when testing any changes or fixes to the usdMaya plugin.

kxl-adsk commented 4 years ago

This is because of https://github.com/Autodesk/maya-usd/blob/dev/lib/nodes/proxyShapeBase.cpp#L1013

@mattyjams Is code in https://github.com/Autodesk/maya-usd/blob/81aff08bec82e7323a4a2e6d69b2d6420306f3d5/lib/render/pxrUsdMayaGL/batchRenderer.cpp#L1105 dependent on a specific render delegate or can be shared by any? Maybe we could bring this down to the base class?

santosd commented 4 years ago

I was able to repro the issue in my own scene using Maya USD. It currently doesn't support making a USD element a live surface, so we will need to prioritize this one.

mattyjams commented 4 years ago

Hey @kxl-adsk, sorry for the extremely delayed response here, but I'm just starting to dig into this to try to unblock some internal user testing of the VP2.0 render delegate.

I could definitely bring down and/or refactor the closest point delegate into something that should work for both the Pixar batch renderer and the VP2.0 render delegate, but there's a tricky hitch for the latter.

Ultimately, we'll be executing a Hydra pick task using an HdEngine and the HdRenderIndex that the VP2.0 render delegate/MPxSubSceneOverride for the shape currently owns. As far as I can tell, the render delegate is not setting root paths on its HdRprimCollection, so we might get away without having to query it for that, and we can instead just use the default absolute root path. Whatever handles the closestPoint() callback could probably manage its own HdEngine, but I don't think we can get around needing the specific HdRenderIndex that the ProxyShapeDelegate currently owns.
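For concreteness, executing the pick would probably look something like the sketch below. This is only a sketch of the Hdx pick task API, not working code from the plugin; in particular, getting the render index pointer into the handler is exactly the open question, and in a real implementation the task controller and engine would presumably be created once and reused rather than per call.

```cpp
// Sketch: run a Hydra pick task against the render index owned by the shape's
// render delegate. Assumes USD's HdxPickTask API; how we obtain `renderIndex`
// is the open question discussed in this thread.
#include <pxr/base/gf/matrix4d.h>
#include <pxr/base/gf/vec2i.h>
#include <pxr/base/vt/value.h>
#include <pxr/imaging/hd/engine.h>
#include <pxr/imaging/hd/renderIndex.h>
#include <pxr/imaging/hd/rprimCollection.h>
#include <pxr/imaging/hd/tokens.h>
#include <pxr/imaging/hdx/pickTask.h>
#include <pxr/imaging/hdx/taskController.h>
#include <pxr/usd/sdf/path.h>

PXR_NAMESPACE_USING_DIRECTIVE

static bool
PickClosestHit(
    HdRenderIndex*    renderIndex,
    const GfMatrix4d& viewMatrix,
    const GfMatrix4d& projectionMatrix,
    HdxPickHit*       outHit)
{
    // Collection rooted at the absolute root, since the VP2 render delegate
    // doesn't appear to set root paths on its HdRprimCollection.
    HdRprimCollection pickables(
        HdTokens->geometry, HdReprSelector(HdReprTokens->refined));
    pickables.SetRootPath(SdfPath::AbsoluteRootPath());

    // The closestPoint() handler could own its own engine and task controller;
    // in real code these would be created once, not per pick.
    HdEngine          engine;
    HdxTaskController taskController(
        renderIndex, SdfPath("/closestPointPickTaskController"));

    HdxPickHitVector         hits;
    HdxPickTaskContextParams pickParams;
    pickParams.resolution       = GfVec2i(128, 128);
    pickParams.pickTarget       = HdxPickTokens->pickPrimsAndInstances;
    pickParams.resolveMode      = HdxPickTokens->resolveNearestToCenter;
    pickParams.viewMatrix       = viewMatrix;
    pickParams.projectionMatrix = projectionMatrix;
    pickParams.collection       = pickables;
    pickParams.outHits          = &hits;

    HdTaskSharedPtrVector tasks = taskController.GetPickingTasks();
    engine.SetTaskContextData(HdxPickTokens->pickParams, VtValue(pickParams));
    engine.Execute(renderIndex, &tasks);

    if (hits.empty()) {
        return false;
    }
    // worldSpaceHitPoint / worldSpaceHitNormal give us what closestPoint() needs.
    *outHit = hits.front();
    return true;
}
```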

We get the closestPoint() callback in the context of the shape itself though, so I'm not sure what would be the preferred way (if any way even exists) to get the MPxSubSceneOverride for a given shape so that we could cast it to a ProxyRenderDelegate and then call accessors to get the HdRenderIndex we need.

In the case of the Pixar batch renderer, we have access to the singleton UsdMayaGLBatchRenderer, which can look up the appropriate HdRprimCollection given a shape's DAG path and can then run tasks for that collection against its own HdRenderIndex. There's no such singleton in the VP2.0 render delegate case, nor is there a single HdRenderIndex that's shared across all MPxSubSceneOverrides.

Do you have any insight as to what would be the preferred way to get the MPxSubSceneOverride for a shape from inside the closestPoint() callback of that shape? Or is there some other way you had in mind for accessing the structures we need to execute the pick task that I'm not thinking of?

fabal commented 3 years ago

Hey @mattyjams, @kxl-adsk, would you have any thoughts on this one? I.e. any recommendation on how to get the structures needed by the pick task from the callback? We're about to try and implement something, but it won't look pretty. A more robust approach that could be contributed back is certainly preferable :)

kxl-adsk commented 3 years ago

We haven't done a ton of investigation on this one, but I can share my current thinking and where I would consider starting.

The (proxy) shape is responsible for providing the closestPoint(...) implementation. This shape can be visualized by one or more different renderers at a time (VP2 with the VP2RenderDelegate, all standalone Hydra render delegates via the mtoh plugin, and the legacy draw override approaches). This is why we can't really tie the implementation to a single drawing path.

I can see how the proxy shape could delegate the responsibility to one of the drawing paths via a registration mechanism. We can't count on VP2 to help with providing hit-test information, because it is all internal to its selection mechanism. So you would still need to go with the approach of leveraging its render index as described above.
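Roughly what I have in mind is a static hook on the proxy shape that a drawing path registers at plugin load time. A minimal sketch, with placeholder names (this is loosely patterned on the closest point delegate the Pixar batch renderer already registers, not an existing maya-usd API):

```cpp
// Illustrative sketch only: a registration hook on the proxy shape base class
// so that any drawing path can supply the closestPoint() computation.
// Names and signature are placeholders, not the actual class definition.
#include <functional>

#include <maya/MPoint.h>
#include <maya/MVector.h>

class MayaUsdProxyShapeBase /* : public MPxSurfaceShape */
{
public:
    using ClosestPointDelegate = std::function<bool(
        const MayaUsdProxyShapeBase& shape,
        const MPoint&                raySource,
        const MVector&               rayDirection,
        MPoint&                      outClosestPoint,
        MVector&                     outClosestNormal)>;

    // A drawing path (e.g. the VP2 render delegate plugin) registers its
    // implementation once when it is loaded.
    static void SetClosestPointDelegate(ClosestPointDelegate delegate)
    {
        _sClosestPointDelegate = std::move(delegate);
    }

    // Called from Maya's live-surface/snapping machinery; the shape simply
    // forwards to whichever drawing path registered a handler.
    bool closestPoint(
        const MPoint&  raySource,
        const MVector& rayDirection,
        MPoint&        theClosestPoint,
        MVector&       theClosestNormal)
    {
        if (!_sClosestPointDelegate) {
            return false;  // No renderer registered; nothing to hit-test.
        }
        return _sClosestPointDelegate(
            *this, raySource, rayDirection, theClosestPoint, theClosestNormal);
    }

private:
    static ClosestPointDelegate _sClosestPointDelegate;
};

MayaUsdProxyShapeBase::ClosestPointDelegate
    MayaUsdProxyShapeBase::_sClosestPointDelegate;
```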

We could also consider making the proxy shape completely independent from its drawing paths/consumers and providing the closestPoint(...) implementation directly, but I fear we don't have a performant way of implementing it. This is why I wouldn't start here.

Finally, does anyone know how Hydra implements hit testing? Can it do a software hit test? (I would assume this would be needed for any render delegate other than Storm.)

fabal commented 3 years ago

Thanks, and yes, the idea is to implement a closestPoint delegate backed by the VP2RenderDelegate. I still have the same questions @mattyjams asked above though, particularly:

Do you have any insight as to what would be the preferred way to get the MPxSubSceneOverride for a shape from inside the closestPoint() callback of that shape? Or is there some other way you had in mind for accessing the structures we need to execute the pick task that I'm not thinking of?

A pointer could be stored in a shared structure but I wonder if there is a better way to access the render index held in the SubSceneOverride.
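To be concrete, the shared structure I'm picturing is just a registry keyed on the shape's DAG path, filled in by the sub-scene override and queried from the closestPoint() callback. A sketch, with all names hypothetical:

```cpp
// Hypothetical shared registry mapping a proxy shape's DAG path to the
// MPxSubSceneOverride that draws it (and thus to its HdRenderIndex).
#include <map>
#include <string>

#include <maya/MDagPath.h>

class ProxyRenderDelegate;  // the VP2 sub-scene override for the proxy shape

class ProxyRenderDelegateRegistry
{
public:
    // Called from the sub-scene override's constructor/destructor.
    static void Register(const MDagPath& shapePath, ProxyRenderDelegate* drawOverride)
    {
        _sRegistry[shapePath.fullPathName().asChar()] = drawOverride;
    }

    static void Unregister(const MDagPath& shapePath)
    {
        _sRegistry.erase(shapePath.fullPathName().asChar());
    }

    // Called from the shape's closestPoint() callback to reach the render
    // index owned by its sub-scene override.
    static ProxyRenderDelegate* Find(const MDagPath& shapePath)
    {
        auto it = _sRegistry.find(shapePath.fullPathName().asChar());
        return it != _sRegistry.end() ? it->second : nullptr;
    }

private:
    static std::map<std::string, ProxyRenderDelegate*> _sRegistry;
};

std::map<std::string, ProxyRenderDelegate*> ProxyRenderDelegateRegistry::_sRegistry;
```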

mattyjams commented 3 years ago

Hey @fabal! Yeah, I started looking into this a while back, but we ended up determining that it may not be as useful as we thought. That was mainly due to the realization that the existing live surface functionality in Maya only allows setting a Maya shape as live. In the case of USD proxy shapes, the hierarchy and amount of geometry that shape represents could obviously be huge, so what we might want instead would be the ability to specify a particular UFE scene item as live. For our artists, that probably felt too granular, and they may want to make multiple items all live at the same time. It was increasingly feeling like live surface was not the mechanism we wanted to use.

In the case of the legacy Pixar batch renderer (and for picking in Storm in general), it's effectively a screen space-based solution. We invoke the renderer to generate ID, point, normal, and depth buffers over a small area around the screen space coordinate where you clicked, and we use those buffers to compute the data on the CPU that gets returned in the closestPoint() callback.
This is where that happens in the Pixar batch renderer: https://github.com/Autodesk/maya-usd/blob/0b7a931b8149afacfafd1027cb655158ff400ebe/lib/mayaUsd/render/pxrUsdMayaGL/proxyShapeDelegate.cpp#L42
And this is roughly the equivalent in Hydra: https://github.com/PixarAnimationStudios/USD/blob/82dd40ad31bcdecfb18c2f23e39a3ddf68ef10d4/pxr/imaging/hdx/pickTask.cpp#L381

So we can't truly address the "closest point" aspect of that callback with these solutions if you were to click somewhere off the surface, but you should get something valid if you click somewhere on it. For our purposes, that's an acceptable tradeoff, as the most common use case for us is projecting or snapping things onto specific points on surfaces.

We dropped the priority on "real" live surface support for the VP2.0 render delegate when we realized we could use an existing offscreen renderer that we wrote a while ago. It's effectively doing the same thing, but through the Maya API instead. It's unfortunately not currently in a state where we could contribute it back, but here's roughly how it works:

Once we have the depth data from that offscreen render, we can combine it with the camera matrices to unproject the depths and figure out positions, and we can also reconstruct estimated normals if we want them by creating a plane from nearby depths and computing the normal of that plane.
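For illustration, the unprojection math is roughly this (just a sketch, not our actual code; it assumes depth samples in [0, 1] and USD's GfMatrix4d row-vector convention):

```cpp
// Sketch: unproject a depth sample to a world-space point, and estimate a
// normal from a plane through neighboring unprojected samples.
#include <pxr/base/gf/matrix4d.h>
#include <pxr/base/gf/vec3d.h>

PXR_NAMESPACE_USING_DIRECTIVE

// (x, y) are normalized coordinates in [0, 1] across the rendered region,
// depth is the value read from the depth buffer (assumed in [0, 1]).
static GfVec3d
UnprojectDepth(
    double x, double y, double depth,
    const GfMatrix4d& viewMatrix,
    const GfMatrix4d& projectionMatrix)
{
    // Map to normalized device coordinates in [-1, 1].
    const GfVec3d ndc(2.0 * x - 1.0, 2.0 * y - 1.0, 2.0 * depth - 1.0);

    // Invert the world-to-clip transform; Transform() does the perspective divide.
    const GfMatrix4d invViewProj = (viewMatrix * projectionMatrix).GetInverse();
    return invViewProj.Transform(ndc);
}

// Estimate a normal from the hit point and two nearby unprojected samples
// (one offset in x, one offset in y) by taking the cross product of the edges.
static GfVec3d
EstimateNormal(const GfVec3d& p, const GfVec3d& pRight, const GfVec3d& pUp)
{
    return GfCross(pRight - p, pUp - p).GetNormalized();
}
```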

I was hoping to get some time to maybe take a crack at cleaning our mess of that up and see if I could get it in shape for a PR, but I don't see that happening any time soon. Hopefully that helps give you guys some ideas though?

fabal commented 3 years ago

Thanks for this @mattyjams. It does give some ideas (funnily enough, someone else here, AKA Aloys, came up with a similar suggestion yesterday). For us too, the main use case is snapping, which means I've hijacked this thread a little bit, as we can get by with the same trade-off as you did. I think for now we will go with this approximate solution, but we will definitely keep an eye on the offscreen renderer solution.

kxl-adsk commented 3 years ago

To answer your question @fabal

A pointer could be stored in a shared structure but I wonder if there is a better way to access the render index held in the SubSceneOverride.

The use of the word "delegate" might have been confusing in my response. I didn't propose putting the logic that leverages the render index into the proxy. My proposal was that the proxy would ask a registered handler/delegate to perform the closest point computation. This way we don't couple the proxy with a particular drawing implementation.

@mattyjams you raised some great points - the usefulness of supporting live surface on the proxy shape level is highly dependent on what's in the proxy shape.