LucaScheller / VFX-UsdAssetResolver

Usd Asset Resolver Reference Implementations
https://lucascheller.github.io/VFX-UsdAssetResolver/
Apache License 2.0

Refresh / Prune the caching pairs in Houdini #12

Closed: dovanbel closed this issue 2 months ago

dovanbel commented 6 months ago

Hi,

I'm using the CachedResolver in our pipe, and I'm starting to test it. We are using ShotGrid. Our identifiers are shotgrid templates with fields that are converted by the PythonExpose module to actual filepaths. This works as expected, which is really great. Thanks a lot for this plugin.
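For illustration, such an identifier can be mapped to a path with a few lines of Python. This is a hypothetical sketch only: the field names and path layout are made up, and a real pipeline would query ShotGrid's template system instead.

from urllib.parse import parse_qsl

def identifier_to_path(identifier):
    # Split "template=...&Asset=...&Step=..." into a field dict.
    fields = dict(parse_qsl(identifier))
    # Hypothetical path layout; a real resolver hook would look the
    # template up in ShotGrid and apply the fields there.
    return "/server/projects/{Asset}/usd/{Step}/{Asset}_{task_name}.{extension}".format(**fields)

identifier = ("template=usd_asset_layer&sg_asset_type=Prop&Asset=Apocalypse"
              "&Step=Model&task_name=mymod&extension=usd")
print(identifier_to_path(identifier))
# /server/projects/Apocalypse/usd/Model/Apocalypse_mymod.usd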

Note that I'm not using a push-pipeline. I'm keeping the caching pairs to their initial state, until the artist decides to pull the latest version in.

I'm hitting two issues.

1/ In Houdini, I have a hard time triggering a refresh of the context. What I mean is: if I have a caching pair such as identifier=assetA: /some/path/assetA_v001.usd and I then change it to identifier=assetA: /some/path/assetA_v002.usd

The change is not reflected in Houdini, even if I force a recook (cook(force=True)) on all the sublayer/reference nodes, and even if I call cached_resolver.RefreshContext(context_collection). The only thing that works is bypassing the node(s) and re-enabling them. I know this is a Houdini-specific question, but if you have a better way to force a refresh of the context I would be glad to hear about it.

2/ The other issue is how to 'prune' the context. Say I have two sublayer nodes, A and B, which gives me a context with two caching pairs. If I delete sublayer node B, I would like to prune the caching pairs so that only the pair for node A is kept in the context, without re-evaluating it (because I want it to stay pinned). This is a very simple example, but in reality, with references inside references inside references, it becomes more complex. I think in this case I should store the current caching pairs, clear all caching pairs, force a re-creation of the context, and then compare the old caching-pair dictionary to the new one; I could then remove the unused pairs while keeping the state of the used ones. But then I hit issue 1/ again: I'm unable to find a proper way to refresh the context. Maybe a future version of the CachedResolver plugin could add a pruning mechanism?
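A rough sketch of that compare-and-prune idea, assuming the context object exposes GetCachingPairs(), ClearCachingPairs() and AddCachingPair() as described in the plugin docs (verify the exact names against your build):

# Sketch only; "context" is the CachedResolver context, obtained as in
# the Python LOP snippet later in this thread.
old_pairs = dict(context.GetCachingPairs())  # identifier -> pinned path
context.ClearCachingPairs()
# ... force the stage to recompose here so the context is repopulated ...
for identifier in context.GetCachingPairs():
    if identifier in old_pairs:
        context.AddCachingPair(identifier, old_pairs[identifier])  # restore the pin
# Identifiers absent from the rebuilt context have effectively been pruned.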

Regards

LucaScheller commented 6 months ago

Hey @dovanbel, thanks for checking out the plugin :) Great to hear that it is being battle-tested in a production env. To answer your questions:

  1. I don't know a better way; maybe you can file a SideFX bug report and report the answer back here? It'd be interesting to hear whether it has to do with how Houdini handles stages per node or whether it is a USD-specific problem. (Maybe first check whether it happens in usdview too.)
  2. USD stores all layers in a "Layer Registry", a kind of "Singleton" pattern that tracks every layer that has ever been opened. The resolver itself is not stage aware: it just takes identifiers and resolves them (using the context to guide the resolution). So any layer that you open (in any way, even a plain pxr.Sdf.Layer.FindOrOpen() call) goes through the resolver. There isn't really a concept of "this layer is different because I loaded it on a different Houdini node", so it is up to the user to prune the context of layers that are no longer relevant. A context can be re-used by many stages, and I don't think there is a method to probe which stages still use it other than iterating through open stages (which in Houdini could be any node) and checking their context. The solution for now would be to call node.stage().GetUsedLayers() and then prune all identifiers that are not in those layers; that way you don't have to trigger a refresh, as sketched below.
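A minimal sketch of that pruning step, again assuming the context exposes GetCachingPairs() and RemoveCachingPair() per the plugin docs, and using a hypothetical node path:

import hou

node = hou.node("/stage/sublayer1")  # hypothetical LOP node
stage = node.stage()
context = stage.GetPathResolverContext().Get()[0]  # the CachedResolver context

# Drop every caching pair whose identifier no longer backs a layer on the stage.
used = {layer.identifier for layer in stage.GetUsedLayers()}
for identifier in list(context.GetCachingPairs()):
    if identifier not in used:
        context.RemoveCachingPair(identifier)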

Cheers, Luca

dovanbel commented 6 months ago

Thanks @LucaScheller. I'll try to get the answer from Side Effects or from my own further tests in Houdini.

dovanbel commented 6 months ago

Hi,

FYI, I tested the cached resolver in usdview (the usdview that comes bundled with Houdini) and can confirm that updating the caching pairs and then calling stage.Reload() triggers a redraw of its viewport and shows the updated scene. In other words, it works as expected in usdview.
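For reference, that test looks roughly like this when typed into usdview's Python interpreter (usdviewApi is provided by usdview; the caching-pair API is the same one used in the Houdini snippet later in this thread):

from pxr import Ar

stage = usdviewApi.stage                           # the stage usdview has open
context = stage.GetPathResolverContext().Get()[0]  # the CachedResolver context

context.AddCachingPair("assetA", "/some/path/assetA_v002.usd")  # repoint the pair
Ar.GetUnderlyingResolver().RefreshContext(stage.GetPathResolverContext())
stage.Reload()  # usdview recomposes and redraws with the new resolution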

I noticed that closing the scene viewport in Houdini and recreating it does the trick... Anyway, I'll contact Side Effects to find out how to properly reload the stage in Houdini after a modification of the resolver's context.

NB: thanks @LucaScheller for the hint about node.stage().GetUsedLayers(). That pointed me in the right direction, and I now have a working solution for pruning the caching pairs.

dovanbel commented 5 months ago

Hi @LucaScheller

I sent a bug report to Side Effects.

The gist of my bug report: I create a sublayer node whose file path uses my custom identifier. Behind the scenes, the CachedResolver resolves the identifier to an actual file path.

I then add a Python LOP:

import hou  # implicit inside a Python LOP, shown here for completeness
from pxr import Ar
from usdAssetResolver import CachedResolver  # loads the resolver's Python bindings

node = hou.pwd()
stage = node.editableStage()
cached_resolver = Ar.GetUnderlyingResolver()
context_collection = stage.GetPathResolverContext()
cachedResolver_context = context_collection.Get()[0]  # the CachedResolver context

identifier = "template=usd_asset_layer&sg_asset_type=Prop&Asset=Apocalypse&Step=Model&task_name=mymod&extension=usd"
path = "//server01/shared2/projects/donatdev2024/assets/Prop/Apocalypse/usd/layers/Model/Apocalypse_mymod_v006.usd"
cachedResolver_context.AddCachingPair(identifier, path)  # repoint the identifier
cached_resolver.RefreshContext(context_collection)       # ask the resolver to re-resolve

Nothing happens in the viewport of Houdini.

I then add these lines to the end of the Python LOP:

hou.node('/stage/Apocalypse_Model_mymod').cook(force=True)              # force a recook
hou.node('/stage/Apocalypse_Model_mymod').parm("reload").pressButton()  # press its reload button
hou.lop.reloadLayer('/stage/Apocalypse_Model_mymod', recursive=True)

Still no change in the viewport.

Two things do work: bypassing the sublayer node and re-enabling it, and closing the scene viewport and recreating it (both mentioned earlier in this thread).

I also told Side Effects that in usdview this works fine: after doing the AddCachingPair() I call stage.Reload() and I see the update.

The answer from Side Effects:

It is completely unsafe to make this kind of stage-level change using the Python LOP. Because you are not doing this through proper LOP node channels, the LOP system has no idea you're doing it, and so it can't correct for it. In this particular case, the problem is caused by the fact that the viewport has its own copy of the stage generated by LOP nodes, and Houdini keeps the two stages "in sync". The proper LOP-based way to do this is with the Configure Stage LOP. I would recommend putting the Configure Stage at the top of your LOP node chain so that when you make changes, all your nodes are forced to recook. In theory, using a Configure Stage partway down your LOP network should also work (though I've never tried it, so I wouldn't be too surprised if something goes wrong). The problem with having the Configure Stage partway down the chain is that if you move the display flag below, then above, then below the Configure Stage, the stage should recompose based on the changing Ar context each time. Maybe this is what you want? It seems to me like it would be confusing...

Anyway, I can't guarantee that this will "just work", so please reactivate this if you have any issues. But absolutely don't ever try to make this kind of stage-level manipulation with a Python LOP.

I replied that I would need a hou Python command to tell Houdini that, even though the file path did not change on the sublayer node, it should pretend it did and refresh the stage.

dovanbel commented 5 months ago

I got a follow-up answer from Side Effects support.

Have you considered using an asset resolver context object associated with the stage, rather than a global configuration that is expected to simultaneously affect all stages? Using this approach you could avoid the need to "dirty everything everywhere" when you make a change. This is the pattern supported by LOPs through the "asset resolver context" parameters on the Configure Stage LOP and LOP Network nodes. But regardless, I see what you're saying now; I was misunderstanding how you were communicating context information to your resolver.

The only thing I can suggest that would actually make this kind of setup safe would be to mark every LOP node in your hip file dirty whenever you make this sort of change. Or at least every Sublayer, Reference, Load Layer, and Configure Stage LOP: anything that might be impacted by a change to the asset resolver. A call to hou.Node.forceRecook() should do it? Because you really do need to re-run all these nodes to get them to reload all external USD files from disk.

We do already have an outstanding RFE that I see as slightly similar to this - if a USD file on disk is modified from outside the Houdini process, there is no easy way to tell Houdini to "recook everything that might be loading this file on disk". So again, the only solution for this that works right now is basically marking every node that might load a file from disk as dirty.

Sorry, forceRecook isn't a HOM method... But I think you can call hou.LopNode.cook(force=True).
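A sketch of that "mark everything dirty" workaround, assuming the LOP network lives at the default /stage path:

import hou

# Force-recook every LOP node so each one reloads its external USD files from disk.
for node in hou.node("/stage").allSubChildren():
    if isinstance(node, hou.LopNode):
        node.cook(force=True)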

I will try this solution next week.

LucaScheller commented 5 months ago

Thanks for the follow-up and for sharing the info!

dovanbel commented 5 months ago

The nice developers of Houdini added a new hou.lop method in daily build 20.0.688:

hou.lop.forceReloadAllFilesFromDisk(reload_viewports=False)

Dev notes: Added hou.lop.forceReloadAllFilesFromDisk, which invalidates all USD stages authored by LOPs and reloads them. This reloads all USD data and also allows the asset resolver to re-resolve all assets.

The reload_viewports boolean flag can be used to force a refresh of the viewports.
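With that, the whole dance above collapses to a single call after updating a caching pair (e.g. via the Python LOP snippet earlier in this thread):

import hou

# After repointing a caching pair, invalidate every LOP-authored stage,
# re-resolve all assets, and refresh the viewports in one call.
hou.lop.forceReloadAllFilesFromDisk(reload_viewports=True)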

Tested and working on my side.

Regards

LucaScheller commented 5 months ago

Awesome, thanks for the update :) I guess we can mark this as resolved then?

LucaScheller commented 2 months ago

Closing as resolved.