Open joaovbs96 opened 1 year ago
Hi @joaovbs96 ,
Unfortunately, you can't specify a source geometry to render with. There's only a built-in 2d quad which is used for both rendering and providing geometric attributes to the node being baked. The plane is mostly fixed with a few transform options but will not take as input a texture coordinate set from a given geometry -- i.e. it's not an atlas creator.
Seems worth considering adding this, even if as a first step to improve the texture coordinate set input.
If it's possible to post the material / geometry, or just the requirements for each, that would help define how to support what you want. (There may be a way to script this outside the baker, but that would require more information.)
Thanks.
Hi @kwokcb, thanks for replying and sorry for the time it took me to get back to you!
I managed to get an example material where we see an issue. If we use, for example, the default marble material of the MaterialX node editor, and the attenuation dragon from Khronos' glTF sample models as the geometry, this is what we get:
If we then bake the material with the script, simply running `python .\baketextures.py [input] [output]`, and load the baked material in the viewer, this is what we get:
Ideally, we want the baked material to look the same as the original. The only way I can imagine a procedural material like this working correctly with the baker script is if we are somehow able to provide a geometry with the appropriate UVs for the baking, so that an atlas can be generated for it.
Files used for tests above: marble_test.zip
Thanks for the example @joaovbs96. In this case part of the mismatch is due to 3d positions (as it's a 3d texture), but if the uv range differs between the input geometry and the sample plane, issues will also arise.
The other interesting related issue is that if the input geometry is in object space vs world space, I don't think baking handles this properly.
So for now, until these are addressed, the configurations of baking that should work are 2d procedurals. For the range case, it's probably worth exposing all of the bake parameters in the `baketextures.py` script, as it should be handled with an existing option (though you can script these options yourself).
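As a concrete sketch of exposing those range options, the script could accept hypothetical flags and forward them to the baker. The flag names below are made up for illustration, and the setter names (`setTextureSpaceMin`, `setTextureSpaceMax`, `setWorldSpaceNormals`) come from the C++ `TextureBaker` API, assumed here to be mirrored by the Python bindings:

```python
import argparse

def parse_bake_args(argv):
    """Parse baking options; the --uvmin/--uvmax/--worldSpaceNormals
    flags are hypothetical additions, not existing script options."""
    parser = argparse.ArgumentParser(
        description="Bake MaterialX materials to textures.")
    parser.add_argument("inputFilename")
    parser.add_argument("outputFilename")
    parser.add_argument("--uvmin", type=float, nargs=2, default=[0.0, 0.0],
                        help="Minimum corner of the texture-space range to bake.")
    parser.add_argument("--uvmax", type=float, nargs=2, default=[1.0, 1.0],
                        help="Maximum corner of the texture-space range to bake.")
    parser.add_argument("--worldSpaceNormals", action="store_true",
                        help="Bake normal maps in world space, not tangent space.")
    return parser.parse_args(argv)

# Forwarding the parsed options to the baker might then look like this
# (assuming the Python bindings mirror the C++ TextureBaker setters):
#
#   baker = mx_render_glsl.TextureBaker.create(width, height, baseType)
#   baker.setTextureSpaceMin(mx.Vector2(*opts.uvmin))
#   baker.setTextureSpaceMax(mx.Vector2(*opts.uvmax))
#   baker.setWorldSpaceNormals(opts.worldSpaceNormals)
#   baker.bakeAllMaterials(doc, searchPath, opts.outputFilename)
```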
I'll leave a ping to @ashwinbhat and @jstone-lucasfilm for additional insight as they have done more for this than I. (The only geometric dependence I see is for UDIM handling).
Hi! Just bumping here to check if @ashwinbhat or @jstone-lucasfilm could add any further insight or information about the texture baking process as a whole 🙂 Thanks in advance!
Hi @joaovbs96!
You're correct that the MaterialX texture baker doesn't yet support graphs containing geometric nodes such as `position` and `normal`, and here's our current note describing this limitation:
We've given some thought to how this limitation might be removed in the future, and one option would be to prebake geometric data for the requested source mesh into UV-space textures (e.g. `position` and `normal` textures), and then to reference these UV-space textures in place of the geometric nodes when baking out the final material.
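The prebaking step described above can be sketched independently of MaterialX: rasterize each mesh triangle in UV space and write its interpolated world-space positions (or normals) into the texels it covers. A minimal pure-Python version, assuming a hypothetical mesh given as per-triangle (uv, position) pairs:

```python
def bake_positions_to_uv_texture(triangles, size):
    """Rasterize per-vertex positions into a UV-space texture.

    triangles: list of triangles, each a sequence of three (uv, position)
    pairs, where uv is (u, v) in [0, 1] and position is (x, y, z).
    Returns a size x size grid of interpolated positions
    (None where no triangle covers the texel).
    """
    tex = [[None] * size for _ in range(size)]
    for tri in triangles:
        (uv0, p0), (uv1, p1), (uv2, p2) = tri
        # Signed doubled area of the UV triangle, for barycentric weights.
        denom = ((uv1[0] - uv0[0]) * (uv2[1] - uv0[1]) -
                 (uv2[0] - uv0[0]) * (uv1[1] - uv0[1]))
        if denom == 0:
            continue  # Degenerate in UV space; nothing to bake.
        for ty in range(size):
            for tx in range(size):
                # Sample at the texel center.
                u = (tx + 0.5) / size
                v = (ty + 0.5) / size
                w1 = ((u - uv0[0]) * (uv2[1] - uv0[1]) -
                      (uv2[0] - uv0[0]) * (v - uv0[1])) / denom
                w2 = ((uv1[0] - uv0[0]) * (v - uv0[1]) -
                      (u - uv0[0]) * (uv1[1] - uv0[1])) / denom
                w0 = 1.0 - w1 - w2
                if w0 >= 0 and w1 >= 0 and w2 >= 0:
                    # Interpolate the 3d position across the triangle.
                    tex[ty][tx] = tuple(
                        w0 * p0[i] + w1 * p1[i] + w2 * p2[i]
                        for i in range(3))
    return tex
```

A graph that reads `position` could then be rewired to sample this texture through a UV-driven image lookup instead of the geometric stream.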
I don't believe any team has yet started engineering work on this improvement to MaterialX, so let us know if you might be interested!
@joaovbs96
As a follow-up to Slack for dev-days, here are some possible sub-tasks. I think it's possible to work through these and see how much progress can be made.
Thanks @kwokcb ! I would love to work on this issue for the dev-days event 🙂
Just to clarify a few points:
Hi @joaovbs96
My feeling is that the utility functions can be called outside the baking process and use one or more "baking" steps to create intermediate images.
So I am assuming that node graphs for "baking" geometry would look something like this:
geometry_node -> unlit_shader node -> surfacematerial node
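As a sketch, such a graph in `.mtlx` form might route a `position` stream through an unlit shader. The node and graph names below are hypothetical; `surface_unlit` is the standard-library unlit shader:

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <nodegraph name="geom_bake_graph">
    <position name="pos" type="vector3" />
    <convert name="pos_color" type="color3" in="pos" />
    <output name="out" type="color3" nodename="pos_color" />
  </nodegraph>
  <surface_unlit name="unlit1" type="surfaceshader">
    <input name="emission_color" type="color3"
           nodegraph="geom_bake_graph" output="out" />
  </surface_unlit>
  <surfacematerial name="bake_mat" type="material">
    <input name="surfaceshader" type="surfaceshader" nodename="unlit1" />
  </surfacematerial>
</materialx>
```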
The "bake" itself would be a variant on the existing "bake quad", but instead of routing the fixed 4 corner points, it needs the texcoords routed as positions. I was thinking this could just be set up to use the regular 3d rendering calls with an ortho camera, but it could be that a specialized version of these is needed for position-stream routing.
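The position routing described here amounts to a vertex stage that outputs each vertex's texture coordinate, remapped from [0, 1] UV space into [-1, 1] clip space, in place of its projected 3d position. A minimal sketch of that remapping (in Python for illustration, though it would live in a vertex shader):

```python
def uv_to_clip_space(uv):
    """Remap a texture coordinate in [0, 1] to NDC/clip space in [-1, 1].

    Used in place of the model-view-projection transform when rendering
    a mesh "unwrapped" into its UV layout with an orthographic setup,
    so each triangle lands at its UV-space location in the bake target.
    """
    u, v = uv
    return (2.0 * u - 1.0, 2.0 * v - 1.0)
```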
Hi! I'm trying my hand at the baking capabilities of MaterialX, but a couple of things are not completely clear. How could I load, with Python, a specific model/mesh to bake the materials? Can that be done at all?
E.g. if I have a procedural material that somehow depends on the geometry to look correct and I want to bake the material down to a texture atlas.