That's interesting. Currently, the alpha attributes are used to set the "Blend Mode" in the material's "Viewport Display" settings. I hadn't thought about using shader nodes. Is this better than the current mechanism in some way?
What does the "Attribute" node in your examples do?
The Attribute node reads a vertex/edge/face attribute by name: color, custom normal, UV map, and so on. That way you can apply color, gloss, and roughness to a mesh without any image texture, and bake it all onto actual images afterwards. Or not... it depends on the task. I've used that technique in my engine with PBR shaders. It runs much faster and takes less space: all the data sits in the VBO, with no need to load and multisample images. 65k triangles ≈ 36k vertices × (3-byte RGB color + 3 bytes of color masks) ≈ 216 KB, while images are much heavier and take much longer to upload to the GPU. You can even "bake" palette/LUT and noise-type textures to an FBO right in GPU memory with a trivial number of triangles. With the help of geometry shaders, the possibilities are endless.
As podvornyakva comments above, the Attribute node is an input node that reads per-vertex data; in this case, the "Col" layer (case sensitive).
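For concreteness, here is a minimal sketch of that node in Blender's Python API, assuming a Blender 3.x material whose node tree already has a Principled BSDF; the material name is a placeholder:

```python
import bpy

mat = bpy.data.materials["MyNifMaterial"]   # placeholder material name
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Attribute node reading the "Col" vertex color layer by name (case sensitive).
attr = nodes.new("ShaderNodeAttribute")
attr.attribute_name = "Col"

# Its Color (and, in recent Blender versions, Alpha) outputs can then be wired
# into the shader, e.g. the Principled BSDF's Base Color.
bsdf = nodes["Principled BSDF"]
links.new(attr.outputs["Color"], bsdf.inputs["Base Color"])
```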
Bethesda uses vertex colors as well as vertex alphas in some Skyrim pieces, for example the Cave Green or Riften Ratway tilesets, or the daedric chest armor (the vapor effect on the neck). The combination of these two sources of information yields the final appearance in-game. Without the vertex information, the Blender material can only use the texture's alpha, and likewise for colors.
This technique allows the Creation engine to use two independent sources of information:
1) In the DaedricCuirass_x.nif files, the TorsoGlow objects use vertex alpha to make the vapors "fade out" as they approach the edges, regardless of the alpha channel on the texture itself (or any "palette to alpha" transform also anchored to the texture)
2) In the CaveGreen files, vertex colors are used to darken and green-cast the regions where different BSTriShapes meet, easing texture blending and mimicking a form of ambient occlusion
3) In my own work for the Hamster Way, I am using both texture alphas and vertex alphas for specific "fade to black" transition pieces

Without the vertex color or alpha information, none of these effects are visible.
N.B. the "greater than" node on the alpha testing diagram is not needed as you're already set the blend mode to clipping, so the diagram is the same as for alpha blending or testing; I'll edit the top comment to reflect this.
Mind you, I'm not really confident I understand Blender's handling of alpha: clipping vs. blending in the Viewport Display settings doesn't behave the way I'd expect.
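For reference, this is roughly how those Viewport Display settings map to the Python API (a sketch assuming Blender 3.x; the material name is a placeholder):

```python
import bpy

mat = bpy.data.materials["MyNifMaterial"]   # placeholder material name

# Alpha testing: fragments with alpha below the threshold are discarded.
mat.blend_method = 'CLIP'
mat.alpha_threshold = 0.5   # e.g. a NIF alpha-test threshold scaled from 0-255 to 0-1

# Alpha blending: the alpha value blends the surface with what is behind it.
# mat.blend_method = 'BLEND'
```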
Currently I'm all snarled up trying to get pynifly to export bone pose locations but I'll come back to this when that's done.
I have a NIF file that uses alpha blending & clipping on different pieces, and behaves as expected in game (and NifSkope). I can send that to you if it's useful as a test asset.
I'll see if I can tweak the plugin's code, and if I get something working I'll open a PR.
One consideration here is that the vertex color/alpha info seems to be used differently by different shaders. The Skyrim hair shader, for example, uses vertex color (not alpha) to decide how much of the hair color should get applied to different parts of the mesh. The head shaders use vertex alpha combined with the texture's alpha so the same bit of texture can be transparent or not depending on which part of the mesh is using it. I don't know if vert color is used at all there.
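Since different shaders consume that data differently, one option is for the importer to look at the imported layer itself before deciding what to wire up. A rough sketch, assuming the Blender 3.2+ color_attributes API and a placeholder object name:

```python
import bpy

obj = bpy.data.objects["TorsoGlow"]              # placeholder object name
layer = obj.data.color_attributes.get("Col")     # imported vertex color layer, if any

# Only wire the alpha socket when the layer actually carries non-opaque alpha.
has_vertex_alpha = layer is not None and any(
    value.color[3] < 1.0 for value in layer.data
)
print("vertex colors present:", layer is not None)
print("vertex alpha in use:  ", has_vertex_alpha)
```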
That daedric armor points out another problem: I don't deal with BSEffectShaderProperty at all.
If I'm understanding correctly:
BSEffectShaderProperty is used extensively for animations; all the hand art for spells, their inventory art, and the enchant art use it for U & V offsets, alpha changes, and specularity changes... I went deep into several of these properties for Local Resurrect and have some documentation on its Description page, but getting any of that to work will also require dealing properly with NiControllerManager and NiMultiTargetController
I looked at the Controller nodes and I don't know if I'm going there. I figure if I can export something you can copy and paste with NifSkope, that might be good enough. If manipulating the nodes is just as complex in Blender as it is in NifSkope, I'm not sure there's an advantage.
But EffectShaderProperty should be supported.
As for hair color, I'm not too worried about that; it's provided by the game engine at run time. It uses the Overlay method of combining, I think. I could put a color node in for it, but that's not a big win.
I lied, BSEffectShaderProperty is supported. Has been for a while.
Implemented in latest.
When importing objects that contain vertex colors & alphas, and these are enabled in their BSLightingShaderProperty (in shader flag groups 2 & 1, respectively), the initial shader created by the plugin does not account for that vertex data.
alpha testing
If the object has alpha-testing, this is generated:
I would suggest instead this be generated:

alpha blending
If the object has alpha-blending, this is generated:
suggestion
I would suggest instead this be generated:
The Attribute node is an input node that reads per-vertex data; in this case, "Col" names the RGBA vertex color layer. The color component can be fed to a MixRGB node that multiplies it with the texture's color. For alpha, a Math node that multiplies the two alpha values can be used.
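As a sketch of what that suggested node graph could look like when built from the plugin's Python side (Blender 3.x API; the material and image names are placeholders):

```python
import bpy

mat = bpy.data.materials["MyNifMaterial"]        # placeholder material name
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

# Diffuse texture (placeholder image) and the "Col" vertex color layer.
tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images["diffuse.dds"]       # placeholder image name
attr = nodes.new("ShaderNodeAttribute")
attr.attribute_name = "Col"

# Color: multiply the texture color by the vertex color.
mix = nodes.new("ShaderNodeMixRGB")
mix.blend_type = 'MULTIPLY'
mix.inputs["Fac"].default_value = 1.0
links.new(tex.outputs["Color"], mix.inputs["Color1"])
links.new(attr.outputs["Color"], mix.inputs["Color2"])

# Alpha: multiply the texture alpha by the vertex alpha.
mul = nodes.new("ShaderNodeMath")
mul.operation = 'MULTIPLY'
links.new(tex.outputs["Alpha"], mul.inputs[0])
links.new(attr.outputs["Alpha"], mul.inputs[1])

# Feed both into the Principled BSDF.
bsdf = nodes["Principled BSDF"]
links.new(mix.outputs["Color"], bsdf.inputs["Base Color"])
links.new(mul.outputs["Value"], bsdf.inputs["Alpha"])
```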