Open galvedro opened 1 year ago
I believe the vertex_normals collection is now the correct (non-readonly) place to modify the normals. Try the following replacement:
for i, vertex_normal in enumerate(me.vertex_normals):
    vertex_normal = normals[i]
To remain backward compatible you need to do a version check and call either the new or the old code depending on the Blender version!
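A minimal sketch of such a version gate. The exact release in which `MeshVertex.normal` became readonly is an assumption here (somewhere in the 3.x series, per the PR linked later in this thread); the cutoff constant should be verified before relying on it. `bpy.app.version` is the real Blender API for this check, and the import is guarded so the helper can be exercised outside Blender:

```python
# Hedged sketch: gate the old vs. new normal-setting code on the Blender version.
try:
    import bpy  # only available when running inside Blender
    BLENDER_VERSION = bpy.app.version
except ImportError:
    BLENDER_VERSION = None

# Assumed cutoff: verify against the "Make vertex normal property readonly" PR.
NORMALS_READONLY_SINCE = (3, 0, 0)

def can_write_vertex_normals(version=None):
    """True if MeshVertex.normal is still writable in this Blender version."""
    if version is None:
        version = BLENDER_VERSION
    return tuple(version) < NORMALS_READONLY_SINCE
```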
Yes, that snippet works and, as you say, it breaks backwards compatibility with 2.80, so it needs to be handled.
But I am confused about whether or not this needs to be written by hand (I just don't know enough about 3D). Blender's documentation says:
vertex_normals - The normal direction of each vertex, defined as the average of the surrounding face normals. Type bpy_prop_collection of MeshNormalValue, (readonly)
If each vertex's normal is a function of the normals of the surrounding faces, doesn't Blender calculate them by itself, as described in the pull request I linked above, the one that made normals readonly in the first place?
EDIT: Also, just noticed that the documented type says it's readonly, even though your snippet does not throw any exception...
You are correct about the documentation! I hadn't spotted that one. Odd that it does not raise an exception.
Interested to know whether your removal of the lines results in the normals being imported from the OBJ, or whether they are just recalculated later?
EDIT: As the code is immediately followed by a call to me.calc_normals(), I suspect that setting the normals from the file is redundant anyway?
How does the Wavefront OBJ importer import normals? Maybe it holds a clue on how this should be done in xp2b, since it produces better results than xp2b does.
Wavefront and FBX appear to set the normals via the mesh loops (split normals): the Wavefront importer goes on to use me.normals_split_custom_set(), and the FBX importer sets them directly onto me.loops.
There also seems to be a me.normals_split_custom_set_from_vertices(), which may do what the X-Plane code is trying to achieve: taking a per-vertex normal set and applying it as 'split normals'. See this discussion and the documentation: normals_split_custom_set_from_vertices
But calling me.calc_normals(), and possibly me.update(calc_edges=True), immediately afterward would presumably undo this work?
The Wavefront code specifically stores the 'temp' normals before calling validate() and me.update(), then uses the temp values to call me.normals_split_custom_set() afterward.
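The ordering described above can be sketched as a small helper. The method names are real `bpy.types.Mesh` API; `me` is duck-typed here only so the call order can be tested outside Blender, and this is a sketch of the Wavefront importer's pattern, not its actual code:

```python
def apply_custom_vertex_normals(me, normals):
    """Mirror the Wavefront importer's order: save the per-vertex normals
    first, let validate()/update() recalculate whatever they like, then
    apply the saved normals as split normals so nothing clobbers them."""
    temp = [tuple(n) for n in normals]   # copy before validate() can touch them
    me.validate(clean_customdata=False)  # repair invalid geometry first
    me.update(calc_edges=True)           # rebuild derived data (edges, etc.)
    me.use_auto_smooth = True            # custom normals are ignored without this
    me.normals_split_custom_set_from_vertices(temp)
```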
I just pushed a change. I added auto_smooth, set the normals with normals_split_custom_set_from_vertices(), and removed calc_normals(), as described in the post you linked. It does the trick, and it works in both 2.80 and 3.3.1.
I can see that the shading with which an OBJ is exported is respected when importing it again.
Certainly an improvement! I made the smoothing an importer option in my implementation. There are some areas where the imported mesh still needs a little cleaning up. This panel from the C90B ends up with multiple normals on a number of the vertices and loses the sharp edges, but a manual tidy-up is possible! The smoothing works well on the switches and other naturally smooth items, though. Detail of the corner:
My feeling is that the order should be:
me.update(calc_edges=True)
me.normals_split_custom_set_from_vertices(normals)
(As per the wavefront importer)
rather than
me.normals_split_custom_set_from_vertices(normals)
me.update(calc_edges=True)
Not entirely sure that the update with calc_edges doesn't modify the normals again?
I do not see different results by changing the order. I have flipped them though, to be on the safe side.
One thing about smoothing. If I am understanding this right, auto_smooth has to be enabled on the object for Blender to do something with the custom normals; otherwise it won't use them and will do flat shading instead.
Agree. I made it an option, as some objects like the one above import better if flat shaded; that panel, for instance, shows rounded edges and a wrinkled surface if imported smooth, but looks correct if imported flat. On the whole, though, importing everything smooth and hand-fixing afterwards looks to be the better route. The fact that we are making a choice for all objects in a file tends to make smooth the better bet.
What we could do with is a simple automated way to recognise and mark the sharp edges from the imported normals.
Interestingly the export code in xplane_mesh.py looks at the smooth flag (on the face) and either exports using the split normals or the face normals.
for tmp_face in tmp_faces:
    # To reverse the winding order for X-Plane from CCW to CW,
    # we iterate backwards through the mesh data structures
    for i in reversed(range(0, 3)):
        index = tmp_face.indices[i]
        vertex = xplane_helpers.vec_b_to_x(mesh.vertices[index].co)
        normal = xplane_helpers.vec_b_to_x(
            tmp_face.split_normals[i]
            if tmp_face.original_face.use_smooth
            else tmp_face.normal
        )
I did some tests yesterday with the exporter, and something is fishy. Exporting a beveled cube to which I applied a Weighted Normal modifier results in an OBJ that is identical to the one I get with no modifier. That suggests the exporter is not picking the correct normals to export in certain cases. It does pick up the difference between an object with smooth vs. flat shading, but not the more detailed normal edits.
EDIT: The Weighted Normal case, however, exports correctly if I force the code in xplane_mesh.py to write the split normals. So I think the use_smooth flag only catches the case where the entire object is marked as smooth shaded, and it misses the detailed normal edits that can be done with auto_smooth.
Which normals are you changing, vertex or face? If you export non-smooth, it will ignore changes to the vertex normals and use the face ones.
I think you are correct. As far as I can see, the split normals are correct in all cases; I can't quite see why it chooses the alternatives.
With flat shading, the split normals follow the face normals. With smooth shading (object), the split normals average the two faces. Auto-smooth is the same as flat in the default case.
I agree. Changing the exporter to always use split normals gets this test scene exported correctly, and the importer also gets it right back into Blender. The scene has three objects, one with smooth shading, one with flat shading, and one with auto_shading and a Weighted Normal modifier applied.
This is the imported result:
It does look like the exporter could just use the split normals. Could this have been an issue in an older version of Blender? I am testing with 3.3.1 at the moment...
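A hedged sketch of the "always use split normals" change being discussed. `tmp_face` is duck-typed after the names that appear in the xplane_mesh.py excerpt above (`.split_normals`, `.normal`, `.original_face.use_smooth`); the `use_split_always` parameter is an illustration device, not an existing option in the add-on:

```python
def export_face_normals(tmp_face, use_split_always=True):
    """Return the three per-corner normals to write for a triangle.

    With use_split_always=False this mirrors the current use_smooth branch
    in xplane_mesh.py; with True, split normals are written for every face."""
    normals = []
    for i in reversed(range(0, 3)):  # CW winding order for X-Plane
        if use_split_always or tmp_face.original_face.use_smooth:
            normals.append(tmp_face.split_normals[i])
        else:
            normals.append(tmp_face.normal)
    return normals
```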
I spotted the behaviour in the exporter earlier and thought it was odd to choose not to always use the split normals. I have been using 2.8.3 for my testing, as I had hit the animation keyframe issue you had reported (thanks for the fix, by the way!!!).
It would be worth running a Python test over the meshes you have there and looking at the face use_smooth flag and the object smooth flag, to see how they are set in the auto-smooth case.
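A small helper for that inspection. `use_auto_smooth` and `polygon.use_smooth` are real 2.8x/3.x mesh attributes; the mesh argument is duck-typed so the summary logic can be tested outside Blender:

```python
def report_smooth_flags(mesh):
    """Summarise the smoothing flags the exporter branches on."""
    smooth = sum(1 for p in mesh.polygons if p.use_smooth)
    return {
        "use_auto_smooth": bool(mesh.use_auto_smooth),
        "smooth_faces": smooth,
        "flat_faces": len(mesh.polygons) - smooth,
    }
```

Inside Blender this could be called as, e.g., `report_smooth_flags(bpy.data.objects["Cube"].data)`.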
The mesh on the left looks like this:
mesh.use_auto_smooth = True
mesh.polygons[i].use_smooth = False # for all faces
The mesh in the middle, with flat shading:
mesh.use_auto_smooth = False
mesh.polygons[i].use_smooth = False # for all faces
The mesh on the right, with smooth shading:
mesh.use_auto_smooth = False
mesh.polygons[i].use_smooth = True # for all faces
I don't see an object-wide property for smooth shading.
So the first two would use face normals, and only the third would use split normals in the original code.
I think "always use split normals" sounds like the right way to go here?
I can try it out in 2.8.3 and 2.9 tomorrow and see what the results are there. [Don't see any reason why it would be different, though!]
Exactly, that's why the more advanced vertex normal edits are not exporting correctly.
Not sure how you want to handle the changes. If you want I can push the change to the exporter in this branch and maybe rescope the PR to something like "Fix: vertex normal handling" or something like that.
That's up to the devs, I am just an interested party like yourself, trying to get a working toolset! But the rescope sounds like it would make sense to me!
I'll apply the changes in my fork and try it out.
😀 Now I understand why you're playing with your fork instead of merging my PRs! Lol!
Yeah, with the latest tweaks here it's looking pretty good. From the default import, normals are looking much cleaner!:
The fuel panel is coming in with sharp edges and without surface defects, without manual modification!
I think it's now correctly representing the normals from the imported file!
And that's in 3.3.1!
I just need to check the export isn't now bloating the files again. My fix to apply smoothing to all the faces reduced the large number of similar normals at the vertices.
OK, the export file size is better than the 4.2.0-alpha original code, but not as small as my 'all surfaces smooth':
| Size   | POINT_COUNTS                   | Source |
| ------ | ------------------------------ | ------ |
| 1.46MB | POINT_COUNTS 12936 0 0 111594  | original file as shipped in XP12 |
| 1.23MB | POINT_COUNTS 10618 0 0 111108  | my export (all faces marked smooth in test_creation_helpers create_object) |
| 3.13MB | POINT_COUNTS 42384 0 0 111108  | export from this discussion (mesh marked auto-smooth) |
| 9.97MB | POINT_COUNTS 111594 0 0 111594 | original importer/exporter from io_xplane2blender_4_2_0-alpha |
| 9.57MB | POINT_COUNTS 105606 0 0 111594 | original importer/exporter with Optimisation on |
So significantly better than the old code (about 3×), but it produces 42384 unique vertex/normal pairs rather than the 10618 that my smooth-all-surfaces version ends up with. (But my version requires manual intervention on certain sharp areas like that flat panel.)
NOTE: I don't fully understand why this produces 42384 rather than the 12936 of the original as it should now be using the same data! I may try and look into this.
Overall it's a trade-off on each file. I think for now I am going to implement both 'smoothing' options in my version, adding them as separate import options.
Note my version also has the global optimisation fix I proposed, so your figures may differ! I am also trimming the decimal digits on the export of vertex data, which also reduces the file size closer to the original.
From: test_creation_helpers.py (but it probably belongs at the end of the mesh build code in xplane_imp_cmd_builder.py)
def create_object(name, mesh):
    ob = bpy.data.objects.new(name, object_data=mesh)
    # check if the smoothing operation is requested in the scene properties
    if bpy.context.scene.xplane.smooth:
        # set the mesh faces to smooth by default
        for face in ob.data.polygons:
            face.use_smooth = True
    return ob
and from xplane_imp_cmd_builder.py
def build_mesh(self, vt_table: "VTTable", material_name: str) -> bpy.types.Mesh:
    ...
    me = bpy.data.meshes.new(self.name)
    # check if the smoothing operation is requested in the scene properties
    if bpy.context.scene.xplane.auto_smooth:
        me.use_auto_smooth = True
    me.from_pydata(py_vertices, [], py_faces)
    if me.validate(verbose=True):
        logger.error(f"Invalid Mesh for object '{self.name}' was corrected, check console for more")
I don't fully understand why this produces 42384 rather than the 12936 of the original as it should now be using the same data! I may try and look into this.
Maybe not. I suspect something like this might be happening: by setting use_smooth, you are telling Blender to discard split normals and keep only one normal per vertex.
I understand why mine is reducing the count; yes, the smoothing reduces the number of split vertices. (My gut feel is that the original file was mostly marked smooth, being largely comprised of toggle switches, which look better smooth. This is probably the reason my round trip ends up with numbers similar to the original.)
What I am not quite sure of is why the solution here increases the number of normals by a factor of 4 over the original file, when it should now be applying the set of split normals we just imported! My only thought is that the validate() called in the importer is producing the additional normals, which may be down to rounding errors introduced during the validation/fixing process.
I am still seeing many discrete normals at the coincidence points of supposedly flat faces, for instance this one:
What I want to try and do, is isolate these in the original file, and see how many verts are there for that location.
Still the result here is better than the original code which was increasing the number of normals by a factor of nearly 10!
when it should now be applying the set of split-normals we just imported
The call to me.normals_split_custom_set_from_vertices(normals) is splitting the single normal per vertex coming from the OBJ into multiple normals per vertex. It could be that if the faces sharing that vertex are not dead-on coplanar, the split normals may deviate from the original normal in the file, giving us a bouquet of normals like the one in your screenshot.
It's something like that, I think the only way to tell is to trawl the input file, but with 111000 points it's going to take me a while to isolate the right ones :-)
I get the feeling that a smarter export optimisation that considers similarity (close angular match of the split normal vectors) rather than the current "exact match" of the normals would resolve the issue at the expense of a little speed on the export.
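The similarity-based merge could look something like the sketch below: a greedy pass that reuses an already-emitted normal whenever a new one lies within a small angular tolerance, instead of requiring an exact match. The function name and tolerance are illustrative, not part of the add-on:

```python
import math

def dedup_normals(normals, max_angle_deg=1.0):
    """Greedily merge normals that are within max_angle_deg of an
    already-kept normal. Returns (kept_normals, index_per_input)."""
    cos_limit = math.cos(math.radians(max_angle_deg))
    kept = []
    indices = []
    for n in normals:
        length = math.sqrt(sum(c * c for c in n)) or 1.0
        unit = tuple(c / length for c in n)
        for j, k in enumerate(kept):
            # dot product of unit vectors = cosine of the angle between them
            if sum(a * b for a, b in zip(unit, k)) >= cos_limit:
                indices.append(j)
                break
        else:
            indices.append(len(kept))
            kept.append(unit)
    return kept, indices
```

A greedy pass is O(n·k) in the number of kept normals, so it would indeed cost a little export speed compared to exact-match hashing; a spatial hash on quantized normals could recover most of that if it matters.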
Great discussion here! I don't know a thing about Python and Blender plugins, and I am not sure I understand everything you are talking about, but a while ago I made an obj8 importer for ArmorPaint in Haxe, and I got some understanding of the obj8 format itself. The first thing my code does while importing is de-optimize the obj8, if it was ever optimized! Any optimization on the mesh goes away, and then the import can proceed. I import the normals as they are, just scaling and transforming them the way ArmorPaint mesh data expects. The mesh imports perfectly! ArmorPaint can then export to Wavefront OBJ, and if I import that Wavefront file in Blender it is almost perfect. The Wavefront importer creates custom normals data with sharp edges and smooth shading, and it looks almost perfect. So I wonder: why not just parse the obj8 into vertex data and then use the Wavefront importer functions to make a Blender mesh out of it? Here is the base of my importer if it can help you in any way: https://github.com/todirbg/armorbase/blob/a44bed8813c90114b56a55612c5982d88768e5f4/Sources/arm/format/Obj8Parser.hx It splits the mesh into objects, including some of the animation data, and then does some transformations to make each object appear in its "rest pose".
Cheers, I'll take a look see what we can learn!
I get the feeling that a smarter export optimisation that considers similarity (close angular match of the split normal vectors) rather than the current "exact match" of the normals would resolve the issue at the expense of a little speed on the export.
It smells like that, yep.
Hey Gang,
First, thank you for the PR and for taking a look at this.
I think the main thing that I need to talk with the internal art team about is whether truly ad-hoc normals are something we should support.
The case in favor of them would be perfect round trip IO with the importer/exporter, but that's kind of a weird goal - we view the importer as a lifeboat to get old content authored in other 3-d tools that the community has heavily relied upon (Blender 2.49, AC3D) into modern Blender with animations intact to protect investment in modeling.
But in this case, the intent of the original normals might be lost, and a "split all the time so we can be perfectly like the old file" approach eats a pretty big cost (a multiplier on total vertex count and a loss of reuse, as well as smooth-shading artifacts).
It looked to me like the intent in Blender 3.x is to have normals be a derived, calculated quantity that Blender computes and we consume, hence a MeshVertex's normal is no longer writable. If this is the case, are we petting the cat backward by managing our normals by hand?
cheers Ben
This PR has been rescoped, see below for the original description.
This PR addresses three issues:
I attach a test .blend file (Blender 2.80) with which these issues can be reproduced: normals_test.blend.zip
With the changes in this PR, the test scene goes through the export/import process with no noticeable change in shading. The only visible change is the triangulation of quad faces.
All automated tests are green in master. feat_importer has one failure that is already in the branch (i.e. not introduced by this PR).
Original PR description
Trying to import basic geometry fails in Blender 3.3.1 LTS. The script raises an exception saying that MeshVertex's normal property is readonly.
Sure enough, the property has become readonly. Moreover, writing to it was apparently not doing much before the change either. See this discussion: Make vertex normal property readonly
This PR removes the offending lines. I have tested to import a number of objects of varying complexity in Blender 2.80 and Blender 3.3.1 LTS, including objects with normals pointing inwards. I have not noticed any problem with the change.
The discussion I linked above also talks about calc_normals(). I have not touched it, but you may want to take a look.