Moguri / panda3d-gltf

glTF utilities for Panda3D
BSD 3-Clause "New" or "Revised" License

Joints seem Y-up when exposed in Panda3D #106

Status: Open. Epihaius opened this issue 2 years ago

Epihaius commented 2 years ago

Hi,

When trying to attach a model to an exposed joint of an armature loaded from glTF (exported from Blender version 2.93), I found that the joint seemed to be Y-up rather than Z-up.

To explain, my way of determining the transform of an object in Blender relative to a certain bone (which will become an exposed joint in Panda) is to add an empty and align it with that bone.

Then I export just the object to be attached and the empty to glTF. When loading this glTF in Panda, I search for both objects and call node_to_attach.get_transform(empty) to obtain the joint-relative transform of the object I want to attach to this joint. As this didn't give the expected results when assigning this transform to the object to attach, I found after some experimentation that it does work when, in Blender, I rotate the empty -90 degrees about its local X-axis (so Y becomes Z and Z becomes -Y).
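
For illustration, both steps -- aligning the empty with the bone and applying the extra rotation -- could also be scripted from Blender's Python console along these lines (a rough sketch; the bone and empty names follow the example files attached later in this thread, and the armature object name is an assumption):

import bpy
import math
from mathutils import Matrix

arm = bpy.data.objects["Armature"]           # assumed name of the armature object
empty = bpy.data.objects["hand_bone_proxy"]  # the "bone proxy" empty
bone = arm.pose.bones["hand.R"]              # the bone to align with

# snap the empty onto the pose bone (the bone matrix is in armature space)
empty.matrix_world = arm.matrix_world @ bone.matrix

# the extra correction: -90 degrees about the empty's local X-axis
empty.matrix_world = empty.matrix_world @ Matrix.Rotation(math.radians(-90.0), 4, 'X')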

This seems fairly nonintuitive to me, so I was wondering if there is some glTF import option that allows changing the exported joint coordinate system or if such an option can be implemented?

(If there is something obvious that I'm missing, please let me know, as I'm still quite new to animating with Blender.)

rdb commented 2 years ago

If I recall correctly, this is because the Blender glTF exporter converts to Y-up by simply adding a 90° rotation on the root joint rather than converting all the matrices involved.

panda3d-gltf could theoretically detect this and apply this root joint transform via PartBundle::xform().
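
For reference, a minimal sketch of what that could look like from application code (this is not the loader-side fix itself, and whether this conversion direction composes correctly with the exporter's extra root rotation would need testing):

from panda3d.core import LMatrix4, CS_yup_right, CS_zup_right
from direct.actor.Actor import Actor

# hypothetical actor loaded from one of the affected glTF files
actor = Actor("character_test_bad.gltf")

# fetch the character's PartBundle and re-express it via xform()
bundle = actor.find("**/+Character").node().get_bundle(0)
bundle.xform(LMatrix4.convert_mat(CS_yup_right, CS_zup_right))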

rdb commented 2 years ago

It could be that I'm misunderstanding this issue. It would help to have an example model we could look at to better demonstrate the problem.

Do note that there is no concept of a "direction" for joints in Panda. There is no guarantee that the Y or Z axes of the joint's local coordinate system will be aligned with the bone direction in Blender, since it is not consequential for skinning.

Epihaius commented 2 years ago

The goal is to be able to parent an object to a joint, such that when this joint is animated, that object moves along with it, as showcased in Panda's "Looking and gripping" sample.

Let's say we're developing a first-person shooter game, where we want a gun to follow the movements of the right hand. Skinning the gun to the armature is not an option, since we want the ability to swap out the gun for other types of weapons in real time. So instead, we're going to figure out the best placement of the weapon in relation to a specific "hand bone" (such that this relative placement can be applied in Panda). Now where would we do that? In Panda, with lots of trial and error in code? That doesn't seem very practical. Rather, we would do that in a modelling application like Blender.

Now there are actually two problems:

  1. In Blender, there doesn't seem to be a way to get transformation values of an object relative to another one (at least none that I could find -- but see the console sketch right after this list).

  2. In Panda, we can expose a "hand joint", but if we don't know the coordinate system in which to express position and angle offsets from the weapon model to the joint, we're still getting nowhere.
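
For what it's worth, such a relative transform can be read off from Blender's Python console along these lines (a rough sketch that only addresses issue 1.; the armature object name is an assumption, the other names follow the example files attached below):

import bpy

arm = bpy.data.objects["Armature"]   # assumed armature object name
gun = bpy.data.objects["gun"]        # the weapon object
bone = arm.pose.bones["hand.R"]      # the "hand bone"

# world-space matrix of the pose bone, then the gun expressed in bone space
bone_world = arm.matrix_world @ bone.matrix
gun_in_bone_space = bone_world.inverted() @ gun.matrix_world
print(gun_in_bone_space.to_translation(), gun_in_bone_space.to_euler())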

As a workaround for issue 1., we can align an object (e.g. an empty) to the bone -- as described in the original post -- and export it to glTF together with another object that represents the weapon (making sure it has the desired placement relative to the bone). In Panda, we can then get the transform of the weapon relative to the "bone proxy" (the empty), both loaded from the glTF. But no matter how we try to determine that offset transform, if we don't know what coordinate space the exposed joint in Panda will have, it's going to remain guesswork.

So what I'd like to know for now is, can we at least reliably assume that an exposed joint will be Y-up?

To be clear, the option I asked for to configure the joint's coordinate system is simply meant to obviate the need for rotating the "bone proxy" -90 degrees about its local X-axis in Blender. But that's really not such a big deal, as long as I know that I can do so in a consistent way.

It would help to have an example model we could look at to better demonstrate the problem.

The .zip file attached below contains two .blend files -- one leading to good results (character_test_good.blend) and one leading to bad results (character_test_bad.blend) -- together with the corresponding exported .gltf files:

exposed_joint_coord_sys.zip

The .zip also contains this code sample:

from panda3d.core import *
from direct.showbase.ShowBase import ShowBase
from direct.actor.Actor import Actor

class Game(ShowBase):

    def __init__(self):
        ShowBase.__init__(self)

        self.disable_mouse()

        # set up a light source
        p_light = PointLight("point_light")
        p_light.set_color((1., 1., 1., 1.))
        self.light = self.camera.attach_new_node(p_light)
        self.light.set_pos(5., -100., 7.)
        self.render.set_light(self.light)

        self.camera.set_pos(-20., 20., 10.)
        self.camera.look_at(0., 5., -5.)

        # load the glTF scene and find the two objects needed from it
        model = self.loader.load_model("character_test_good.gltf", noCache=True)
        gun = model.find("**/gun")
        hand_bone_proxy = model.find("**/hand_bone_proxy")

        # wrap the model in an Actor so its joints can be animated and exposed
        self.actor = Actor(model, copy=False)
        self.actor.reparent_to(self.render)
        self.actor.node().set_bounds(OmniBoundingVolume())
        self.actor.node().set_final(True)

        # expose the hand joint and attach the gun to it, using the gun's
        # transform relative to the "bone proxy" empty as its local transform
        hand_joint = self.actor.expose_joint(None, "modelRoot", "hand.R")
        gun_xform = gun.get_transform(hand_bone_proxy)
        gun.set_transform(gun_xform)
        gun.reparent_to(hand_joint)
        self.actor.loop("aim")

game = Game()
game.run()

If you run that sample as-is, you should see the weapon model follow the hand as the "aim" animation loops:

[screenshot: character_test_good_panda]

But if you replace character_test_good.gltf with character_test_bad.gltf in the code, you should see this:

[screenshot: character_test_bad_panda]

That's the result of simply aligning the empty with the hand bone in Blender:

[screenshot: character_test_bad]

So I need that additional rotation to adjust the orientation of the empty to this:

[screenshot: character_test_good]

If this last step remains necessary, it should definitely be mentioned somewhere in a forum post/tutorial or in the manual, since this is likely to trip up Panda game developers.

rdb commented 2 years ago

If I use blend2bam to load character_test_bad.blend, it does work correctly. So it is exclusively a problem with the conversion to Y-up and back (which is suppressed by blend2bam).

The Blender glTF exporter doesn't really do a coordinate space conversion for the joint hierarchy; it's lazy and just adds a 90° rotation on the root joint (torso):

        # print the local transform of the root joint ("torso") of the bundle
        bundle = self.actor.find("**/+Character").node().get_bundle(0)
        root_joint = bundle.children[0].children[0]
        print(root_joint.get_transform())

This is applied here: https://github.com/KhronosGroup/glTF-Blender-IO/blob/3e617f0e7c4bc1fb176c4e70155f99eced27d2cc/addons/io_scene_gltf2/blender/exp/gltf2_blender_gather_joints.py#L33-L45

Certainly, panda3d-gltf could do the conversion back to Z-up the same way that Blender has done the conversion to Y-up; rather than retransforming all the joints, it could instead just add a 90° rotation to the root joint. This would fix this issue reliably, but it might no longer produce the right results for assets authored in Y-up. I am not sure we can reliably detect this automatically either, since it might be impossible to distinguish a genuine transformation that was applied to the torso joint in Blender from the 90° conversion rotation applied by the glTF exporter.

A better solution might be to submit a PR to the Blender glTF exporter to do a coordinate space conversion for all the joint transforms and animations rather than just plopping a 90° one onto the root.
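
For the record, such a conversion amounts to a change of basis on every joint's local matrix (and every animation channel) rather than a single extra rotation on the root; in Panda's row-vector convention that would look roughly like this (an illustrative sketch, not actual panda3d-gltf or exporter code):

from panda3d.core import LMatrix4, CS_yup_right, CS_zup_right

# axis-conversion matrices between the two coordinate systems
yup_to_zup = LMatrix4.convert_mat(CS_yup_right, CS_zup_right)
zup_to_yup = LMatrix4.convert_mat(CS_zup_right, CS_yup_right)

def joint_matrix_to_zup(local_mat_yup):
    # change of basis: with row vectors (v' = v * M), a matrix expressed in
    # Y-up space becomes zup_to_yup * M * yup_to_zup when expressed in Z-up
    return zup_to_yup * local_mat_yup * yup_to_zup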

Epihaius commented 2 years ago

If I use blend2bam to load character_test_bad.blend, it does work correctly. So it is exclusively a problem with the conversion to Y-up and back (which is suppressed by blend2bam).

Ah, that is good news! Thanks!

One small problem is that for asset creation for the Panda3D Tech Demo, we kinda settled on using .gltf, so I'm not sure if the other team members would appreciate switching to .bam files now. But if .bam files have no disadvantages compared to glTF (e.g. if there is no more risk that they become incompatible with future Panda versions), we could still consider the switch to .bam I guess.

A better solution might be to submit a PR to the Blender glTF exporter to do a coordinate space conversion for all the joint transforms and animations rather than just plopping a 90° one onto the root.

Fair enough. But then I hope someone else will take this on, as I myself don't know enough about glTF -- or Blender, for that matter -- to create such a PR or give valid argumentation when asked to justify that request. :)

rdb commented 2 years ago

There are valid reasons to prefer gltf over bam, and I personally still think we ought to get this bug fixed for the gltf conversion path.

Moguri commented 1 year ago

I wonder if we should just narrowly apply this fix based on the asset property. In other words, detect the Blender glTF exporter and only do the root node transform in that case?
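
A rough sketch of what that detection might look like on the already-parsed glTF data (the exact generator string is an assumption about what the Blender exporter writes into the asset block):

def is_blender_gltf_export(gltf_data):
    # glTF files carry an "asset" object whose "generator" field names the
    # exporting tool; the Blender add-on identifies itself there
    generator = gltf_data.get('asset', {}).get('generator', '')
    return generator.startswith('Khronos glTF Blender I/O')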