Hi, thanks a lot for reporting this. I've found the issue and will commit the fix soon.
For now, if you replace the axis_angle_to_matrix implementation in umetrack_skinning.py with the following, the problem should be fixed:
```python
def axis_angle_to_matrix(axis_angle: torch.Tensor) -> torch.Tensor:
    # Convert axis-angle vectors (..., 3) into rotation matrices (..., 3, 3)
    # using Rodrigues' formula.
    leading_dims = axis_angle.shape[:-1]
    axis_angle = axis_angle.reshape(-1, 3)
    theta = torch.norm(axis_angle, p=2, dim=-1)

    # Initialize every output matrix to the identity.
    out = torch.zeros((theta.shape[0], 3, 3), dtype=theta.dtype, device=theta.device)
    for i in range(3):
        out[:, i, i] = 1

    # Near-zero rotations keep the identity; this avoids dividing by theta ~ 0.
    small_angle = theta < 1e-6
    theta = theta[~small_angle]
    axis_angle = axis_angle[~small_angle]

    axis = axis_angle / theta[..., None]
    c = torch.cos(theta)
    s = torch.sin(theta)
    kx = axis[..., 0]
    ky = axis[..., 1]
    kz = axis[..., 2]
    kxky = kx * ky
    kxkz = kx * kz
    kykz = ky * kz
    kx2 = kx * kx
    ky2 = ky * ky
    kz2 = kz * kz

    # Rodrigues' formula, written out entry by entry (row-major).
    o = torch.stack(
        (
            c + kx2 * (1 - c),
            kxky * (1 - c) - kz * s,
            kxkz * (1 - c) + ky * s,
            kxky * (1 - c) + kz * s,
            c + ky2 * (1 - c),
            kykz * (1 - c) - kx * s,
            kxkz * (1 - c) - ky * s,
            kykz * (1 - c) + kx * s,
            c + kz2 * (1 - c),
        ),
        -1,
    )
    out[~small_angle] = o.reshape(-1, 3, 3)
    return out.reshape(*leading_dims, 3, 3)
```
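As a quick sanity check of the patched function, something like the following should hold (a minimal sketch, not part of the repo; it only assumes PyTorch and the function above, and the expected values follow directly from Rodrigues' formula):

```python
import math
import torch

# Zero rotation: the small_angle branch should return the identity
# without dividing by a zero-length axis.
eye = axis_angle_to_matrix(torch.zeros(1, 3))
assert torch.allclose(eye, torch.eye(3).expand(1, 3, 3))

# A rotation of pi/2 about z should map the x-axis onto the y-axis.
R = axis_angle_to_matrix(torch.tensor([[0.0, 0.0, math.pi / 2]]))
x_axis = torch.tensor([1.0, 0.0, 0.0])
assert torch.allclose(R[0] @ x_axis, torch.tensor([0.0, 1.0, 0.0]), atol=1e-6)
```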
Thank you for organizing the Hands competition. To verify the model's performance, we split off a portion of the training data to use as a validation set. The data processing for the validation set followed the steps outlined in the Local validation section, which produced the file gt_landmarks_umetrack.json. In this file, some frames have all of their "landmarks" set to NaN. For example:

```json
{
  "sequence_name": "subject_030_synthetic_separate_hand_002136",
  "frame_id": 184,
  "landmarks": [
    [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ],
    [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ],
    [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ],
    [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ],
    [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ], [ NaN, NaN, NaN ]
  ]
},
```

Here are the corresponding UmeTrack model parameters:

```json
{
  "sequence_name": "subject_030_synthetic_separate_hand_002136",
  "frame_id": 185,
  "mano_theta": [
    -1.2225896120071411, 0.36279019713401794, 0.14448513090610504,
    0.4162368178367615, 1.6039888858795166, -1.1659281253814697,
    0.22714950144290924, 2.463958740234375, 0.3210441768169403,
    0.02948339655995369, -0.21838001906871796, -0.37585991621017456,
    0.2640269100666046, 0.15194766223430634, -0.08119188994169235
  ],
  "wrist_xform": [
    -0.12045539170503616, -0.26378971338272095, 0.2230101078748703,
    -0.16797541081905365, 0.08163973689079285, 0.052317336201667786
  ],
  "hand_side": 0
},
```
When performing the evaluation, the presence of NaN values causes the evaluation metrics to also become NaN. Could you please clarify what might be causing this issue? Thanks for your response.
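In the meantime, a minimal sketch of how the affected frames can be located (assuming gt_landmarks_umetrack.json is a list of records shaped like the excerpt above; the loading code itself is an assumption, not part of the competition tooling):

```python
import json
import math

# Minimal sketch: list frames whose ground-truth landmarks contain NaN,
# assuming the file is a list of records with "sequence_name", "frame_id"
# and "landmarks" keys as in the excerpt above. Python's json module parses
# the bare NaN literals into float("nan").
with open("gt_landmarks_umetrack.json") as f:
    records = json.load(f)

bad_frames = [
    (r["sequence_name"], r["frame_id"])
    for r in records
    if any(math.isnan(v) for point in r["landmarks"] for v in point)
]
print(f"{len(bad_frames)} frames with NaN landmarks, e.g. {bad_frames[:3]}")
```

Skipping these frames (or using a NaN-aware mean) keeps the local metrics finite, but it would still be good to know whether the NaNs themselves are expected.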