facebookresearch / habitat-lab

A modular high-level library to train embodied AI agents across a variety of tasks and environments.
https://aihabitat.org/

Question about importing captured motion data in `.bvh` #1987

Open zoeyliu1999 opened 1 week ago

zoeyliu1999 commented 1 week ago

Habitat-Lab and Habitat-Sim versions

Habitat-Lab: master

Habitat-Sim: master

Habitat is under active development, and we advise users to restrict themselves to stable releases. Are you using the latest release versions of Habitat-Lab and Habitat-Sim? Your question may already be addressed in the latest versions. We may also not be able to help with problems in earlier versions because they sometimes lack the more verbose logging needed for debugging.

Master branch contains 'bleeding edge' code and should be used at your own risk.

Docs and Tutorials

Did you read the docs? https://aihabitat.org/docs/habitat-lab/

Did you check out the tutorials? https://aihabitat.org/tutorial/2020/

Perhaps your question is answered there. If not, carry on!

❓ Questions and Help

Hey there,

Thank you for your excellent work!

I am trying to import captured motion data into Habitat 3.0. Last time, I successfully converted .npz to .pkl using the script, and it worked great! However, I am now working with motion data in the .bvh format, and I'm struggling to convert it to .npz or .pkl.

Could you please provide some advice or guidance on how to do this conversion? I am relatively new to this, and any help would be greatly appreciated. Below is the approach I used to try to convert .bvh to .npz:

import numpy as np
from bvh import Bvh
from scipy.spatial.transform import Rotation as R
import math
from tqdm import tqdm

JOINT_MAP = {
    # 'BVH joint name': 'SMPLX joint index'
    "Hips": 0,
    "LeftUpLeg": 1,
    "RightUpLeg": 2,
    "Spine": 3,
    # and so on ......
}

# generate root_orient, trans, poses
def bvh_to_smplx(bvh_file, n_frames=None, downsample_factor=6):
    """
    params bvh_file: path to the BVH file
    params n_frames: optional, number of frames to process
    params downsample_factor: optional, every n-th frame to process
    """
    with open(bvh_file, "r") as f:
        mocap = Bvh(f.read())

    if n_frames is None:
        num_frames = len(mocap.frames)
    else:
        num_frames = min(n_frames, len(mocap.frames))

    num_frames_downsampled = math.ceil(num_frames / downsample_factor)

    smplx_poses = np.zeros((num_frames_downsampled, 165))
    smplx_trans = np.zeros((num_frames_downsampled, 3))
    smplx_root_orient = np.zeros((num_frames_downsampled, 3))

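    # Rotate the root by +90 degrees about X, presumably to convert the
    # Y-up convention common in BVH files to the Z-up convention expected downstream.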
    rotation_correction = R.from_euler("XYZ", [90, 0, 0], degrees=True)
    for i in tqdm(range(0, num_frames, downsample_factor)):
        for joint_name, joint_index in JOINT_MAP.items():

            rotation = R.from_euler(
                "XYZ",
                mocap.frame_joint_channels(
                    i, joint_name, ["Xrotation", "Yrotation", "Zrotation"]
                ),
                degrees=True,
            )

            if joint_name == "Hips":
                rotation = rotation_correction * rotation

            smplx_poses[
                i // downsample_factor, 3 * joint_index : 3 * (joint_index + 1)
            ] = rotation.as_rotvec()

            if joint_name == "Hips":
                x, y, z = mocap.frame_joint_channels(
                    i, joint_name, ["Xposition", "Yposition", "Zposition"]
                )
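                # Remap the BVH translation (x, y, z) to (x, -z, y),
                # which looks like the same Y-up to Z-up axis swap as above.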
                smplx_trans[i // downsample_factor] = np.array([x, -z, y])
                smplx_root_orient[i // downsample_factor] = (
                    rotation.as_rotvec()
                )

    return smplx_trans, smplx_poses, smplx_root_orient

# save as .npz
def save_npz(
    output_file,
    smplx_trans,
    smplx_poses,
    smplx_root_orient,
    gender="neutral",
    model_type="smplx",
    frame_rate=30,
):
    np.savez(
        output_file,
        root_orient=smplx_root_orient,
        trans=smplx_trans,
        poses=smplx_poses,
        gender=gender,
        surface_model_type=model_type,
        mocap_frame_rate=frame_rate,
        betas=np.zeros(16),
    )
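A minimal driver sketch for the two helpers above; the file paths, downsample factor, and frame rate below are placeholder values rather than anything from the original setup:

# Placeholder driver: convert a BVH file and save it as .npz using the
# helpers defined above.
if __name__ == "__main__":
    bvh_path = "capture.bvh"   # placeholder input path
    npz_path = "capture.npz"   # placeholder output path

    trans, poses, root_orient = bvh_to_smplx(
        bvh_path, n_frames=None, downsample_factor=6
    )
    # If the BVH was recorded at e.g. 180 fps, downsampling by 6 leaves an
    # effective 30 fps, which is what gets stored as mocap_frame_rate.
    save_npz(
        npz_path,
        smplx_trans=trans,
        smplx_poses=poses,
        smplx_root_orient=root_orient,
        gender="neutral",
        model_type="smplx",
        frame_rate=30,
    )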
xavierpuigf commented 1 week ago

I am not very familiar with BVH, but this looks good. What is failing? Is the motion looking strange, or does the import crash? If something in the motion is off, first check whether the translation looks correct, then the root orientation, and finally the poses. An avatar in T-pose should have a rotation of 0 in all its joints, so you could try debugging with that pose.
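A minimal sketch of that T-pose debugging idea, assuming the save_npz helper from the question above; the frame count, standing height, and output path are placeholders:

import numpy as np

# All joint rotations are zero, so the avatar should stand in T-pose; any
# drift or flip seen after importing this file must come from trans or
# root_orient rather than the per-joint poses.
num_frames = 120                                           # placeholder clip length
t_pose = np.zeros((num_frames, 165))                       # zero axis-angle for every joint
zero_root = np.zeros((num_frames, 3))                      # identity root orientation
static_trans = np.tile([0.0, 0.0, 0.9], (num_frames, 1))   # placeholder standing height

save_npz(
    "tpose_debug.npz",                                     # placeholder output path
    smplx_trans=static_trans,
    smplx_poses=t_pose,
    smplx_root_orient=zero_root,
)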