
PyMotion: A Python Library for Motion Data

PyMotion is a Python library that provides various functions for manipulating and processing motion data in NumPy or PyTorch. It is designed to facilitate the development of neural networks for character animation.

Some features of PyMotion are:

* Reading and writing BVH files
* Forward kinematics and skeleton operations
* Multiple rotation representations: quaternions, dual quaternions, rotation matrices, and the continuous 6D representation
* Equivalent NumPy and PyTorch implementations of most operations
* Motion visualization with a Plotly/Dash viewer or in Blender

Contents

  1. Installation
  2. Examples
  3. Roadmap
  4. License

Installation

  1. [Optional] Install PyTorch with pip, as instructed on the PyTorch website.

  2. Install PyMotion:

    pip install upc-pymotion
  3. [Optional] Install Plotly and Dash for the visualizer:

    pip install upc-pymotion[viewer]
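To confirm the installation, the core modules can be imported directly. This is a minimal check that only uses modules appearing in the examples below:

```python
# Quick sanity check: these imports should succeed after installation
from pymotion.io.bvh import BVH
import pymotion.rotations.quat as quat
import pymotion.ops.forward_kinematics as fk

print("PyMotion imports OK:", BVH, quat, fk)
```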

Examples

Read and save a BVH file

```python
import numpy as np
from pymotion.io.bvh import BVH

bvh = BVH()
bvh.load("test.bvh")

print(bvh.data["names"])
# Example Output: ['Hips', 'LeftHip', 'LeftKnee', 'LeftAnkle', 'LeftToe', 'RightHip', 'RightKnee', 'RightAnkle', 'RightToe', 'Chest', 'Chest3', 'Chest4', 'Neck', 'Head', 'LeftCollar', 'LeftShoulder', 'LeftElbow', 'LeftWrist', 'RightCollar', 'RightShoulder', 'RightElbow', 'RightWrist']

# Move root joint to (0, 0, 0)
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
local_positions[:, 0, :] = np.zeros((local_positions.shape[0], 3))
bvh.set_data(local_rotations, local_positions)

# Scale the skeleton
bvh.set_scale(0.75)

bvh.save("test_out.bvh")
```
Compute world positions and rotations from a BVH file
**NumPy**

```python
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk

bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
global_positions = local_positions[:, 0, :]  # root joint

pos, rotmats = fk(local_rotations, global_positions, offsets, parents)
```

**PyTorch**

```python
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics_torch import fk
import torch

bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
global_positions = local_positions[:, 0, :]  # root joint

pos, rotmats = fk(
    torch.from_numpy(local_rotations),
    torch.from_numpy(global_positions),
    torch.from_numpy(offsets),
    torch.from_numpy(parents),
)
```
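Since fk returns plain arrays (or tensors), derived quantities such as joint velocities can be computed directly from its output. A minimal NumPy sketch reusing pos from the snippet above; the frame rate constant is an assumption and should be read from the header of your BVH file:

```python
import numpy as np

fps = 30  # assumed frame rate of the clip

# Finite-difference world-space joint velocities, shape: (frames - 1, joints, 3)
velocities = np.diff(pos, axis=0) * fps

# Mean speed of the root joint over the clip
root_speed = np.linalg.norm(velocities[:, 0, :], axis=-1).mean()
print(root_speed)
```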
Quaternion conversion to other representations
**NumPy**

```python
import pymotion.rotations.quat as quat
import numpy as np

angles = np.array([np.pi / 2, np.pi, np.pi / 4])[..., np.newaxis]
# angles.shape = [3, 1]
axes = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# axes.shape = [3, 3]

q = quat.from_angle_axis(angles, axes)

rotmats = quat.to_matrix(q)

euler = quat.to_euler(q, np.array([["x", "y", "z"], ["z", "y", "x"], ["y", "z", "x"]]))
euler_degrees = np.degrees(euler)

scaled_axis = quat.to_scaled_angle_axis(q)
```

**PyTorch**

```python
import pymotion.rotations.quat_torch as quat
import numpy as np
import torch

angles = torch.Tensor([torch.pi / 2, torch.pi, torch.pi / 4]).unsqueeze(-1)
# angles.shape = [3, 1]
axes = torch.Tensor([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# axes.shape = [3, 3]

q = quat.from_angle_axis(angles, axes)

rotmats = quat.to_matrix(q)

euler = quat.to_euler(q, np.array([["x", "y", "z"], ["z", "y", "x"], ["y", "z", "x"]]))
euler_degrees = torch.rad2deg(euler)

scaled_axis = quat.to_scaled_angle_axis(q)
```
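The rotation matrices produced by to_matrix can be applied to vectors with plain NumPy, which makes for a quick sanity check of the conversions above. A minimal sketch reusing rotmats from the NumPy snippet, assuming the usual column-vector convention:

```python
import numpy as np

v = np.array([0.0, 1.0, 0.0])

# rotmats[0] is a 90-degree rotation about the x axis: it should map (0, 1, 0) to (0, 0, 1)
print(rotmats[0] @ v)

# rotmats[1] is a 180-degree rotation about the y axis: (0, 1, 0) lies on the axis and is unchanged
print(rotmats[1] @ v)
```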
Root-centered dual quaternions from a BVH file
**NumPy**

```python
from pymotion.io.bvh import BVH
import pymotion.ops.skeleton as sk
import numpy as np

bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()

root_dual_quats = sk.to_root_dual_quat(
    local_rotations, local_positions[:, 0, :], parents, offsets
)

local_translations, local_rotations = sk.from_root_dual_quat(root_dual_quats, parents)
global_positions = local_translations[:, 0, :]
offsets = local_translations.copy()
offsets[:, 0, :] = np.zeros((offsets.shape[0], 3))
```

**PyTorch**

```python
from pymotion.io.bvh import BVH
import pymotion.ops.skeleton_torch as sk
import torch

bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()

root_dual_quats = sk.to_root_dual_quat(
    torch.from_numpy(local_rotations),
    torch.from_numpy(local_positions[:, 0, :]),
    torch.from_numpy(parents),
    torch.from_numpy(offsets),
)

local_translations, local_rotations = sk.from_root_dual_quat(root_dual_quats, parents)
global_positions = local_translations[:, 0, :]
offsets = local_translations.clone()
offsets[:, 0, :] = torch.zeros((offsets.shape[0], 3))
```
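The reconstructed rotations, root positions, and offsets can then be pushed through forward kinematics to recover world-space joint positions. A minimal NumPy sketch reusing the variables from the snippet above; it assumes fk accepts the per-frame offsets produced by from_root_dual_quat:

```python
from pymotion.ops.forward_kinematics import fk

# World-space joint positions recovered from the root-centered dual quaternions
pos, rotmats = fk(local_rotations, global_positions, offsets, parents)
print(pos.shape)  # expected: (frames, joints, 3)
```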
6D representation from a BVH file
**NumPy**

```python
from pymotion.io.bvh import BVH
import pymotion.rotations.ortho6d as sixd

bvh = BVH()
bvh.load("test.bvh")
local_rotations, _, _, _, _, _ = bvh.get_data()

continuous = sixd.from_quat(local_rotations)

local_rotations = sixd.to_quat(continuous)
```

**PyTorch**

```python
from pymotion.io.bvh import BVH
import pymotion.rotations.ortho6d_torch as sixd
import torch

bvh = BVH()
bvh.load("test.bvh")
local_rotations, _, _, _, _, _ = bvh.get_data()

continuous = sixd.from_quat(torch.from_numpy(local_rotations))

local_rotations = sixd.to_quat(continuous)
```
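Because a quaternion and its negation represent the same rotation, a round-trip check of the 6D conversion is easiest to do on rotation matrices. A minimal NumPy sketch, self-contained apart from the test.bvh file used throughout these examples:

```python
import numpy as np
from pymotion.io.bvh import BVH
import pymotion.rotations.ortho6d as sixd
import pymotion.rotations.quat as quat

bvh = BVH()
bvh.load("test.bvh")
original_rotations, _, _, _, _, _ = bvh.get_data()

continuous = sixd.from_quat(original_rotations)
recovered_rotations = sixd.to_quat(continuous)

# Compare rotation matrices to sidestep the q / -q sign ambiguity
print(np.allclose(quat.to_matrix(original_rotations), quat.to_matrix(recovered_rotations), atol=1e-6))
```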
Visualize motion in Python
```python
from pymotion.render.viewer import Viewer
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk

bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, _, _ = bvh.get_data()
global_positions = local_positions[:, 0, :]  # root joint
pos, rotmats = fk(local_rotations, global_positions, offsets, parents)

viewer = Viewer(use_reloader=True, xy_size=5)
viewer.add_skeleton(pos, parents)
# add additional info using add_sphere(...) and/or add_line(...), examples:
# viewer.add_sphere(sphere_pos, color="green")
# viewer.add_line(start_pos, end_pos, color="green")
viewer.add_floor()
viewer.run()
```
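The add_sphere and add_line calls mentioned in the comments above can be used to highlight joints of interest; such calls should go before viewer.run(). A minimal sketch reusing pos and viewer from the snippet above (the joint indices are arbitrary):

```python
frame = 0
root_pos = pos[frame, 0]   # root joint at the chosen frame
last_pos = pos[frame, -1]  # an arbitrary joint, here the last one in the hierarchy

viewer.add_sphere(root_pos, color="green")
viewer.add_line(root_pos, last_pos, color="green")
```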
Visualize a pose in Blender
1. Open the Text Editor window in Blender.

2. Open the file ```blender/pymotion_blender.py```, which can be found in this repository.

3. Run the script (Blender will freeze).

![Blender script image](docs/img/blender_script.png)

4. Run the following Python code in a separate environment:

```python
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk
from pymotion.visualizer.blender import BlenderConnection

bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
global_positions = local_positions[:, 0, :]  # root joint
pos, _ = fk(local_rotations, global_positions, offsets, parents)

# Render points
frame = 0
conn = BlenderConnection("127.0.0.1", 2222)
conn.render_points(pos[0])
conn.close()
```

5. Press the ESC key in Blender to stop the server.

Roadmap

This repository is authored and maintained by Jose Luis Ponton as part of his Ph.D.

Features will be added as new operations or rotation representations are needed in the development of research projects. Here is a list of possible features and improvements for the future:

License

This work is licensed under the MIT license. Please see the LICENSE file for further details.