This repository provides the Fourier Latent Dynamics (FLD) algorithm, which represents high-dimensional, long-horizon, highly nonlinear, periodic or quasi-periodic data in a continuously parameterized space. This work demonstrates its representation and generation capability with a robotic motion tracking task on the MIT Humanoid using NVIDIA Isaac Gym.
Paper: FLD: Fourier Latent Dynamics for Structured Motion Representation and Learning
Project website: https://sites.google.com/view/iclr2024-fld/home
Maintainer: Chenhao Li
Affiliation: Biomimetic Robotics Lab, Massachusetts Institute of Technology
Contact: chenhli@mit.edu
Create a new python virtual environment with python 3.8
Install pytorch 1.10 with cuda-11.3
pip3 install torch==1.10.0+cu113 torchvision==0.11.1+cu113 torchaudio==0.10.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
Install Isaac Gym
Download and install Isaac Gym Preview 4
cd isaacgym/python
pip install -e .
Try running an example
cd examples
python 1080_balls_of_solitude.py
For troubleshooting, check docs in isaacgym/docs/index.html
Install humanoid_gym
git clone https://github.com/mit-biomimetics/fld.git
cd fld
pip install -e .
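As a quick sanity check of the installation, the following minimal snippet (not part of the repository) verifies that Isaac Gym and the expected PyTorch build are importable. Note that isaacgym must be imported before torch.

import isaacgym  # noqa: F401  -- Isaac Gym must be imported before torch
import torch

print(torch.__version__)          # expected: 1.10.0+cu113
print(torch.version.cuda)         # expected: 11.3
print(torch.cuda.is_available())  # True if the CUDA driver and runtime are set up correctly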
- The reference motion datasets are stored under resources/robots/mit_humanoid/datasets/misc. 10 trajectories of 240 frames for each motion are stored in a separate .pt file with the format motion_data_<motion_name>.pt.
- The state dimension indices are specified in reference_state_idx_dict.json under resources/robots/mit_humanoid/datasets/misc.
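For reference, these files can be inspected with a minimal sketch like the one below. It assumes each .pt file stores a tensor shaped (num_trajectories, num_frames, state_dim), i.e. (10, 240, state_dim) here, and that the JSON maps state names to dimension indices; the motion name is a placeholder.

import json
import torch

data_dir = "resources/robots/mit_humanoid/datasets/misc"

# Placeholder motion name; substitute an actual <motion_name> from the folder.
motion = torch.load(f"{data_dir}/motion_data_<motion_name>.pt", map_location="cpu")
print(motion.shape)  # assumed: (10, 240, state_dim)

with open(f"{data_dir}/reference_state_idx_dict.json") as f:
    state_idx_dict = json.load(f)
print(state_idx_dict)  # state names -> indices into the last dimension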
- The environment is defined by an environment file mit_humanoid.py and a config file mit_humanoid_config.py under humanoid_gym/envs/mit_humanoid/. The config file sets both the environment parameters in class MITHumanoidFlatCfg and the training parameters in class MITHumanoidFlatCfgPPO.
Train FLD with:
python scripts/fld/experiment.py
- history_horizon denotes the window size of the input data. A good practice is to set it such that it contains at least one period of the motion.
- forecast_horizon denotes the number of future steps to predict while maintaining the quasi-constant latent parameterization. For motions with high aperiodicity, this value should be set small. FLD falls back to PAE when forecast_horizon is set to 1.
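To make the two horizons concrete, here is a schematic slicing of a single trajectory (a sketch only, not the repository's data pipeline; the horizon values and the state dimension of 54 are illustrative):

import torch

history_horizon = 51    # window size; should cover at least one period of the motion
forecast_horizon = 10   # number of future steps predicted under quasi-constant latent parameters

trajectory = torch.randn(240, 54)  # stand-in for one motion trajectory (num_frames, state_dim)
t = 60                             # arbitrary window start

input_window = trajectory[t : t + history_horizon]
# Shifted future windows that the same latent parameterization is asked to explain:
future_windows = [trajectory[t + k : t + k + history_horizon] for k in range(1, forecast_horizon + 1)]
print(input_window.shape, len(future_windows))  # torch.Size([51, 54]) 10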
- The training logs and plots are saved under logs/<experiment_name>/fld/misc/. The figures include the FLD loss, the reconstruction of sampled trajectories for each motion, the latent parameters in each latent channel along sampled trajectories for each motion together with the formed latent manifold, and the latent parameter distribution.
- The trained model is saved in logs/<experiment_name>/fld/misc/model_<iteration>.pt, where <experiment_name> is defined in the experiment config.
- Run tensorboard --logdir logs/<experiment_name>/fld/misc/ --samples_per_plugin images=100 to visualize the training loss and plots.
- A statistics.pt file is saved in the same folder, containing the mean and standard deviation of the input data and the statistics of the latent parameterization space. This file is used to normalize the input data and to define plotting ranges during policy training.
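A minimal sketch of inspecting statistics.pt; the stored key names are not documented here, so they are printed first rather than assumed:

import torch

stats = torch.load("logs/<experiment_name>/fld/misc/statistics.pt", map_location="cpu")
print(stats.keys() if isinstance(stats, dict) else type(stats))  # inspect what is actually stored
# With the mean/std entries identified above, raw input data can be normalized as (data - mean) / std.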
Evaluate the trained FLD model with:
python scripts/fld/evaluate.py
- A latent_params.pt file is saved in the same folder, containing the latent parameters of the input data. This file is used to define the input data for policy training with the offline task sampler.
- A gmm.pt file is saved in the same folder, containing the Gaussian Mixture Model (GMM) of the latent parameters. This file is used to define the input data distribution for policy training with the offline GMM task sampler.
- The decoded motions are saved to resources/robots/mit_humanoid/datasets/decoded/motion_data.pt.
- Figure 1 shows the latent sample and the reconstructed motion trajectory. Figure 2 shows the sampled latent parameters. Figure 3 shows the latent manifold of the sampled trajectory, along with the original ones. Figure 4 shows the GMM of the latent parameters.
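These artifacts can be inspected the same way; the sketch below only loads them and prints their types, since the exact storage format (e.g. whether gmm.pt holds a torch.distributions mixture or raw parameters) should be confirmed before use:

import torch

log_dir = "logs/<experiment_name>/fld/misc"
latent_params = torch.load(f"{log_dir}/latent_params.pt", map_location="cpu")
gmm = torch.load(f"{log_dir}/gmm.pt", map_location="cpu")

print(type(latent_params), type(gmm))
# If gmm turns out to be a torch.distributions mixture, new latent parameters could be
# drawn with gmm.sample((num_samples,)).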
Preview the motions with:
python scripts/fld/preview.py
- To preview a reference motion, set motion_file to the corresponding motion file.
- To preview motions reconstructed from (optionally modified) latent parameters, set PLAY_LOADED_DATA to False. The modified latent parameters are then decoded to the original motion space and visualized.
Train the policy with:
python scripts/train.py --task mit_humanoid
- The environment and training parameters are set in humanoid_gym/envs/mit_humanoid/mit_humanoid_config.py.
- Set MITHumanoidFlatCfgPPO.runner.task_sampler_class_name to OfflineSampler, GMMSampler, RandomSampler or ALPGMMSampler to choose the task sampler (see the sketch after this list).
- The trained policy is saved in logs/<experiment_name>/<date_time>_<run_name>/model_<iteration>.pt, where <experiment_name> and <run_name> are defined in the train config.
- To run headless (without rendering), add --headless.
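For example, selecting the offline task sampler is a one-line setting. The sketch below applies it programmatically; the module path is inferred from the file location above, and the string value is an assumption based on the attribute name (the config file itself can simply be edited instead):

# Assumed module path, derived from humanoid_gym/envs/mit_humanoid/mit_humanoid_config.py
from humanoid_gym.envs.mit_humanoid.mit_humanoid_config import MITHumanoidFlatCfgPPO

# Choose the task sampler used during policy training.
MITHumanoidFlatCfgPPO.runner.task_sampler_class_name = "OfflineSampler"
# Alternatives: "GMMSampler", "RandomSampler", "ALPGMMSampler"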
Play a trained policy with:
python scripts/play.py --load_run "<date_time>_<run_name>"
- The run and model iteration to load can be selected by setting load_run and checkpoint in the train config.
- The target motions are loaded from the datasets under datasets_root. These motions are first encoded to the latent space and then sent to the policy for execution.
- The latent dynamics prediction error during execution is reported as dynamics_error.
- If you encounter RuntimeError: nvrtc: error: invalid value for --gpu-architecture (-arch), try removing the @torch.jit.script decorators in isaacgym/python/isaacgym/torch_utils.py.
- The ALPGMMSampler utilizes faiss for efficient similarity search and clustering of dense vectors in the latent parameterization space. The installation of faiss requires a compatible CUDA version. The current implementation is tested with faiss-cpu and faiss-gpu with cuda-10.2.
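For context, the kind of nearest-neighbor query faiss provides over latent parameter vectors looks like the following (a generic illustration, not the repository's ALPGMMSampler code; the dimensionality is arbitrary):

import faiss
import numpy as np

d = 8                                              # illustrative latent parameter dimensionality
latent_params = np.random.rand(1000, d).astype("float32")

index = faiss.IndexFlatL2(d)                       # exact L2 nearest-neighbor index
index.add(latent_params)
distances, neighbors = index.search(latent_params[:5], 10)  # 10 nearest neighbors of the first 5 vectors
print(neighbors.shape)                             # (5, 10)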
@article{li2024fld,
title={FLD: Fourier Latent Dynamics for Structured Motion Representation and Learning},
author={Li, Chenhao and Stanger-Jones, Elijah and Heim, Steve and Kim, Sangbae},
journal={arXiv preprint arXiv:2402.13820},
year={2024}
}
The code is built upon the open-source Periodic Autoencoder (PAE) implementation, Isaac Gym Environments for Legged Robots, and the PPO implementation. We refer to the original repositories for more details.