jnnan / trumans_utils

Repository of TRUMANS

Scaling Up Dynamic Human-Scene Interaction Modeling

This is the code repository for Scaling Up Dynamic Human-Scene Interaction Modeling, presented at CVPR 2024 (Highlight).

arXiv | Project Page | Dataset | Demo

News

https://github.com/user-attachments/assets/3a510469-0146-4e14-a259-54366168001a

Human Motion Synthesis in Editable Indoor Scenes

This project implements our method for synthesizing human motion in an indoor scene along a user-defined trajectory. The furniture configuration of the scene can be edited freely by the user. A Flask application provides the interactive input and visualization.

Features

Getting Started

Prerequisites

To run the application, you need Python 3 installed. The required Python packages are listed in requirements.txt and are installed in step 3 below.

Installation

  1. Clone the Repository:

    git clone https://github.com/jnnan/trumans_utils.git
    cd trumans_utils
  2. Download Checkpoints, Data, and SMPL-X Models:

    • Download the necessary files and folders from this link.
    • Extract trumans_demo.zip, and place the four folders at the root of the project directory (./trumans_utils).
  3. Install Python Packages:

    pip install -r requirements.txt

Running the Application

To start the Flask application:

python3 -m flask run --host=0.0.0.0

The application will be available at http://127.0.0.1:5000.

Usage

  1. Open your web browser and navigate to http://127.0.0.1:5000.
  2. You will see an interface where you can edit the indoor scene configuration.
  3. Draw a trajectory within the scene.
  4. The application will synthesize human motion based on the drawn trajectory and display it within the scene.
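As a rough illustration, a drawn trajectory could also be submitted to the running app programmatically. The route name /synthesize and the JSON payload below are hypothetical; the Flask app's own interface is authoritative.

```python
import json
import urllib.request

def build_trajectory_request(points, host="http://127.0.0.1:5000"):
    """Build a POST request carrying a drawn trajectory as JSON.

    The endpoint name and payload shape are illustrative only; adapt
    them to the routes actually defined by the Flask app.
    """
    payload = json.dumps({"trajectory": points}).encode("utf-8")
    return urllib.request.Request(
        f"{host}/synthesize",  # hypothetical route
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Sending the request with urllib.request.urlopen would then return whatever the app serves for that route.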

Training

Overview

This README provides instructions on setting up and training the TRUMANS model using the provided dataset.

Prerequisites

Before you begin, make sure you have Python 3 installed, then install the required Python packages:

    pip install -r requirements.txt

Dataset Setup

  1. Download the TRUMANS dataset from the provided link.
  2. Place the dataset files in the following directory within your project:
    ./trumans/Data_release

Configuration

Set the ROOT_DIR environment variable to the absolute path of the ./trumans directory in your system. This can be done by adding the following line to your .bashrc or .bash_profile:

export ROOT_DIR='/absolute/path/to/trumans'

Make sure to replace /absolute/path/to/trumans with the actual path to the trumans folder on your machine.
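In Python, the training code can then pick up this variable through os.environ. The helper below is a sketch of that lookup; the function name is illustrative and not part of the repository.

```python
import os
from pathlib import Path

def get_root_dir(default="."):
    """Resolve the dataset root from the ROOT_DIR environment variable.

    Falls back to the current directory if ROOT_DIR is unset
    (illustrative behavior; the training scripts may simply require it).
    """
    root = os.environ.get("ROOT_DIR", default)
    return Path(root).expanduser().resolve()
```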

Model Training

Navigate to the trumans directory:

cd trumans

To start training the model, run the training script from the command line:

python train_synhsi.py

The training script will automatically load the dataset from Data_release, set up the model, and commence training using the configurations in the ./trumans/config folder.
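Before launching a long run, it can help to verify that the expected layout is in place. The check below is a sketch based only on the directories mentioned above (Data_release and config); the helper name is illustrative.

```python
import os
from pathlib import Path

def missing_dirs(root):
    """Return the names of expected subdirectories absent under root.

    Checks for Data_release (dataset) and config (training
    configurations), per the layout described in this README.
    """
    root = Path(root)
    return [name for name in ("Data_release", "config")
            if not (root / name).is_dir()]

if __name__ == "__main__":
    gaps = missing_dirs(os.environ.get("ROOT_DIR", "."))
    if gaps:
        print("Missing directories:", ", ".join(gaps))
```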

Annotation

The dataset includes an action_label.npy file containing frame-wise annotations for the motions. The labels correspond to the type of interaction and are indexed as follows:

Interaction Type    Label
Lie down            0
Squat               1
Mouse               2
Keyboard            3
Laptop              4
Phone               5
Book                6
Bottle              7
Pen                 8
Vase                9

These labels provide supervision during training. To visualize generated motion conditioned on a predefined label, see the demo section above and modify line 81 of sample_hsi.py accordingly.
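For convenience when inspecting action_label.npy, the table above can be kept as a small lookup. The dict below simply mirrors the published indexing; the helper name is illustrative.

```python
# Frame-wise interaction labels used in action_label.npy,
# mirroring the table in this README.
ACTION_LABELS = {
    0: "Lie down",
    1: "Squat",
    2: "Mouse",
    3: "Keyboard",
    4: "Laptop",
    5: "Phone",
    6: "Book",
    7: "Bottle",
    8: "Pen",
    9: "Vase",
}

def label_name(idx):
    """Return the interaction name for a label index."""
    return ACTION_LABELS[idx]
```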

TRUMANS Dataset

Please download the TRUMANS dataset from Google Drive. The contents at the download link are updated continuously, ensuring access to the most recent data.

Explanation of the files and folders of the TRUMANS dataset:

Note: The data associated with action labels and 2D rendering will be uploaded soon.

Citation

@inproceedings{jiang2024scaling,
  title={Scaling up dynamic human-scene interaction modeling},
  author={Jiang, Nan and Zhang, Zhiyuan and Li, Hongjie and Ma, Xiaoxuan and Wang, Zan and Chen, Yixin and Liu, Tengyu and Zhu, Yixin and Huang, Siyuan},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={1737--1747},
  year={2024}
}