Accompanying PyTorch code for the paper "3D Human Pose Estimation with 2D Marginal Heatmaps".
Requirements: Docker, Docker Compose, and an NVIDIA GPU with GPU-enabled Docker (for example via nvidia-docker), since training and evaluation are run inside GPU-enabled containers.
Copy `docker-compose.yml.example` to `docker-compose.yml`. At this stage `docker-compose.yml` will contain example volume mounts for the datasets. You will need to edit the entries for the datasets that you have prepared, and remove the others. For example, if you wish to use the MPI-INF-3DHP dataset, you must replace `/host/path/to/mpi3d` with the actual path to the prepared MPI-INF-3DHP data on your computer. You only need to prepare the datasets that you are interested in using.
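As an illustration only, a dataset volume section in `docker-compose.yml` might end up looking like the sketch below. The service name and overall layout come from whatever `docker-compose.yml.example` provides (the `main` service shown here is an assumption); only the host-side paths on the left of each mapping are yours to change:

```yaml
services:
  main:                                        # assumed service name; use the one from the example file
    volumes:
      - /data/h36m/processed:/datasets/h36m    # host path : path inside the container
      - /data/mpi3d/processed:/datasets/mpi3d
      - /data/mpii:/datasets/mpii
```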
Human3.6M: Edit `docker-compose.yml` so that the absolute location of the `processed/` directory created by h36m-fetch is bound to `/datasets/h36m` inside the Docker container.

MPI-INF-3DHP: Run the `src/margipose/bin/preprocess_mpi3d.py` script to preprocess the data (see the example invocation after the MPII steps below), then edit `docker-compose.yml` so that the absolute location of the processed MPI-INF-3DHP data is bound to `/datasets/mpi3d` inside the Docker container.

MPII Human Pose: Edit `docker-compose.yml` so that the desired installation directory for the MPII Human Pose dataset is bound to `/datasets/mpii` inside the Docker container, then install the dataset from inside the container:

    $ ./run.sh bash
    $ chmod 777 -R /datasets/mpii
    $ python
    >>> from torchdata import mpii
    >>> mpii.install_mpii_dataset('/datasets/mpii')
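For reference, the MPI-INF-3DHP preprocessing script can be invoked through the `run.sh` launcher described below. Its exact command-line arguments (if any) are not documented here, so treat this as a sketch of the invocation pattern rather than the definitive command:

$ ./run.sh python src/margipose/bin/preprocess_mpi3d.py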
Showoff is a display server which allows you to visualise model training progression. The following steps guide you through starting a Showoff server and configuring MargiPose to use it.
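The steps below call for randomly generated secrets; one convenient way to produce such a value (any equivalent method works) is:

$ openssl rand -hex 32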
1. Set `POSTGRES_PASSWORD` in `showoff/postgres.env`. Using a randomly generated password is recommended.
2. Set `COOKIE_SECRET` in `showoff/showoff.env`. Once again, using a randomly generated value is recommended.
3. Run `docker-compose up -d showoff`. This will start the Showoff server.
4. Open `showoff/showoff-client.env` in a text editor and configure the Showoff client settings in `showoff-client.env` (you will need to uncomment the appropriate lines).

A `run.sh` launcher script is provided, which will run any command inside a Docker container that has all of MargiPose's dependencies installed. Here are a few examples:
Train a MargiPose model on the MPI-INF-3DHP dataset:
./run.sh margipose train with margipose_model mpi3d
Train without pixel-wise loss term:
./run.sh margipose train with margipose_model mpi3d "model_desc={'settings': {'pixelwise_loss': None}}"
Evaluate a model's test set performance using the second GPU:
./run.sh margipose --device=cuda:1 eval --model margipose-mpi3d.pth --dataset mpi3d-test
Explore qualitative results with a GUI:
./run.sh margipose gui --model margipose-mpi3d.pth --dataset mpi3d-test
Run the project unit tests:
./run.sh pytest
Pretrained models are available for download. You can try out a pretrained model like so:
./run.sh margipose infer --model margipose-mpi3d.pth --image resources/man_running.jpg
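If you want a quick look at what a downloaded checkpoint contains outside of the `margipose` CLI, a plain PyTorch load is enough. This sketch assumes nothing about MargiPose's own model-loading helpers and should be run where the `margipose` package is importable (for example inside the `run.sh` container), since the checkpoint may reference its classes:

```python
import torch

# Load the checkpoint onto the CPU purely for inspection.
ckpt = torch.load('margipose-mpi3d.pth', map_location='cpu')
print(type(ckpt))  # e.g. a model instance or a dict of weights, depending on how it was saved
```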
(C) 2018 Aiden Nibali
This project is open source under the terms of the Apache License 2.0.
If you use any part of this work in a research project, please cite the following paper:
@article{nibali2018margipose,
title={3D Human Pose Estimation with 2D Marginal Heatmaps},
author={Nibali, Aiden and He, Zhen and Morgan, Stuart and Prendergast, Luke},
journal={arXiv preprint arXiv:1806.01484},
year={2018}
}