Yana Hasson, Gül Varol, Dimitris Tzionas, Igor Kalevatykh, Michael J. Black, Ivan Laptev, Cordelia Schmid, CVPR 2019
This code generates synthetic images of hands holding objects, as in the ObMan dataset.
In addition, hands-only images can be generated, with hand poses sampled randomly from the MANO hand pose space.
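To illustrate what sampling from the MANO pose space amounts to, here is a minimal sketch using the MANO Python code (`webuser/smpl_handpca_wrapper.py`, patched for Python 3 as described below); the path, number of PCA components, and coefficient scaling are illustrative assumptions, not the renderer's actual settings:

```python
import numpy as np
from webuser.smpl_handpca_wrapper import load_model

# Load the right-hand MANO model with 45 pose PCA components
# (illustrative path and ncomps, not the renderer's settings).
model = load_model('/path/to/mano_v1_2/models/MANO_RIGHT.pkl', ncomps=45)

# pose[:3] is the global rotation; the remaining entries are
# coefficients in the MANO pose PCA space.
model.pose[3:] = np.random.randn(45)

vertices = model.r  # posed hand mesh vertices, shape (778, 3)
```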
Examples of rendered images: *Hands+Objects* and *Hands* (example figures omitted).
Rendering generates RGB images together with the corresponding segmentation and depth maps.
For additional information about the project, see the ObMan project page.
Download Blender (version 2.78c, for instance) and untar it:

```sh
wget https://download.blender.org/release/Blender2.78/blender-2.78c-linux-glibc219-x86_64.tar.bz2
tar -xvf blender-2.78c-linux-glibc219-x86_64.tar.bz2
```

Install pip into Blender's bundled Python:

```sh
wget https://bootstrap.pypa.io/get-pip.py
blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m get-pip.py
path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/lib/python3.5/ensurepip
path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/pip3 install --upgrade pip
```

Install the dependencies:

```sh
path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/pip install -r requirements.txt
```
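To double-check that the packages land in Blender's bundled interpreter rather than your system Python, a quick sketch (the exact paths depend on where you untarred Blender):

```python
# Save as check_env.py and run it with Blender's interpreter:
#   path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m check_env.py
import sys

print(sys.executable)  # should point inside the Blender 2.78 tree
print(sys.version)     # Blender 2.78c bundles Python 3.5
```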
Clone the repository:

```sh
git clone https://github.com/hassony2/obman_render
cd obman_render
```
Download the SMPL data from SURREAL (replace `username` and `password` with your SURREAL dataset credentials):

```sh
cd download
sh download_smpl_data.sh ../assets username password
cd ..
```
Download the MANO model files from the MANO website and point the code to their location:

```sh
export MANO_LOCATION=/path/to/mano_v*_*
```
Remove the `print 'FINITO'` statement at the end of `webuser/smpl_handpca_wrapper.py` (line 144):

```diff
- print 'FINITO'
```

Replace `import cPickle as pickle` with `import pickle` in `webuser/smpl_handpca_wrapper.py` (line 23) and `webuser/serialization.py` (line 30):

```diff
- import cPickle as pickle
+ import pickle
```
In `webuser/smpl_handpca_wrapper.py` (line 74), fix the pickle loading for Python 3:

```diff
- smpl_data = pickle.load(open(fname_or_dict))
+ smpl_data = pickle.load(open(fname_or_dict, 'rb'), encoding='latin1')
```
In `webuser/serialization.py` (line 90), likewise:

```diff
- dd = pickle.load(open(fname_or_dict))
+ dd = pickle.load(open(fname_or_dict, 'rb'), encoding='latin1')
```
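The `encoding='latin1'` argument is what lets Python 3 read these Python 2 pickles, which contain NumPy buffers that the default decoding rejects. A standalone illustration, assuming the MANO models are downloaded:

```python
import pickle

# latin1 maps bytes 1:1, so Python 2 pickles containing NumPy
# arrays load cleanly under Python 3.
with open('/path/to/mano_v1_2/models/MANO_RIGHT.pkl', 'rb') as f:
    data = pickle.load(f, encoding='latin1')

print(sorted(data.keys()))
```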
In `webuser/smpl_handpca_wrapper.py` (lines 81-84), replace the hard-coded MANO paths:

```diff
- with open('/is/ps2/dtzionas/mano/models/MANO_LEFT.pkl', 'rb') as f:
-     hand_l = load(f)
- with open('/is/ps2/dtzionas/mano/models/MANO_RIGHT.pkl', 'rb') as f:
-     hand_r = load(f)
+ with open('/path/to/mano_v*_*/models/MANO_LEFT.pkl', 'rb') as f:
+     hand_l = load(f, encoding='latin1')
+ with open('/path/to/mano_v*_*/models/MANO_RIGHT.pkl', 'rb') as f:
+     hand_r = load(f, encoding='latin1')
```
At the time of writing, the current MANO version is 1.2, so use:

```diff
- with open('/is/ps2/dtzionas/mano/models/MANO_LEFT.pkl', 'rb') as f:
-     hand_l = load(f)
- with open('/is/ps2/dtzionas/mano/models/MANO_RIGHT.pkl', 'rb') as f:
-     hand_r = load(f)
+ with open('/path/to/mano_v1_2/models/MANO_LEFT.pkl', 'rb') as f:
+     hand_l = load(f, encoding='latin1')
+ with open('/path/to/mano_v1_2/models/MANO_RIGHT.pkl', 'rb') as f:
+     hand_r = load(f, encoding='latin1')
```
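Since `MANO_LOCATION` is already exported above, a variant of this patch (a sketch, not part of the official instructions) can read the path from the environment instead of hard-coding it:

```python
import os
from pickle import load  # `load` refers to pickle loading in that file

mano_dir = os.environ['MANO_LOCATION']  # e.g. /path/to/mano_v1_2
with open(os.path.join(mano_dir, 'models', 'MANO_LEFT.pkl'), 'rb') as f:
    hand_l = load(f, encoding='latin1')
with open(os.path.join(mano_dir, 'models', 'MANO_RIGHT.pkl'), 'rb') as f:
    hand_r = load(f, encoding='latin1')
```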
Download and unzip "SMPL for Python users", then copy the `models` folder to `assets/models`. Note that all code and data from this download falls under the SMPL license.

Download the LSUN dataset following the official instructions.
Request data on the ObMan webpage and download the grasp and texture zips. You should receive two links that allow you to download `bodywithands.zip` and `shapenet_grasps.zip`.
Unzip the hand textures:

```sh
cd assets/textures
mv path/to/downloaded/bodywithands.zip .
unzip bodywithands.zip
cd ../..
```

Unzip the ShapeNet grasps:

```sh
cd assets/grasps
mv path/to/downloaded/shapenet_grasps.zip .
unzip shapenet_grasps.zip
cd ../..
```
Your file structure should look like this:

```
obman_render/
  assets/
    models/
      SMPLH_female.pkl
      basicModel_f_lbs_10_207_0_v1.0.2.fbx
      basicModel_m_lbs_10_207_0_v1.0.2.fbx
      ...
    grasps/
      shapenet_grasps/
      shapenet_grasps_splits.csv
    SURREAL/
      smpl_data/
        smpl_data.npz
        ...
```
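Before launching the rendering, a quick sanity check (the snippet below is hypothetical, not part of the repository) can confirm the assets landed in the right places:

```python
import os

# Paths taken from the directory layout above.
required = [
    'assets/models/SMPLH_female.pkl',
    'assets/grasps/shapenet_grasps_splits.csv',
    'assets/SURREAL/smpl_data/smpl_data.npz',
]
for path in required:
    status = 'ok' if os.path.exists(path) else 'MISSING'
    print(status, path)
```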
Render hands holding objects on white backgrounds:

```sh
path/to/blender -noaudio -t 1 -P blender_grasps_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["white"]}'
```

Render hands only on white backgrounds:

```sh
path/to/blender -noaudio -t 1 -P blender_hands_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["white"]}'
```

To composite over random LSUN and ImageNet backgrounds instead, pass the dataset paths:

```sh
path/to/blender -noaudio -t 1 -P blender_hands_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["lsun", "imagenet"], "imagenet_path": "/path/to/imagenet", "lsun_path": "/path/to/lsun"}'
path/to/blender -noaudio -t 1 -P blender_grasps_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["lsun", "imagenet"], "imagenet_path": "/path/to/imagenet", "lsun_path": "/path/to/lsun"}'
```
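The string after `--` is a JSON dictionary that the scripts parse for their configuration. If you script many runs, a small hypothetical launcher can build it programmatically (keys taken from the example invocations above):

```python
import json
import subprocess

# Configuration keys as used in the example invocations.
config = {
    'frame_nb': 10,
    'frame_start': 0,
    'results_root': 'datageneration/tmp',
    'background_datasets': ['white'],
}

subprocess.run([
    'path/to/blender', '-noaudio', '-t', '1',
    '-P', 'blender_grasps_sacred.py',
    '--', json.dumps(config),
], check=True)
```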
If you find this code useful for your research, consider citing:
```bibtex
@INPROCEEDINGS{hasson19_obman,
  title = {Learning joint reconstruction of hands and manipulated objects},
  author = {Hasson, Yana and Varol, G{\"u}l and Tzionas, Dimitris and Kalevatykh, Igor and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year = {2019}
}

@INPROCEEDINGS{varol17_surreal,
  title = {Learning from Synthetic Humans},
  author = {Varol, G{\"u}l and Romero, Javier and Martin, Xavier and Mahmood, Naureen and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year = {2017}
}

@ARTICLE{MANO:SIGGRAPHASIA:2017,
  title = {Embodied Hands: Modeling and Capturing Hands and Bodies Together},
  author = {Romero, Javier and Tzionas, Dimitrios and Black, Michael J.},
  journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
  publisher = {ACM},
  month = nov,
  year = {2017},
  url = {http://doi.acm.org/10.1145/3130800.3130883},
  month_numeric = {11}
}
```