MANO

Embodied Hands:

Modeling and Capturing Hands and Bodies Together

Javier Romero*, Dimitrios Tzionas* and Michael J Black

SIGGRAPH ASIA 2017, BANGKOK, THAILAND

Abstract

Humans move their hands and bodies together to communicate and solve tasks. Capturing and replicating such coordinated activity is critical for virtual characters that behave realistically. Surprisingly, most methods treat the 3D modeling and tracking of bodies and hands separately. Here we formulate a model of hands and bodies interacting together and fit it to full-body 4D sequences. When scanning or capturing the full body in 3D, hands are small and often partially occluded, making their shape and pose hard to recover. To cope with low-resolution, occlusion, and noise, we develop a new model called MANO (hand Model with Articulated and Non-rigid defOrmations). MANO is learned from around 1000 high-resolution 3D scans of hands of 31 subjects in a wide variety of hand poses. The model is realistic, low-dimensional, captures non-rigid shape changes with pose, is compatible with standard graphics packages, and can fit any human hand. MANO provides a compact mapping from hand poses to pose blend shape corrections and a linear manifold of pose synergies. We attach MANO to a standard parameterized 3D body shape model (SMPL), resulting in a fully articulated body and hand model (SMPL+H). We illustrate SMPL+H by fitting complex, natural activities of subjects captured with a 4D scanner. The fitting is fully automatic and results in full body models that move naturally with detailed hand motions and a realism not seen before in full body performance capture. The models and data are freely available for research purposes on our website (http://mano.is.tue.mpg.de).
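The abstract describes MANO as a low-dimensional model that combines identity-dependent shape blend shapes with pose-dependent corrective blend shapes before standard skinning. The sketch below illustrates that additive formulation with toy numpy arrays; the variable names (v_template, shapedirs, posedirs) follow the released model files, but the dimensions here are stand-ins and the final linear-blend-skinning step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions standing in for MANO's real ones (the released model has
# 778 vertices, 10 shape coefficients, and 15 hand joints whose rotation
# matrices minus identity give a 15*9 = 135-dim pose feature).
V, N_BETAS, N_POSE = 20, 10, 135

v_template = rng.standard_normal((V, 3))           # mean hand mesh T_bar
shapedirs  = rng.standard_normal((V, 3, N_BETAS))  # shape blend shapes B_S
posedirs   = rng.standard_normal((V, 3, N_POSE))   # pose blend shapes B_P

betas = rng.standard_normal(N_BETAS)         # identity coefficients
pose_feature = rng.standard_normal(N_POSE)   # flattened (R(theta) - I)

# Additive template: T(beta, theta) = T_bar + B_S(beta) + B_P(theta).
# The corrected template is then posed with linear blend skinning.
v_shaped = v_template + shapedirs @ betas
v_posed = v_shaped + posedirs @ pose_feature
print(v_posed.shape)  # (20, 3)
```

Each basis is contracted against its coefficient vector, so the pose-dependent corrections are a linear function of the pose feature, which is what makes the mapping from pose to blend-shape corrections compact.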

Video

More Information

News

Model/Code Versions:
v1.0 - 22 Nov 2017 - Initial model release. The shape space was not scaled.
v1.1 - 14 May 2018 - MANO PKL files changed: the shape space (shapedirs) was scaled for unit variance.
v1.2 - 16 Jan 2019 - No change in any model file. Some bugs in the code were fixed and a 3D viewer was added for visualization.
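The v1.1 note about "scaling the shape space (shapedirs) for unit variance" can be pictured as folding each PCA component's standard deviation into the basis, so that shape coefficients (betas) drawn from a standard normal produce plausible variation. The snippet below is only an illustration of that idea with made-up arrays; the names and file layout here are not the released format.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins: unscaled PCA directions and per-component standard
# deviations (illustrative names, not the released PKL contents).
V, K = 20, 10
shapedirs_unscaled = rng.standard_normal((V, 3, K))
component_std = rng.uniform(0.5, 2.0, K)

# Fold each component's std into the basis; betas then live in a
# unit-variance space.
shapedirs = shapedirs_unscaled * component_std

betas_unit = rng.standard_normal(K)   # unit-variance coefficients
offsets = shapedirs @ betas_unit      # (V, 3) per-vertex shape offsets

# The same offsets result from the unscaled basis with rescaled betas,
# which is why old and new coefficients are not interchangeable.
offsets_check = shapedirs_unscaled @ (component_std * betas_unit)
print(np.allclose(offsets, offsets_check))  # True
```

This is why betas fitted against a v1.0 model should not be fed directly into a v1.1 model: the coefficient spaces differ by a per-component scale.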

Nov 2021: We now support conversion between all the models in the SMPL family, i.e. SMPL, SMPL+H, SMPL-X. https://github.com/vchoutas/smplx/tree/master/transfer_model

Nov 2019: We uploaded MANO fits for the HCI dataset of Tzionas et al. IJCV'16, used in Hasson et al. ICCV'19.

Referencing the Dataset

Here is the BibTeX snippet for citing MANO in your work.

@article{MANO:SIGGRAPHASIA:2017,
  title = {Embodied Hands: Modeling and Capturing Hands and Bodies Together},
  author = {Romero, Javier and Tzionas, Dimitrios and Black, Michael J.},
  journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
  volume = {36},
  number = {6},
  pages = {245:1--245:17},
  month = nov,
  year = {2017},
  month_numeric = {11}
}

Papers that build on MANO

Below is a non-exhaustive, continuously updated list of papers that build on MANO. This list is MANO-focused; a much more exhaustive list is the awesome hand pose estimation list on GitHub (with which we do not aim to compete in any way).

If we have missed your work, please feel free to contact us with a brief description of your work (2-3 sentences, e.g. input, output, and key contribution).

Please note that we add only works that:

Inference: RGB-to-MANO (regression based)

Inference: RGB-to-MANO (optimization based)

Datasets that use MANO

Expressive human models that employ MANO & SMPL+H

Extending MANO with an appearance model