Perceiving Systems, Computer Vision

Learning joint reconstruction of hands and manipulated objects

2019-06-14


Estimating hand-object manipulation is essential for interpreting and imitating human actions. Previous work has made significant progress toward reconstructing hand poses and object shapes in isolation. Yet, reconstructing hands and objects during manipulation is a more challenging task due to significant occlusions of both the hand and the object. While presenting challenges, manipulation may also simplify the problem, since the physics of contact restricts the space of valid hand-object configurations. For example, during manipulation, the hand and object should be in contact but not interpenetrate. In this work, we regularize the joint reconstruction of hands and objects with manipulation constraints. We provide an end-to-end learnable model that exploits a novel contact loss favoring physically plausible hand-object constellations. To train and evaluate the model, we also provide a new large-scale synthetic dataset, ObMan, with hand-object manipulations. Our approach significantly improves grasp quality metrics over baselines on synthetic and real datasets, using RGB images as input.

Code, data, and models for our CVPR 2019 paper: https://ps.is.tuebingen.mpg.de/publications/hasson-cvpr-2019
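The contact loss mentioned in the abstract combines two opposing terms: an attraction term that pulls hand vertices near the object onto its surface, and a repulsion term that penalizes interpenetration. The sketch below illustrates this idea in a minimal setting; the sphere signed-distance function, the `contact_thresh` parameter, and the function names are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def sphere_sdf(points, center, radius):
    """Signed distance to a sphere: negative inside, positive outside."""
    return np.linalg.norm(points - center, axis=-1) - radius

def contact_loss(hand_verts, center, radius, contact_thresh=0.01):
    """Toy contact loss (illustrative, not the paper's implementation).

    Attraction: hand vertices within `contact_thresh` of the object
    surface are pulled onto it. Repulsion: vertices inside the object
    (negative signed distance) are penalized quadratically.
    """
    sdf = sphere_sdf(hand_verts, center, radius)
    # Repulsion: only vertices with sdf < 0 (inside the object) contribute.
    repulsion = np.sum(np.square(np.minimum(sdf, 0.0)))
    # Attraction: near-surface vertices are encouraged to touch the surface.
    near = (sdf > 0.0) & (sdf < contact_thresh)
    attraction = np.sum(np.square(sdf[near]))
    return attraction + repulsion
```

A vertex at the sphere's center yields a pure repulsion penalty, while a vertex far from the surface contributes nothing, so gradients only act where contact is plausible or penetration occurs.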

Author(s): Yana Hasson and Gül Varol and Dimitrios Tzionas and Igor Kalevatykh and Michael J. Black and Ivan Laptev and Cordelia Schmid
Department(s): Perceiving Systems
Research Project(s): Hands-Object Interaction
Publication(s): Learning Joint Reconstruction of Hands and Manipulated Objects
External Link: https://hassony2.github.io/obman.html