Perceiving Systems, Computer Vision

LEAP: Learning Articulated Occupancy of People

2021-06-18


LEAP (LEarning Articulated occupancy of People) is a novel neural occupancy representation of the human body; it is effectively an implicit version of SMPL. Given a set of bone transformations (i.e., joint locations and rotations) and a query point in space, LEAP first maps the query point to a canonical space via learned linear blend skinning (LBS) functions and then efficiently queries the occupancy value via an occupancy network that models accurate identity- and pose-dependent deformations in the canonical space.

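The query pipeline described above can be sketched in a few lines of PyTorch. This is not the authors' implementation; the module names, network sizes, and the shape/identity code are illustrative assumptions meant only to show how a posed-space query point is mapped to canonical space via predicted skinning weights and then evaluated by an occupancy network.

```python
import torch
import torch.nn as nn


class LEAPSketch(nn.Module):
    """Minimal sketch of the LEAP query pipeline (assumed architecture)."""

    def __init__(self, num_bones=24, feat_dim=128):
        super().__init__()
        # Hypothetical inverse-LBS weight predictor: query point + flattened
        # bone transformations -> per-bone skinning weights.
        self.lbs_weight_net = nn.Sequential(
            nn.Linear(3 + num_bones * 12, 256), nn.ReLU(),
            nn.Linear(256, num_bones), nn.Softmax(dim=-1),
        )
        # Hypothetical occupancy network evaluated in the canonical space,
        # conditioned on an identity/pose-dependent code.
        self.occupancy_net = nn.Sequential(
            nn.Linear(3 + feat_dim, 256), nn.ReLU(),
            nn.Linear(256, 1), nn.Sigmoid(),
        )

    def forward(self, x_posed, bone_transforms, shape_code):
        # x_posed:          (B, 3)       query points in posed space
        # bone_transforms:  (B, K, 4, 4) per-bone rigid transformations
        # shape_code:       (B, feat_dim) identity/pose code (assumed input)
        B, K = bone_transforms.shape[:2]
        bone_feat = bone_transforms[:, :, :3, :].reshape(B, -1)  # flatten 3x4 blocks
        w = self.lbs_weight_net(torch.cat([x_posed, bone_feat], dim=-1))  # (B, K)

        # Blend the inverses of the bone transformations (inverse LBS) and map
        # the posed-space query point back to the canonical space.
        inv_T = torch.inverse(bone_transforms)                    # (B, K, 4, 4)
        blended = (w[:, :, None, None] * inv_T).sum(dim=1)        # (B, 4, 4)
        ones = torch.ones_like(x_posed[:, :1])
        x_h = torch.cat([x_posed, ones], dim=-1)                  # homogeneous coords
        x_canonical = (blended @ x_h[:, :, None])[:, :3, 0]       # (B, 3)

        # Query occupancy at the canonicalized point.
        occ = self.occupancy_net(torch.cat([x_canonical, shape_code], dim=-1))
        return occ  # (B, 1) probability that the point lies inside the body
```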

Author(s): Marko Mihajlovic, Yan Zhang, Michael J. Black, Siyu Tang
Department(s): Perceiving Systems
Publication(s): LEAP: Learning Articulated Occupancy of People
Release Date: 2021-06-18
Repository: https://neuralbodies.github.io/LEAP/