SAMP Dataset
2021-08-17
The SAMP dataset is high-quality MoCap data covering various sitting, lying-down, walking, and running styles. We capture the motion of the body as well as the object.
Existing large-scale MoCap datasets are dominated by locomotion, and the few interaction examples they contain lack diversity. Additionally, traditional MoCap focuses on the body and rarely captures the scene. Hence, we capture a new dataset covering various human-scene interactions with multiple objects. In each motion sequence, we track both the body motion and the object using a high-resolution optical-marker MoCap system.
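Since each sequence pairs per-frame body motion with a tracked object pose, code consuming the data will typically iterate over frames of that joint representation. The sketch below is purely illustrative: the field names, array shapes, and dictionary layout are assumptions for demonstration, not the actual SAMP file format (see the external link for the released data and loaders).

```python
import numpy as np

# Illustrative sketch only: field names and shapes are assumptions,
# not the actual SAMP file format. It models the idea described above:
# per-frame body pose parameters plus a 6-DoF object transform.

def make_dummy_sequence(num_frames=4, num_joints=22):
    """Build a synthetic sequence mimicking an assumed layout."""
    rng = np.random.default_rng(0)
    return {
        "body_pose": rng.standard_normal((num_frames, num_joints, 3)),  # per-joint axis-angle
        "trans": rng.standard_normal((num_frames, 3)),                  # body root translation
        "object_rot": rng.standard_normal((num_frames, 3)),             # object rotation (axis-angle)
        "object_trans": rng.standard_normal((num_frames, 3)),           # object translation
    }

def frames(seq):
    """Yield (body_pose, body_trans, object_rot, object_trans) per frame."""
    num_frames = seq["body_pose"].shape[0]
    for t in range(num_frames):
        yield (seq["body_pose"][t], seq["trans"][t],
               seq["object_rot"][t], seq["object_trans"][t])

seq = make_dummy_sequence()
print(sum(1 for _ in frames(seq)))  # prints 4 (number of frames)
```

The point of the pairing is that body and object poses share a common timeline, so downstream models can reason about the interaction frame by frame.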
Author(s): Mohamed Hassan, Duygu Ceylan, Ruben Villegas, Jun Saito, Jimei Yang, Yi Zhou, Michael Black
Department(s): Perceiving Systems
Research Project(s): Putting People into Scenes
Publication(s): Stochastic Scene-Aware Motion Prediction
Release Date: 2021-08-17
External Link: https://samp.is.tue.mpg.de/