
DRAPE: DRessing Any PErson




We describe a complete system for animating realistic clothing on synthetic bodies of any shape and pose without manual intervention. The key component of the method is a model of clothing called DRAPE (DRessing Any PErson) that is learned from a physics-based simulation of clothing on bodies of different shapes and poses. The DRAPE model has the desirable property of "factoring" clothing deformations due to body shape from those due to pose variation. This factorization provides an approximation to the physical clothing deformation and greatly simplifies clothing synthesis. Given a parameterized model of the human body with known shape and pose parameters, we describe an algorithm that dresses the body with a garment that is customized to fit and possesses realistic wrinkles. DRAPE can be used to dress static bodies or animated sequences with a learned model of the cloth dynamics. Since the method is fully automated, it is appropriate for dressing large numbers of virtual characters of varying shape. The method is significantly more efficient than physical simulation.
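The "factoring" idea described above can be illustrated with a conceptual sketch. This is not the actual DRAPE formulation; the mesh size, the linear per-vertex bases, and all variable names below are illustrative assumptions. The point is only the structure: the dressed garment is a template mesh plus a shape-driven offset and an independent pose-driven offset, so each effect can be learned and applied separately.

```python
import numpy as np

rng = np.random.default_rng(0)

n_verts = 500   # garment mesh vertices (illustrative size)
n_shape = 10    # body shape coefficients (e.g., a low-dimensional shape space)
n_pose = 30     # pose descriptor dimension (illustrative)

# Hypothetical learned linear bases mapping parameters to per-vertex
# 3D offsets. In a real system these would be fit to physics-simulated
# training data of clothing on many bodies and poses.
template = rng.standard_normal((n_verts, 3))
shape_basis = rng.standard_normal((n_verts, 3, n_shape)) * 0.01
pose_basis = rng.standard_normal((n_verts, 3, n_pose)) * 0.01

def drape_garment(beta, theta):
    """Factored deformation: shape-dependent and pose-dependent
    offsets are computed independently and added to the template."""
    shape_offset = shape_basis @ beta   # garment resized to fit the body
    pose_offset = pose_basis @ theta    # wrinkles driven by the pose
    return template + shape_offset + pose_offset

beta = rng.standard_normal(n_shape)   # body shape parameters
theta = rng.standard_normal(n_pose)   # pose parameters
garment = drape_garment(beta, theta)
print(garment.shape)  # (500, 3)
```

Because the two terms are additive in this sketch, dressing a new body shape in a new pose needs only two matrix-vector products per frame, which is the kind of saving over full physical simulation that the abstract refers to.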

Author(s): Guan, P. and Reiss, L. and Hirshberg, D. and Weiss, A. and Black, M. J.
Journal: ACM Trans. on Graphics (Proc. SIGGRAPH)
Volume: 31
Number (issue): 4
Pages: 35:1--35:10
Year: 2012
Month: July

Department(s): Perceiving Systems
Research Project(s): Clothing Models (2011-2015)
Virtual Humans (2011-2015)
Bibtex Type: Article (article)
Paper Type: Journal

Links: YouTube


@article{Guan:SIGGRAPH:2012,
  title = {{DRAPE: DRessing Any PErson}},
  author = {Guan, P. and Reiss, L. and Hirshberg, D. and Weiss, A. and Black, M. J.},
  journal = {ACM Trans. on Graphics (Proc. SIGGRAPH)},
  volume = {31},
  number = {4},
  pages = {35:1--35:10},
  month = jul,
  year = {2012}
}