(a-d) The target clothes, (e) the target body, (f-i) the results of draping each garment individually, (j) the result of draping the multi-layered garments in the order (f) to (i), and (k) the result of draping the clothes in a different pose.
•
Citation
Dohae Lee, Hyun Kang, and In-Kwon Lee, "ClothCombo: Modeling Inter-Cloth Interaction for Draping Multi-Layered Clothes", ACM Transactions on Graphics 42(6), pp. 1-13, presented at SIGGRAPH Asia 2023.
•
Abstract
We present ClothCombo, a pipeline to drape arbitrary combinations of clothes on 3D human models with varying body shapes and poses. While existing learning-based approaches for draping clothes have shown promising results, multi-layered clothing remains challenging, as it is non-trivial to model inter-cloth interaction. To this end, our method utilizes a GNN-based network to efficiently model the interaction between clothes in different layers, thus enabling multi-layered clothing. Specifically, we first create a feature embedding for each cloth using a topology-agnostic network. Then, the draping network deforms all clothes to fit the target body shape and pose without considering inter-cloth interaction. Lastly, the untangling network predicts per-vertex displacements that resolve interpenetration between clothes. In experiments, the proposed model demonstrates strong performance in complex multi-layered scenarios. Being agnostic to cloth topology, our method can be readily used for layered virtual try-on of real clothes in diverse poses and combinations of clothes.
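Concretely, the abstract describes three networks composed in sequence: a topology-agnostic cloth encoder, a per-garment draping network, and an inter-layer untangling network. The PyTorch sketch below illustrates only that high-level structure; the class names, feature dimensions, and the simple mean-pooled inter-layer message are illustrative assumptions, not the paper's actual architecture (the paper's untangling network is a GNN over inter-cloth interactions).

# Hypothetical sketch of the three-stage pipeline; all names and
# dimensions are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn

class ClothEncoder(nn.Module):
    """Topology-agnostic embedding: a shared per-vertex MLP followed by
    max-pooling, so the output ignores vertex count and ordering."""
    def __init__(self, in_dim=3, emb_dim=64):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                 nn.Linear(128, emb_dim))

    def forward(self, verts):                 # verts: (V, 3)
        return self.mlp(verts).max(dim=0).values  # (emb_dim,)

class DrapingNet(nn.Module):
    """Deforms one garment to the target body shape and pose,
    ignoring the other layers (per-garment draping)."""
    def __init__(self, emb_dim=64, body_dim=82):  # assumed shape+pose code size
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(3 + emb_dim + body_dim, 128),
                                 nn.ReLU(), nn.Linear(128, 3))

    def forward(self, verts, emb, body):      # (V,3), (emb_dim,), (body_dim,)
        cond = torch.cat([emb, body]).expand(verts.shape[0], -1)
        return verts + self.mlp(torch.cat([verts, cond], dim=-1))

class UntanglingNet(nn.Module):
    """Predicts per-vertex displacements that resolve interpenetration.
    Here, one naive message-passing step: each garment's vertices receive
    the pooled state of the layer directly beneath it."""
    def __init__(self):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(6, 128), nn.ReLU(),
                                 nn.Linear(128, 3))

    def forward(self, layers):                # list of (V_i, 3), inner to outer
        out = [layers[0]]                     # innermost layer stays as draped
        for verts in layers[1:]:
            msg = out[-1].mean(dim=0).expand(verts.shape[0], -1)
            out.append(verts + self.mlp(torch.cat([verts, msg], dim=-1)))
        return out

# Usage: drape two garments independently, then untangle the stack.
enc, drape, untangle = ClothEncoder(), DrapingNet(), UntanglingNet()
body = torch.randn(82)                        # assumed body shape+pose code
garments = [torch.randn(500, 3), torch.randn(700, 3)]
draped = [drape(g, enc(g), body) for g in garments]
layered = untangle(draped)                    # list of untangled layer meshes

Processing layers from innermost to outermost mirrors the ordered draping shown in panels (f) to (i) of the figure above; the real untangling step would replace the mean-pooled message with learned graph edges between nearby vertices of adjacent layers.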
•
Videos