Enric Corona, Albert Pumarola, Guillem Alenyà, Gerard Pons-Moll, Francesc Moreno-Noguer

Abstract
In this paper we introduce SMPLicit, a novel generative model to jointly represent body pose, shape and clothing geometry. In contrast to existing learning-based approaches that require training specific models for each type of garment, SMPLicit can represent in a unified manner different garment topologies (e.g. from sleeveless tops to hoodies and to open jackets), while controlling other properties like the garment size or tightness/looseness. We show our model to be applicable to a large variety of garments including T-shirts, hoodies, jackets, shorts, pants, skirts, shoes and even hair. The representation flexibility of SMPLicit builds upon an implicit model conditioned on the SMPL human body parameters and a learnable latent space which is semantically interpretable and aligned with the clothing attributes. The proposed model is fully differentiable, allowing for its use in larger end-to-end trainable systems. In the experimental section, we demonstrate SMPLicit can be readily used for fitting 3D scans and for 3D reconstruction in images of dressed people. In both cases we are able to go beyond the state of the art, by retrieving complex garment geometries, handling situations with multiple clothing layers and providing a tool for easy outfit editing. To stimulate further research in this direction, we will make our code and model publicly available at http://www.iri.upc.edu/people/ecorona/smplicit/.
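The core interface the abstract describes — an implicit clothing surface conditioned on SMPL body parameters and a garment latent code — can be sketched as below. This is a toy illustration only, not the authors' network: the tiny random-weight MLP, the input dimensions, and the surface threshold are all placeholder assumptions standing in for the learned SMPLicit model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder weights for a tiny 2-layer MLP. The real SMPLicit network is
# learned from data; these random weights only illustrate the conditioning
# interface: query point + body shape + garment latent code -> distance.
W1 = rng.standard_normal((64, 3 + 10 + 16)) * 0.1  # 3D point, 10 SMPL betas, 16-D latent
b1 = np.zeros(64)
W2 = rng.standard_normal((1, 64)) * 0.1
b2 = np.zeros(1)

def garment_distance(p, beta, z):
    """Unsigned-distance-style query: 3D point p, SMPL shape beta (10,),
    garment latent code z (16,) -> scalar distance to the clothing surface."""
    x = np.concatenate([p, beta, z])
    h = np.maximum(W1 @ x + b1, 0.0)        # ReLU hidden layer
    return float(np.abs(W2 @ h + b2)[0])    # unsigned distance

# Query one point in body space for a neutral shape and a sampled garment code.
p = np.array([0.1, -0.2, 0.05])
beta = np.zeros(10)                 # neutral SMPL shape
z = rng.standard_normal(16) * 0.5   # garment latent code
d = garment_distance(p, beta, z)

# In practice the field is evaluated on a dense 3D grid and points with small
# distance are meshed (e.g. via Marching Cubes) to extract the garment surface.
on_surface = d < 0.01
```

Because every operation here is differentiable, gradients with respect to `beta` and `z` are available, which is what makes the fitting applications (3D scans, image reconstruction) possible in an end-to-end pipeline.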
Benchmarks
| Benchmark | Methodology | Chamfer (cm) | IoU |
|---|---|---|---|
| garment-reconstruction-on-4d-dress | SMPLicit_Upper | 2.452 | 0.617 |
| garment-reconstruction-on-4d-dress | SMPLicit_Lower | 2.101 | 0.698 |
| garment-reconstruction-on-4d-dress | SMPLicit_Outer | 3.359 | 0.618 |
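The two metrics in the table can be computed as sketched below: a symmetric Chamfer distance between point clouds and a voxel-grid IoU. This is a minimal NumPy illustration under one common convention (averaging the two nearest-neighbour directions); the benchmark's exact sampling, units, and grid resolution are not specified here.

```python
import numpy as np

def chamfer_cm(pred, gt):
    """Symmetric Chamfer distance between point clouds pred (N,3) and gt (M,3):
    average of mean nearest-neighbour distances in both directions.
    Assumes coordinates are already in centimetres."""
    # Dense pairwise distances; fine for small clouds, use a KD-tree for scans.
    d = np.linalg.norm(pred[:, None, :] - gt[None, :, :], axis=-1)
    return (float(d.min(axis=1).mean()) + float(d.min(axis=0).mean())) / 2.0

def voxel_iou(a, b):
    """Intersection-over-union between two boolean occupancy grids."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return float(inter) / float(union) if union else 1.0

# Identical clouds have zero Chamfer distance.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
c = chamfer_cm(pts, pts)

# Two 4x4x4 grids overlapping on one slab: intersection 16, union 48.
a = np.zeros((4, 4, 4), dtype=bool); a[:2] = True
b = np.zeros((4, 4, 4), dtype=bool); b[1:3] = True
iou = voxel_iou(a, b)  # 16 / 48
```

Lower Chamfer and higher IoU are better, so in the table above the lower-garment variant reconstructs most accurately on this benchmark.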