# DressCode: Autoregressively Sewing and Generating Garments from Text Guidance

This repo is the official implementation for *DressCode: Autoregressively Sewing and Generating Garments from Text Guidance*.

*Kai He, Kaixin Yao, Qixuan Zhang, [Jingyi Yu](http://www.yu-jingyi.com/), Lingjie Liu\*, Lan Xu\**

**SIGGRAPH 2024 (ACM Transactions on Graphics)**
## Abstract

Apparel’s significant role in human appearance underscores the importance of garment digitalization for digital human creation. Recent advances in 3D content creation are pivotal for digital human creation. Nonetheless, garment generation from text guidance is still nascent. We introduce a text-driven 3D garment generation framework, DressCode, which aims to democratize design for novices and offer immense potential in fashion design, virtual try-on, and digital human creation. We first introduce SewingGPT, a GPT-based architecture integrating cross-attention with text-conditioned embedding to generate sewing patterns with text guidance. We then tailor a pre-trained Stable Diffusion to generate tile-based Physically-Based Rendering (PBR) textures for the garments. By leveraging a large language model, our framework generates CG-friendly garments through natural language interaction. It also facilitates pattern completion and texture editing, streamlining the design process through user-friendly interaction. This framework fosters innovation by allowing creators to freely experiment with designs and incorporate unique elements into their work. With comprehensive evaluations and comparisons with other state-of-the-art methods, our method showcases superior quality and alignment with input prompts. User studies further validate our high-quality rendering results, highlighting its practical utility and potential in production settings.
## Installation

```bash
git clone https://github.com/IHe-KaiI/DressCode.git
cd DressCode
conda env create -f environment.yaml
conda activate DressCode
```
- Add `./packages` to `PYTHONPATH` for importing custom modules (e.g. `export PYTHONPATH=$PYTHONPATH:$(pwd)/packages`).
- Update `system.json` with the local paths on your system:
  - `"output"`
  - `"datasets_path"`
  - `"caption_path"`
  - `"human_obj_path"`
  - `"sim_json_path"`
  - `"HDR_path"`
  - `"blender_path"`
  - `"maya_path"`
  - `"dataset_properties_path"`
- Download `stabilityai/stable-diffusion-2-1-base` to `./models--stabilityai--stable-diffusion-2-1-base` for the CLIP embedding module.
- Update `"wandb_username"` in `system.json` if needed.
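For reference, `system.json` could look like the sketch below. Every value is an illustrative placeholder, not a default shipped with the repo; fill in the actual paths on your machine:

```json
{
  "output": "/path/to/inference/output",
  "datasets_path": "/path/to/datasets",
  "caption_path": "/path/to/captions",
  "human_obj_path": "/path/to/human.obj",
  "sim_json_path": "/path/to/sim.json",
  "HDR_path": "/path/to/hdr_map.hdr",
  "blender_path": "/path/to/blender",
  "maya_path": "/path/to/maya",
  "dataset_properties_path": "/path/to/dataset_properties.json",
  "wandb_username": "your-username"
}
```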
## Training

- Train the model with `python nn/train.py -c ./models/train.yaml`.
- Monitor training with `tensorboard --logdir=PATH_TO_RECORD_FOLDER`; our script saves the training records in `./tensorboard` by default.
- When a `run_id` is specified in the input config, our script automatically attempts to resume from the latest checkpoint in that `run_id` folder.
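As a sketch, resuming could be triggered by a config fragment like the one below; only the `run_id` key is described above, and the surrounding structure is an assumption about the config layout, not the actual schema of `./models/train.yaml`:

```yaml
# Hypothetical fragment — only `run_id` is documented; nesting is illustrative.
experiment:
  run_id: <existing-run-id>   # resume from the latest checkpoint in this run's folder
```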
## Testing

- Download our pretrained models:
  - Download our pretrained SewingGPT to `./models`.
  - Download our pretrained PBR texture generator model to `./nn/material_gen` (optional if only testing the SewingGPT).
- Test the SewingGPT with `python nn/evaluation_scripts/predict_class.py -c ./models/infer.yaml`.
### Test with our Gradio-based UI

1. Infer sewing patterns and PBR textures with the pretrained model: run `python nn/UI_chat.py` and input a *Shape Prompt* or a *Shape Prompt/Texture Prompt* pair. For example, the prompt `dress, sleeveless, midi length` produces sewing-pattern results only, while `dress, sleeveless, midi length/green velvet` produces both sewing patterns and PBR textures. Multiple garment prompts are also supported, split by `;` (no space), e.g. `trouser, long length/green velvet;tank top, cropped length/khaki style`.
2. Simulate and render the predicted results (simulation is Windows-only). Update `system.json`; you may need to use the full paths to the inference output folder (`"output"`) and the HDR map (`"HDR_path"`) for Blender rendering. Then run `python nn/UI_chat.py --sim`. The same prompt rules as in Step 1 apply.
3. Use ChatGPT as an LLM interpreter for interactively customized garment generation:
   - Update `"OpenAI_API_Key"` and `"https_proxy"` (if needed) in `system.json`.
   - Test the model using our UI with `python nn/UI_chat.py --sim --GPT`. This time, users can chat with the agent and state their preferences, e.g. *I want to attend a party.*
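The prompt format accepted by the UI can be summarized in a short sketch. The helper below is hypothetical (not part of the repo) and only illustrates how `shape prompt[/texture prompt]` entries joined with `;` decompose into shape and texture parts:

```python
# Hypothetical helper illustrating the UI prompt format:
# "shape prompt[/texture prompt]" entries joined with ";" (no space).
def split_garment_prompts(prompt):
    """Split a multi-garment prompt into (shape, texture) pairs."""
    garments = []
    for entry in prompt.split(";"):
        shape, sep, texture = entry.partition("/")
        # texture is None when no "/texture prompt" part was given
        garments.append((shape, texture if sep else None))
    return garments

print(split_garment_prompts("dress, sleeveless, midi length"))
# → [('dress, sleeveless, midi length', None)]
print(split_garment_prompts(
    "trouser, long length/green velvet;tank top, cropped length/khaki style"))
# → [('trouser, long length', 'green velvet'),
#    ('tank top, cropped length', 'khaki style')]
```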
Users can also load the generated meshes and textures into 3D clothing design software (e.g. Marvelous Designer) for subsequent simulation and animation.
### Sewing pattern visualization

Visualize predicted sewing patterns with `python nn/multiple_patterns_vis.py --folder PATH_TO_FOLDER`. The output will be saved in the same folder as the input.

### Texture editing

After running `nn/UI_chat.py` and generating a garment, copy the UV map of the last generated garment to the second tab of Gradio and edit the texture on the UV map. Our script will render the results with the new texture. Currently, our editing UI only supports processing one garment at a time.

## Acknowledgments

This project is built upon NeuralTailor. Some code for basic operations on sewing patterns is adopted from Sewformer. Our dataset is based on [Korosteleva and Lee 2021]. We thank all the authors for their impressive repos.
## Citation

```
@article{he2024dresscode,
  title={DressCode: Autoregressively Sewing and Generating Garments from Text Guidance},
  author={He, Kai and Yao, Kaixin and Zhang, Qixuan and Yu, Jingyi and Liu, Lingjie and Xu, Lan},
  journal={ACM Transactions on Graphics (TOG)},
  volume={43},
  number={4},
  pages={1--13},
  year={2024},
  publisher={ACM New York, NY, USA}
}
```