DualFit: A Two-Stage Virtual Try-On via Warping and Synthesis
By: Minh Tran, Johnmark Clements, Annie Prasanna, and more
Potential Business Impact:
Lets you try on clothes online, keeping logos clear.
Virtual Try-On (VTON) technology has garnered significant attention for its potential to transform the online fashion retail experience by allowing users to visualize how garments would look on them without physical trials. While recent advances in diffusion-based, warping-free methods have improved perceptual quality, they often fail to preserve fine-grained garment details such as logos and printed text, which are critical for brand integrity and customer trust. In this work, we propose DualFit, a hybrid VTON pipeline that addresses this limitation through a two-stage approach. In the first stage, DualFit warps the target garment to align with the person image using a learned flow field, ensuring high-fidelity preservation of garment details. In the second stage, a fidelity-preserving try-on module synthesizes the final output by blending the warped garment with preserved human regions. To guide this process, we introduce a preserved-region input and an inpainting mask, enabling the model to retain key areas and regenerate only where necessary, particularly around garment seams. Extensive qualitative results show that DualFit achieves visually seamless try-on results while faithfully maintaining high-frequency garment details, striking an effective balance between reconstruction accuracy and perceptual realism.
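To make the two stages concrete, below is a minimal sketch in PyTorch, not the authors' released code. It assumes the warping network outputs a dense per-pixel flow field in pixel units, and it makes the inpainting-mask semantics explicit with a simple compositing step; the names `warp_garment` and `blend_tryon`, and the flow convention, are illustrative assumptions.

```python
# Hypothetical sketch of the two DualFit stages. Assumes `flow` is a dense
# (x, y) offset field in pixels, predicted by a learned warping network.
import torch
import torch.nn.functional as F

def warp_garment(garment: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Stage 1: warp the garment image with a learned flow field.

    garment: (B, C, H, W) garment image
    flow:    (B, 2, H, W) per-pixel (x, y) offsets in pixels (assumed convention)
    """
    b, _, h, w = garment.shape
    # Build a base sampling grid in normalized [-1, 1] coordinates.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h, device=garment.device),
        torch.linspace(-1, 1, w, device=garment.device),
        indexing="ij",
    )
    base = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(b, -1, -1, -1)
    # Convert pixel offsets to normalized offsets and displace the grid.
    norm_flow = torch.stack(
        (flow[:, 0] * 2.0 / max(w - 1, 1), flow[:, 1] * 2.0 / max(h - 1, 1)),
        dim=-1,
    )
    return F.grid_sample(garment, base + norm_flow, align_corners=True)

def blend_tryon(generated, preserved, inpaint_mask):
    """Stage 2: keep preserved human regions and regenerate only the masked
    areas (e.g., around garment seams). inpaint_mask is 1 where synthesis
    applies, 0 where the preserved-region input is kept."""
    return inpaint_mask * generated + (1.0 - inpaint_mask) * preserved

# Example usage with random tensors (shapes only; real inputs come from
# the warping and synthesis networks):
g = torch.rand(1, 3, 256, 192)    # garment image
f = torch.zeros(1, 2, 256, 192)   # zero flow -> identity warp
warped = warp_garment(g, f)       # equals g up to interpolation
```

In the paper's pipeline the try-on module itself regenerates the masked regions; the explicit compositing line above is only meant to show how the preserved-region input and inpainting mask partition the output between retained and resynthesized pixels.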
Similar Papers
UniFit: Towards Universal Virtual Try-on with MLLM-Guided Semantic Alignment
CV and Pattern Recognition
Lets you try on clothes virtually with text.
DS-VTON: High-Quality Virtual Try-on via Disentangled Dual-Scale Generation
CV and Pattern Recognition
Lets you try on clothes virtually, perfectly.
Two-Way Garment Transfer: Unified Diffusion Framework for Dressing and Undressing Synthesis
CV and Pattern Recognition
Lets you take clothes off virtual people.