Order Matters: 3D Shape Generation from Sequential VR Sketches
By: Yizi Chen, Sidi Wu, Tianyi Xiao, and more
Potential Business Impact:
Turns 3D sketches into 3D shape models faster.
VR sketching lets users explore and iterate on ideas directly in 3D, offering a faster and more intuitive alternative to conventional CAD tools. However, existing sketch-to-shape models ignore the temporal ordering of strokes, discarding crucial cues about structure and design intent. We introduce VRSketch2Shape, the first framework and multi-category dataset for generating 3D shapes from sequential VR sketches. Our contributions are threefold: (i) an automated pipeline that generates sequential VR sketches from arbitrary shapes, (ii) a dataset of over 20k synthetic and 900 hand-drawn sketch-shape pairs across four categories, and (iii) an order-aware sketch encoder coupled with a diffusion-based 3D generator. Our approach yields higher geometric fidelity than prior work, generalizes effectively from synthetic to real sketches with minimal supervision, and performs well even on partial sketches. All data and models will be released open-source at https://chenyizi086.github.io/VRSketch2Shape_website.
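The key idea above is that stroke order carries information a set-based encoder would discard. As a hedged illustration (not the paper's actual architecture), the sketch below shows the standard Transformer-style sinusoidal positional encoding added to per-stroke features, so that the same strokes drawn in a different order produce a different representation; the function names and the 4-dimensional toy features are illustrative assumptions.

```python
import numpy as np

def positional_encoding(num_strokes: int, dim: int) -> np.ndarray:
    # Sinusoidal positional encoding (as in "Attention Is All You Need"):
    # even feature indices get sin, odd indices get cos, with
    # frequencies decreasing geometrically across the feature axis.
    pos = np.arange(num_strokes)[:, None]          # (num_strokes, 1)
    i = np.arange(dim)[None, :]                    # (1, dim)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / dim)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def encode_sketch(stroke_feats: np.ndarray) -> np.ndarray:
    # stroke_feats: (num_strokes, dim) per-stroke features
    # (e.g. pooled point coordinates). Adding the positional
    # encoding makes the result depend on drawing order.
    n, d = stroke_feats.shape
    return stroke_feats + positional_encoding(n, d)

# Two strokes drawn in opposite orders yield different encodings,
# which is exactly the temporal cue an order-agnostic model loses.
feats = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])
forward = encode_sketch(feats)
reversed_order = encode_sketch(feats[::-1])
print(np.allclose(forward[::-1], reversed_order))  # False: order matters
```

A set-pooling encoder (e.g. mean over strokes, no positional term) would map both orderings to the same representation, which is why the abstract emphasizes order-aware encoding.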
Similar Papers
ShapeGen: Towards High-Quality 3D Shape Synthesis
CV and Pattern Recognition
Creates detailed 3D objects from pictures.
Drawing2CAD: Sequence-to-Sequence Learning for CAD Generation from Vector Drawings
CV and Pattern Recognition
Turns 2D drawings into 3D computer models.
Sketch2Symm: Symmetry-Aware Sketch-to-Shape Generation via Semantic Bridging
CV and Pattern Recognition
Turns simple drawings into 3D objects.