Beyond Isolated Words: Diffusion Brush for Handwritten Text-Line Generation
By: Gang Dai, Yifan Zhang, Yutao Qin, and more
Potential Business Impact:
Makes computers write handwriting that looks real.
Existing handwritten text generation methods primarily focus on isolated words. However, realistic handwritten text demands attention not only to individual words but also to the relationships between them, such as vertical alignment and horizontal spacing, so generating entire text lines emerges as a more promising and comprehensive task. This task, however, poses significant challenges, including accurately modeling complex style patterns that span both intra- and inter-word relationships, and maintaining content accuracy across numerous characters. To address these challenges, we propose DiffBrush, a novel diffusion-based model for handwritten text-line generation. Unlike existing methods, DiffBrush excels in both style imitation and content accuracy through two key strategies: (1) content-decoupled style learning, which disentangles style from content via column- and row-wise masking to better capture intra-word and inter-word style patterns; and (2) multi-scale content learning, which employs line and word discriminators to ensure global coherence and local accuracy of the textual content. Extensive experiments show that DiffBrush excels at generating high-quality text lines, particularly in style reproduction and content preservation. Code is available at https://github.com/dailenson/DiffBrush.
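The abstract's content-decoupled style learning hinges on column- and row-wise masking of the style references. As a rough illustration only (the function name, tensor layout, and masking ratios below are assumptions, not the authors' implementation; see the released code for the actual method), a PyTorch sketch of such masking might look like this:

```python
import torch

def column_row_mask(style_img: torch.Tensor,
                    col_mask_ratio: float = 0.5,
                    row_mask_ratio: float = 0.5):
    """Hypothetical sketch: produce column-masked and row-masked views of a
    style reference, so a style encoder cannot rely on character content.

    style_img: (B, C, H, W) batch of handwritten text-line images.
    Returns a (column-masked, row-masked) pair of tensors.
    """
    b, c, h, w = style_img.shape
    # Column-wise mask: keep each pixel column with probability (1 - col_mask_ratio)
    col_keep = (torch.rand(b, 1, 1, w, device=style_img.device) > col_mask_ratio).float()
    # Row-wise mask: keep each pixel row with probability (1 - row_mask_ratio)
    row_keep = (torch.rand(b, 1, h, 1, device=style_img.device) > row_mask_ratio).float()
    return style_img * col_keep, style_img * row_keep

# Hypothetical usage: two content-suppressed views of the same style lines
imgs = torch.rand(4, 1, 64, 512)  # batch of grayscale text-line crops
col_view, row_view = column_row_mask(imgs)
```

The intuition suggested by the abstract is that striping out whole columns or rows removes most character-identity cues while slant, stroke width, and word spacing remain inferable, pushing the style encoder to capture intra- and inter-word style rather than content.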
Similar Papers
DiffBrush: Just Painting the Art by Your Hands
CV and Pattern Recognition
Draw pictures with AI, just by sketching.
Layout Stroke Imitation: A Layout Guided Handwriting Stroke Generation for Style Imitation with Diffusion Model
CV and Pattern Recognition
Makes computers write handwriting like a person.
DesignDiffusion: High-Quality Text-to-Design Image Generation with Diffusion Models
CV and Pattern Recognition
Creates pictures from words for designs.