FastFace: Tuning Identity Preservation in Distilled Diffusion via Guidance and Attention

Published: May 27, 2025 | arXiv ID: 2505.21144v2

By: Sergey Karpukhin, Vadim Titov, Andrey Kuznetsov, and more

Potential Business Impact:

Makes AI art generators create faces faster.

Business Areas:
Facial Recognition Data and Analytics, Software

In recent years, a plethora of identity-preserving adapters for personalized generation with diffusion models has been released. Their main disadvantage is that they are predominantly trained jointly with base diffusion models, which suffer from slow multi-step inference. This work tackles the challenge of training-free adaptation of pretrained ID-adapters to diffusion models accelerated via distillation. Through a careful redesign of classifier-free guidance for few-step stylistic generation and of attention manipulation mechanisms in decoupled blocks to improve identity similarity and fidelity, we propose the universal FastFace framework. Additionally, we develop a disentangled public evaluation protocol for ID-preserving adapters.
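The abstract mentions redesigning classifier-free guidance (CFG) when pairing ID-adapters with distilled few-step models. As a minimal sketch of the general idea only (not the paper's actual formulation), standard CFG blends an unconditional and a conditional noise prediction, and an adapter can contribute an extra guided direction; the function name `cfg_combine` and the weights `w_text`/`w_id` are hypothetical illustration:

```python
import numpy as np

def cfg_combine(eps_uncond, eps_cond, eps_id, w_text=5.0, w_id=1.0):
    """Toy classifier-free guidance with an extra identity term.

    Standard CFG:  eps = eps_uncond + w_text * (eps_cond - eps_uncond)
    Here an ID-conditioned prediction adds one more guided direction.
    All inputs stand in for denoiser (e.g. U-Net) noise predictions.
    """
    return (eps_uncond
            + w_text * (eps_cond - eps_uncond)
            + w_id * (eps_id - eps_cond))

# Toy stand-ins for the three noise predictions.
rng = np.random.default_rng(0)
shape = (4, 4)
eps_u = rng.standard_normal(shape)
eps_c = rng.standard_normal(shape)
eps_i = rng.standard_normal(shape)

guided = cfg_combine(eps_u, eps_c, eps_i, w_text=5.0, w_id=0.8)
print(guided.shape)
```

With `w_text=1.0` and `w_id=0.0` the combination reduces exactly to the conditional prediction, which is a quick sanity check that the guidance weights interpolate as intended; in few-step distilled models the practical challenge is that large guidance weights, tolerable over many denoising steps, can badly over-saturate a 1-4 step sampler.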

Page Count
23 pages

Category
Computer Science:
CV and Pattern Recognition