REFA: Real-time Egocentric Facial Animations for Virtual Reality
By: Qiang Zhang, Tong Xiao, Haroun Habeeb, and more
Potential Business Impact:
Lets you control virtual characters' faces easily.
We present a novel system for real-time tracking of facial expressions using egocentric views captured from a set of infrared cameras embedded in a virtual reality (VR) headset. Our technology enables any user to accurately drive the facial expressions of virtual characters in a non-intrusive manner and without the need for a lengthy calibration step. At the core of our system is a distillation-based approach to train a machine learning model on heterogeneous data and labels coming from multiple sources, e.g., synthetic and real images. As part of our dataset, we collected data from 18k diverse subjects using a lightweight capture setup consisting of a mobile phone and a custom VR headset with extra cameras. To process this data, we developed a robust differentiable rendering pipeline that enables us to automatically extract facial expression labels. Our system opens up new avenues for communication and expression in virtual environments, with applications in video conferencing, gaming, entertainment, and remote collaboration.
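The abstract's core technical idea is distillation-style training on heterogeneous sources: synthetic images carry exact expression labels from rendering, while real headset images receive soft labels from a teacher model. Below is a minimal PyTorch sketch of that pattern; the network architecture, the 52-dimensional expression code, and the loss weighting are all illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the paper's code) of distillation on mixed synthetic/real data:
# synthetic images have exact labels; a frozen teacher pseudo-labels real ones.
import torch
import torch.nn as nn

NUM_EXPR = 52  # assumed size of the expression code (e.g., blendshape weights)

class ExprNet(nn.Module):
    """Tiny CNN regressing a single-channel IR crop to an expression code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, NUM_EXPR),
        )

    def forward(self, x):
        return self.net(x)

teacher = ExprNet().eval()   # assume pretrained, e.g., on synthetic data
student = ExprNet()
opt = torch.optim.Adam(student.parameters(), lr=1e-4)
mse = nn.MSELoss()

def training_step(syn_imgs, syn_labels, real_imgs, distill_w=1.0):
    """One step mixing exact synthetic labels with teacher pseudo-labels."""
    with torch.no_grad():
        pseudo = teacher(real_imgs)          # soft labels for real images
    loss = mse(student(syn_imgs), syn_labels) \
         + distill_w * mse(student(real_imgs), pseudo)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Dummy batch showing the shapes involved.
syn = torch.randn(8, 1, 64, 64)
syn_y = torch.rand(8, NUM_EXPR)
real = torch.randn(8, 1, 64, 64)
print(training_step(syn, syn_y, real))
```

In this framing, the distillation weight trades off fidelity to the synthetic ground truth against agreement with the teacher on real images; the paper's actual label sources, losses, and model are not specified beyond the abstract.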
Similar Papers
Mind-to-Face: Neural-Driven Photorealistic Avatar Synthesis via EEG Decoding
CV and Pattern Recognition
Reads your thoughts to make a face move.
Deep Learning-Based Real-Time Sequential Facial Expression Analysis Using Geometric Features
CV and Pattern Recognition
Lets computers understand your feelings from your face.
EgoReAct: Egocentric Video-Driven 3D Human Reaction Generation
CV and Pattern Recognition
Creates 3D human reactions from first-person videos.