Dyads: Artist-Centric, AI-Generated Dance Duets
By: Zixuan Wang, Luis Zerkowski, Ilya Vidrin, and more
Potential Business Impact:
AI creates dance partners that move together.
Existing AI-generated dance methods primarily train on motion capture data from solo dance performances, but a critical feature of dance in nearly any genre is the interaction of two or more bodies in space. Moreover, many works at the intersection of AI and dance fail to incorporate the ideas and needs of the artists themselves into their development process, yielding models that produce far more useful insights for the AI community than for the dance community. This work addresses both needs of the field by proposing an AI method to model the complex interactions between pairs of dancers and detailing how the technical methodology can be shaped by ongoing co-creation with the artistic stakeholders who curated the movement data. Our model is a probability-and-attention-based Variational Autoencoder that generates a choreographic partner conditioned on an input dance sequence. We construct a custom loss function to enhance the smoothness and coherence of the generated choreography. Our code is open-source, and we also document strategies for other interdisciplinary research teams to facilitate collaboration and strong communication between artists and technologists.
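The abstract does not specify the form of the custom loss, but a common way to encourage smooth, coherent generated motion is to penalize large frame-to-frame velocities and accelerations via finite differences over joint positions. The sketch below is an illustrative assumption, not the paper's actual loss; the function name `smoothness_loss` and its weights are hypothetical.

```python
import numpy as np

def smoothness_loss(motion, w_vel=1.0, w_acc=1.0):
    """Hypothetical smoothness penalty for a generated motion sequence.

    motion: array of shape (T, J, 3) -- T frames, J joints, 3D positions.
    First differences approximate per-frame velocity; second differences
    approximate acceleration. Penalizing both discourages jitter while
    still allowing steady, deliberate movement.
    """
    vel = np.diff(motion, n=1, axis=0)  # (T-1, J, 3) frame-to-frame velocity
    acc = np.diff(motion, n=2, axis=0)  # (T-2, J, 3) frame-to-frame acceleration
    return w_vel * np.mean(vel ** 2) + w_acc * np.mean(acc ** 2)

rng = np.random.default_rng(0)
still = np.zeros((10, 17, 3))          # a motionless sequence incurs no penalty
jitter = rng.normal(size=(10, 17, 3))  # random joint positions are penalized
print(smoothness_loss(still), smoothness_loss(jitter))
```

In practice a term like this would be added to the VAE's reconstruction and KL objectives, with the weights tuned so that smoothing does not flatten expressive movement.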
Similar Papers
Reimagining Dance: Real-time Music Co-creation between Dancers and AI
Sound
Dancers control music with their moves.
Invisible Strings: Revealing Latent Dancer-to-Dancer Interactions with Graph Neural Networks
CV and Pattern Recognition
Shows how dancers connect through movement.
DanceMeld: Unraveling Dance Phrases with Hierarchical Latent Codes for Music-to-Dance Synthesis
Other Computer Science
Makes computers create realistic dance moves from music.