Score: 1

Expressive and Scalable Quantum Fusion for Multimodal Learning

Published: October 8, 2025 | arXiv ID: 2510.06938v1

By: Tuyen Nguyen, Trong Nghia Hoang, Phi Le Nguyen, and more

Potential Business Impact:

Enables AI systems to combine information from many data modalities more efficiently, with parameter counts that grow linearly rather than exponentially as modalities are added.

Business Areas:
Quantum Computing, Science and Engineering

The aim of this paper is to introduce a quantum fusion mechanism for multimodal learning and to establish its theoretical and empirical potential. The proposed method, called the Quantum Fusion Layer (QFL), replaces classical fusion schemes with a hybrid quantum-classical procedure that uses parameterized quantum circuits to learn entangled feature interactions without requiring exponential parameter growth. Supported by quantum signal processing principles, the quantum component efficiently represents high-order polynomial interactions across modalities with linear parameter scaling, and we provide a separation example between QFL and low-rank tensor-based methods that highlights potential quantum query advantages. In simulation, QFL consistently outperforms strong classical baselines on small but diverse multimodal tasks, with particularly marked improvements in high-modality regimes. These results suggest that QFL offers a fundamentally new and scalable approach to multimodal fusion that merits deeper exploration on larger systems.
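For intuition, the sketch below shows a generic hybrid quantum-classical fusion step in the spirit described above: each modality's feature is encoded as a rotation angle on a small parameterized quantum circuit, entangling layers couple the modalities, and the trainable parameter count grows linearly with circuit width and depth. This is a minimal illustration using PennyLane, not the paper's QFL circuit; the qubit count, embedding choice, ansatz, and names such as qfl_circuit are assumptions made here for demonstration only.

```python
# Minimal sketch (assumption): a parameterized-quantum-circuit fusion step,
# not the paper's actual QFL architecture.
import pennylane as qml
from pennylane import numpy as np

n_modalities = 3            # toy setting: one scalar feature per modality
n_qubits = n_modalities
n_layers = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qfl_circuit(features, weights):
    # Encode each modality's feature as a single-qubit rotation angle.
    qml.AngleEmbedding(features, wires=range(n_qubits))
    # Entangling layers couple the modalities; the parameter count scales as
    # n_layers * n_qubits * 3, i.e. linearly in the number of modalities.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # Read out one expectation value per qubit as the fused representation.
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weights = np.random.uniform(0, 2 * np.pi,
                            size=(n_layers, n_qubits, 3), requires_grad=True)
features = np.array([0.1, 0.5, -0.3])   # one normalized feature per modality
print(qfl_circuit(features, weights))    # fused features for a classical head
```

In a full hybrid model of this kind, the circuit outputs would feed a classical prediction head and the circuit parameters would be trained jointly with the classical layers by gradient descent.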

Country of Origin
🇦🇺 Australia

Page Count
22 pages

Category
Physics:
Quantum Physics