Quantum-Enhanced Natural Language Generation: A Multi-Model Framework with Hybrid Quantum-Classical Architectures
By: Chi-Sheng Chen, En-Jui Kuo
Potential Business Impact:
Quantum computers can write text, and sometimes show unique strengths.
This paper presents a comprehensive evaluation of quantum text generation models against traditional Transformer/MLP architectures, addressing the growing interest in quantum computing applications for natural language processing. We conduct systematic experiments comparing a classical Transformer baseline with three quantum-inspired models, the Quantum Kernel Self-Attention Network (QKSAN), Quantum RWKV (QRWKV), and the Quantum Attention Sequence Architecture (QASA), across five diverse datasets: simple sentences, short stories, quantum phrases, haiku poetry, and proverbs. Our evaluation employs multiple metrics, including perplexity, BLEU scores, vocabulary diversity, repetition rates, and fluency measures, to assess different aspects of text generation quality. The experimental results show that while the traditional Transformer maintains overall superiority, with the lowest average perplexity (1.21) and the highest BLEU-1 score (0.2895), quantum-inspired models are competitive in specific scenarios. Notably, QKSAN achieves a competitive BLEU-1 score of 0.2800 while maintaining a zero repetition rate, and QRWKV reaches perfect vocabulary diversity (Distinct-1 = 1.000) on certain tasks.
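For readers who want a concrete sense of the surface-level metrics named in the abstract, the sketch below shows one common way to compute BLEU-1, Distinct-1, a repetition rate, and perplexity for a generated token sequence. This is a minimal illustration under stated assumptions, not the authors' evaluation code; in particular, the exact repetition-rate definition and the per-token negative log-likelihoods fed to the perplexity helper are assumptions here.

```python
# Minimal sketch of common text-generation metrics: BLEU-1 (unigram precision
# with brevity penalty), Distinct-1 (vocabulary diversity), a simple repetition
# rate, and perplexity from per-token negative log-likelihoods.
# Illustrative only; definitions may differ from the paper's evaluation code.
import math
from collections import Counter

def bleu1(candidate: list[str], reference: list[str]) -> float:
    """BLEU-1: clipped unigram precision times a brevity penalty."""
    if not candidate:
        return 0.0
    cand_counts, ref_counts = Counter(candidate), Counter(reference)
    overlap = sum(min(c, ref_counts[w]) for w, c in cand_counts.items())
    precision = overlap / len(candidate)
    bp = 1.0 if len(candidate) > len(reference) else math.exp(1 - len(reference) / len(candidate))
    return bp * precision

def distinct1(tokens: list[str]) -> float:
    """Distinct-1: fraction of unigrams that are unique."""
    return len(set(tokens)) / len(tokens) if tokens else 0.0

def repetition_rate(tokens: list[str]) -> float:
    """Assumed definition: share of tokens that repeat an earlier token."""
    return 1.0 - distinct1(tokens) if tokens else 0.0

def perplexity(nll_per_token: list[float]) -> float:
    """Perplexity = exp(mean negative log-likelihood per token)."""
    return math.exp(sum(nll_per_token) / len(nll_per_token))

# Toy usage with hypothetical outputs and reference text.
cand = "the quantum model writes a short haiku".split()
ref = "the quantum model writes a short poem".split()
print(f"BLEU-1      = {bleu1(cand, ref):.4f}")
print(f"Distinct-1  = {distinct1(cand):.3f}")
print(f"Repetition  = {repetition_rate(cand):.3f}")
print(f"Perplexity  = {perplexity([0.15, 0.22, 0.18, 0.20]):.3f}")
```

A Distinct-1 of 1.000, as reported for QRWKV on some tasks, simply means every generated unigram is unique, which also drives this repetition-rate definition to zero.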
Similar Papers
Quantum Large Language Model Fine-Tuning
Quantum Physics
Makes computers understand words better using quantum power.
Quantum Graph Transformer for NLP Sentiment Classification
Computation and Language
Helps computers understand words better with less data.
Quantum-Enhanced Attention Mechanism in NLP: A Hybrid Classical-Quantum Approach
Computation and Language
Computers understand words better using quantum power.