QuCoWE: Quantum Contrastive Word Embeddings with Variational Circuits for Near-Term Quantum Devices
By: Rabimba Karanjai, Hemanth Hegadehalli Madhavarao, Lei Xu, and more
Potential Business Impact:
Teaches computers to understand words using quantum power.
We present QuCoWE, a framework that learns quantum-native word embeddings by training shallow, hardware-efficient parameterized quantum circuits (PQCs) with a contrastive skip-gram objective. Words are encoded by data-reuploading circuits with controlled ring entanglement; similarity is computed via quantum state fidelity and passed through a logit-fidelity head that aligns scores with the shifted-PMI scale of SGNS/noise-contrastive estimation. To maintain trainability, we introduce an entanglement-budget regularizer based on single-qubit purity that mitigates barren plateaus. On Text8 and WikiText-2, QuCoWE attains competitive intrinsic (WordSim-353, SimLex-999) and extrinsic (SST-2, TREC-6) performance versus 50-100d classical baselines while using fewer learned parameters per token. All experiments are run in classical simulation; we analyze depolarizing and readout noise and include error-mitigation hooks (zero-noise extrapolation, randomized compiling) to facilitate hardware deployment.
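The abstract only outlines the pipeline, so the following is a minimal NumPy sketch of the main ingredients: a data-reuploading circuit with ring entanglement, fidelity-based similarity, a logit-fidelity score, the SGNS/NCE contrastive loss, and a purity-based entanglement budget. The qubit count, the RY-plus-CNOT ansatz, the affine form of the logit-fidelity head, and the hinge-style budget penalty are all illustrative assumptions, not the paper's exact design; every function name and hyperparameter below is hypothetical.

import numpy as np

N_QUBITS = 4                      # illustrative qubit count, not taken from the paper
DIM = 2 ** N_QUBITS

def _ry(theta):
    # Single-qubit Y-rotation gate.
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

def _apply_single(state, gate, qubit):
    # Apply a 2x2 gate to one qubit of an N-qubit statevector.
    t = state.reshape([2] * N_QUBITS)
    t = np.tensordot(gate, t, axes=([1], [qubit]))
    return np.moveaxis(t, 0, qubit).reshape(DIM)

def _apply_cnot(state, control, target):
    # CNOT: flip the target amplitudes on the control=1 subspace.
    t = state.reshape([2] * N_QUBITS).copy()
    idx = [slice(None)] * N_QUBITS
    idx[control] = 1
    axis = target if target < control else target - 1   # control axis is sliced away
    t[tuple(idx)] = np.flip(t[tuple(idx)], axis=axis).copy()
    return t.reshape(DIM)

def word_state(theta, x, n_layers=2):
    # Data-reuploading PQC: alternate feature rotations, trainable rotations,
    # and a CNOT ring, starting from |0...0>.
    state = np.zeros(DIM, dtype=complex); state[0] = 1.0
    for layer in range(n_layers):
        for q in range(N_QUBITS):
            state = _apply_single(state, _ry(x[q % len(x)]), q)      # re-upload data
            state = _apply_single(state, _ry(theta[layer, q]), q)    # trainable angle
        for q in range(N_QUBITS):                                    # ring entanglement
            state = _apply_cnot(state, q, (q + 1) % N_QUBITS)
    return state

def fidelity(psi, phi):
    # Quantum state fidelity |<psi|phi>|^2 between two pure word states.
    return abs(np.vdot(psi, phi)) ** 2

def logit_fidelity_score(f, a, b, eps=1e-7):
    # Assumed affine "logit-fidelity" head: map fidelity in (0, 1) onto an
    # unbounded score comparable to the shifted-PMI logits used by SGNS/NCE.
    f = np.clip(f, eps, 1.0 - eps)
    return a * np.log(f / (1.0 - f)) + b

def sgns_loss(score_pos, scores_neg):
    # Skip-gram negative-sampling objective for one (word, context) pair.
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    return -np.log(sig(score_pos)) - sum(np.log(sig(-s)) for s in scores_neg)

def entanglement_budget_penalty(psi, budget=0.5):
    # Penalize total single-qubit impurity 1 - Tr(rho_q^2) above a fixed budget;
    # keeping states closer to product form is one way to discourage the
    # over-entanglement associated with barren plateaus.
    t = psi.reshape([2] * N_QUBITS)
    impurity = 0.0
    for q in range(N_QUBITS):
        m = np.moveaxis(t, q, 0).reshape(2, -1)
        rho = m @ m.conj().T                      # reduced density matrix of qubit q
        impurity += 1.0 - np.real(np.trace(rho @ rho))
    return max(0.0, impurity - budget)

# Toy usage: score one positive and one negative context for a word.
rng = np.random.default_rng(0)
theta_w, theta_c, theta_n = (rng.normal(size=(2, N_QUBITS)) for _ in range(3))
x_w, x_c, x_n = (rng.normal(size=N_QUBITS) for _ in range(3))
psi_w, psi_c, psi_n = word_state(theta_w, x_w), word_state(theta_c, x_c), word_state(theta_n, x_n)
s_pos = logit_fidelity_score(fidelity(psi_w, psi_c), a=4.0, b=-1.0)
s_neg = logit_fidelity_score(fidelity(psi_w, psi_n), a=4.0, b=-1.0)
loss = sgns_loss(s_pos, [s_neg]) + 0.1 * entanglement_budget_penalty(psi_w)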
Similar Papers
QCSE: A Pretrained Quantum Context-Sensitive Word Embedding for Natural Language Processing
Computation and Language
Helps computers understand words by their meaning.
Evaluating Parameter-Based Training Performance of Neural Networks and Variational Quantum Circuits
Quantum Physics
Quantum computers learn with fewer parts.
Sentiment Analysis of Financial Text Using Quantum Language Processing (QDisCoCirc)
General Finance
Helps computers understand money news better.