Context-Enhanced Contrastive Search for Improved LLM Text Generation
By: Jaydip Sen, Rohit Pandey, Hetvi Waghela
Potential Business Impact:
Makes computer writing sound more natural and interesting.
Recently, Large Language Models (LLMs) have demonstrated remarkable advances in Natural Language Processing (NLP). However, generating high-quality text that balances coherence, diversity, and relevance remains challenging. Traditional decoding methods, such as beam search and top-k sampling, often produce repetitive or incoherent outputs, particularly in tasks that require long-form text generation. To address these limitations, the paper proposes Context-Enhanced Contrastive Search (CECS), a novel enhancement of the well-known Contrastive Search algorithm with contextual calibration. The proposed scheme introduces several novel components, including dynamic contextual importance weighting, multi-level contrastive search, and adaptive temperature control, to optimize the balance between fluency, creativity, and precision. The performance of CECS is evaluated using several standard metrics, such as BLEU, ROUGE, and semantic similarity. Experimental results demonstrate significant improvements in both the coherence and the relevance of the texts generated by CECS, which outperforms existing Contrastive Search techniques. The proposed algorithm has several potential real-world applications, including legal document drafting, customer service chatbots, and content marketing.
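The abstract does not specify CECS's scoring rule, but a minimal sketch can illustrate the contrastive search idea it builds on: each candidate token is scored by its model confidence minus a degeneration penalty, i.e., its maximum similarity to the tokens already generated. The `ctx_weights` and `temperature` parameters below are hypothetical stand-ins for the paper's dynamic contextual importance weighting and adaptive temperature control; all names and the weighting scheme are assumptions, not the authors' implementation.

```python
import numpy as np

def contrastive_search_step(probs, cand_ids, cand_hidden, ctx_hidden,
                            alpha=0.6, ctx_weights=None, temperature=1.0):
    """One decoding step of contrastive search with optional context weighting.

    probs:       model probabilities of the top-k candidate tokens, shape (k,)
    cand_ids:    token ids of the k candidates, shape (k,)
    cand_hidden: hidden states of the candidates, shape (k, d)
    ctx_hidden:  hidden states of previously generated tokens, shape (t, d)
    ctx_weights: per-position importance weights, shape (t,); hypothetical
                 stand-in for CECS's dynamic contextual importance weighting
    temperature: hypothetical stand-in for adaptive temperature control
    """
    # Temperature-adjust and renormalize the model-confidence term.
    scaled = probs ** (1.0 / temperature)
    scaled = scaled / scaled.sum()

    # Cosine similarity of each candidate to every context token.
    c = cand_hidden / np.linalg.norm(cand_hidden, axis=1, keepdims=True)
    h = ctx_hidden / np.linalg.norm(ctx_hidden, axis=1, keepdims=True)
    sim = c @ h.T                       # shape (k, t)

    # Weight context positions by their (assumed) importance.
    if ctx_weights is not None:
        sim = sim * ctx_weights

    # Degeneration penalty: maximum weighted similarity to the context.
    penalty = sim.max(axis=1)

    # Contrastive score: confidence minus degeneration penalty.
    score = (1 - alpha) * scaled - alpha * penalty
    return cand_ids[int(score.argmax())]

if __name__ == "__main__":
    # Toy usage with random states; in practice these come from an LLM.
    rng = np.random.default_rng(0)
    k, t, d = 5, 8, 16
    next_id = contrastive_search_step(
        probs=rng.dirichlet(np.ones(k)),
        cand_ids=np.arange(k),
        cand_hidden=rng.normal(size=(k, d)),
        ctx_hidden=rng.normal(size=(t, d)),
        ctx_weights=np.linspace(0.5, 1.0, t),  # e.g. favor recent context
        temperature=0.9,
    )
    print(next_id)
```

Raising `alpha` pushes the decoder toward diversity (stronger repetition penalty), while lowering it favors the model's own confidence; CECS's adaptive components presumably tune this trade-off per step rather than fixing it globally.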
Similar Papers
CEC-Zero: Chinese Error Correction Solution Based on LLM
Computation and Language
Teaches computers to fix Chinese text errors on their own.
Extracting and Emulsifying Cultural Explanation to Improve Multilingual Capability of LLMs
Computation and Language
Helps computers understand different cultures for better answers.
Exploiting Contextual Knowledge in LLMs through V-usable Information based Layer Enhancement
Computation and Language
Helps computers remember and use information better.