Combining Discrete Wavelet and Cosine Transforms for Efficient Sentence Embedding
By: Rana Salama, Abdou Youssef, Mona Diab
Potential Business Impact:
Helps computers represent words and sentences with smaller, faster vectors without losing accuracy.
Wavelets have emerged as a cutting-edge technology in a number of fields. Concrete results of their application in image and signal processing suggest that wavelets can be effectively applied to Natural Language Processing (NLP) tasks that depend on capturing a variety of linguistic properties. In this paper, we leverage the power of applying Discrete Wavelet Transforms (DWT) to word and sentence embeddings. We first evaluate, intrinsically and extrinsically, how wavelets can effectively consolidate the important information in a word vector while reducing its dimensionality. We then combine DWT with the Discrete Cosine Transform (DCT) to propose a non-parameterized model that compresses a sentence, with its dense information content, into a fixed-size vector based on locally varying word features. We show the efficacy of the proposed paradigm on downstream application models, yielding results comparable to, and in some tasks even superior to, the original embeddings.
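To make the pipeline concrete, the sketch below illustrates one plausible reading of the abstract: a single-level DWT shrinks each word vector by keeping only its approximation coefficients, and a DCT applied along the word axis keeps the first k coefficients to produce a fixed-size, non-parameterized sentence vector. This is a minimal illustration, not the authors' exact method; the wavelet family ("haar"), the number of retained DCT coefficients (k = 4), and the order of the two transforms are assumptions for demonstration only.

import numpy as np
import pywt
from scipy.fft import dct


def dwt_reduce(word_vectors: np.ndarray, wavelet: str = "haar") -> np.ndarray:
    """Apply a single-level DWT to each word vector and keep only the
    approximation coefficients, roughly halving the dimensionality.
    The choice of the Haar wavelet here is an illustrative assumption."""
    approx = [pywt.dwt(vec, wavelet)[0] for vec in word_vectors]
    return np.vstack(approx)


def dct_sentence_embedding(word_vectors: np.ndarray, k: int = 4) -> np.ndarray:
    """Apply a DCT along the word (sequence) axis and keep the first k
    coefficients per feature, yielding a fixed-size sentence vector
    regardless of sentence length (k is an assumed hyperparameter)."""
    n_words, dim = word_vectors.shape
    coeffs = dct(word_vectors, type=2, axis=0, norm="ortho")
    if n_words < k:
        # Pad short sentences with zero coefficients so the output size is constant.
        coeffs = np.vstack([coeffs, np.zeros((k - n_words, dim))])
    return coeffs[:k].ravel()


# Example: a toy "sentence" of 6 words with 300-dimensional embeddings.
rng = np.random.default_rng(0)
sentence = rng.normal(size=(6, 300))

reduced = dwt_reduce(sentence)               # shape (6, 150) after DWT
embedding = dct_sentence_embedding(reduced)  # fixed-size vector of length 4 * 150
print(reduced.shape, embedding.shape)

Because both transforms are fixed linear operations, the resulting sentence encoder has no trainable parameters, which is what makes the approach attractive as a lightweight alternative to learned pooling.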
Similar Papers
Semantic Compression for Word and Sentence Embeddings using Discrete Wavelet Transform
Computation and Language
Makes computer language understanding smaller, faster, better.
Diffusion Transformer meets Multi-level Wavelet Spectrum for Single Image Super-Resolution
CV and Pattern Recognition
Makes blurry pictures sharp and clear.
Efficient Neural Networks with Discrete Cosine Transform Activations
Machine Learning (CS)
Makes computer brains smaller and easier to understand.