Soft Contextualized Encoder For User Defined Text Classification
By: Charu Maheshwari, Vyas Raina
Potential Business Impact:
Lets computers classify text into new, user-specified categories without retraining.
User-Defined Text Classification (UDTC) addresses the challenge of classifying input text into user-specified, previously unseen classes, a setting that arises frequently in real-world applications such as enterprise analytics, content moderation, and domain-specific information retrieval. We propose a soft-contextualized encoder architecture for UDTC that contextualizes each candidate label with the full label set and a static soft-prompt representation of the input query. Training on diverse, multi-source datasets enables the model to generalize to zero-shot classification over entirely unseen topic sets drawn from arbitrary domains. We evaluate the proposed architecture both on held-out in-distribution test data and on multiple unseen UDTC benchmarks. Across datasets, the model achieves state-of-the-art performance, consistently matching or outperforming the baselines.
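The scoring idea described above can be sketched in plain Python: each candidate label is mixed with a summary of the whole label set and with a frozen representation of the query before being scored against the query. Everything here is an illustrative assumption, not the paper's implementation: `TOY_VECS` stands in for a pretrained encoder, the query embedding stands in for the static soft prompt, and the mixing weights `alpha`/`beta` are hypothetical.

```python
import math

# Toy word embeddings standing in for a pretrained encoder (values are illustrative).
TOY_VECS = {
    "sports":   [1.0, 0.0, 0.0],
    "game":     [0.9, 0.1, 0.0],
    "match":    [0.8, 0.2, 0.0],
    "finance":  [0.0, 1.0, 0.0],
    "stocks":   [0.1, 0.9, 0.0],
    "politics": [0.0, 0.0, 1.0],
}

def embed(text):
    """Average the toy vectors of known tokens and L2-normalize."""
    vecs = [TOY_VECS[t] for t in text.lower().split() if t in TOY_VECS]
    if not vecs:
        return [0.0, 0.0, 0.0]
    v = [sum(col) / len(vecs) for col in zip(*vecs)]
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v] if n else v

def classify(query, labels, alpha=0.3, beta=0.3):
    """Pick the best label after mixing in label-set and query context."""
    q = embed(query)                     # stands in for the static soft prompt
    lvecs = [embed(l) for l in labels]
    # Label-set context: a summary (mean) of all candidate label embeddings,
    # so each label's score depends on which other labels are offered.
    ctx = [sum(col) / len(lvecs) for col in zip(*lvecs)]
    scores = []
    for lv in lvecs:
        # Contextualize the candidate label with the label set and the query
        # (alpha/beta are hypothetical mixing weights).
        c = [lv[i] + alpha * ctx[i] + beta * q[i] for i in range(3)]
        scores.append(sum(c[i] * q[i] for i in range(3)))
    return labels[max(range(len(labels)), key=scores.__getitem__)]
```

Because the label-set mean enters every score, swapping one candidate label changes the scores of all the others, which is the sense in which each label is "contextualized with the label set" rather than scored in isolation.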