Score: 1

The Efficiency of Pre-training with Objective Masking in Pseudo Labeling for Semi-Supervised Text Classification

Published: May 10, 2025 | arXiv ID: 2505.06624v1

By: Arezoo Hatefi, Xuan-Son Vu, Monowar Bhuyan and more

Potential Business Impact:

Enables text classifiers to learn from far fewer labeled examples.

Business Areas:
Semantic Search, Internet Services

We extend and study a semi-supervised model for text classification proposed earlier by Hatefi et al. for classification tasks in which document classes are described by a small number of gold-labeled examples, while the majority of training examples are unlabeled. The model leverages the teacher-student architecture of Meta Pseudo Labels, in which a "teacher" generates labels for originally unlabeled training data to train the "student" and iteratively updates its own model based on the student's performance on the gold-labeled portion of the data. We extend the original model of Hatefi et al. with an unsupervised pre-training phase based on objective masking, and conduct in-depth performance evaluations of the original model, our extension, and various independent baselines. Experiments are performed on three datasets in two languages (English and Swedish).
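The abstract describes the training architecture only at a high level. As a rough illustration of the Meta Pseudo Labels teacher-student loop, the following PyTorch sketch uses toy linear classifiers in place of the paper's text encoders; the function mpl_step, the first-order feedback term h, and all hyperparameters are illustrative assumptions rather than the authors' implementation, and the masking-based unsupervised pre-training phase that the paper adds would precede this loop and is omitted here.

import torch
import torch.nn.functional as F

# Toy classifiers standing in for the transformer-based teacher and student
# encoders used in the paper (illustrative only).
def make_model(n_features=32, n_classes=4):
    return torch.nn.Sequential(
        torch.nn.Linear(n_features, 64),
        torch.nn.ReLU(),
        torch.nn.Linear(64, n_classes),
    )

teacher, student = make_model(), make_model()
t_opt = torch.optim.Adam(teacher.parameters(), lr=1e-3)
s_opt = torch.optim.Adam(student.parameters(), lr=1e-3)

def mpl_step(x_lab, y_lab, x_unlab):
    """One teacher-student update in the spirit of Meta Pseudo Labels."""
    # 1. Teacher pseudo-labels the unlabeled batch.
    t_logits_u = teacher(x_unlab)
    pseudo = t_logits_u.argmax(dim=-1)

    # 2. Student's loss on the gold-labeled batch before its update.
    with torch.no_grad():
        s_loss_before = F.cross_entropy(student(x_lab), y_lab)

    # 3. Student trains on the pseudo-labeled batch.
    s_loss_pseudo = F.cross_entropy(student(x_unlab), pseudo)
    s_opt.zero_grad()
    s_loss_pseudo.backward()
    s_opt.step()

    # 4. Student's gold-labeled loss after the update; the improvement is
    #    the feedback signal telling the teacher how useful its labels were.
    with torch.no_grad():
        s_loss_after = F.cross_entropy(student(x_lab), y_lab)
    h = (s_loss_before - s_loss_after).item()  # > 0 if the student improved

    # 5. Teacher update: its own supervised loss plus the feedback-weighted
    #    pseudo-label loss (a simple first-order approximation of MPL).
    t_loss = F.cross_entropy(teacher(x_lab), y_lab) \
             + h * F.cross_entropy(t_logits_u, pseudo)
    t_opt.zero_grad()
    t_loss.backward()
    t_opt.step()

# Example usage with random tensors in place of encoded documents:
#   x_l, y_l = torch.randn(8, 32), torch.randint(0, 4, (8,))
#   x_u = torch.randn(16, 32)
#   mpl_step(x_l, y_l, x_u)

The key design point reflected here is the feedback direction: the teacher is rewarded (its pseudo-labels are reinforced) only when the student's performance on the small gold-labeled set improves, which is what distinguishes Meta Pseudo Labels from ordinary self-training.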

Country of Origin
πŸ‡ΈπŸ‡ͺ Sweden


Page Count
41 pages

Category
Computer Science:
Computation and Language