The Efficiency of Pre-training with Objective Masking in Pseudo Labeling for Semi-Supervised Text Classification
By: Arezoo Hatefi, Xuan-Son Vu, Monowar Bhuyan, and more
We extend and study a semi-supervised model for text classification proposed earlier by Hatefi et al. for classification tasks in which document classes are described by a small number of gold-labeled examples, while the majority of training examples are unlabeled. The model leverages the teacher-student architecture of Meta Pseudo Labels, in which a "teacher" generates labels for originally unlabeled training data to train the "student" and iteratively updates its own model based on the student's performance on the gold-labeled portion of the data. We extend the original model of Hatefi et al. with an unsupervised pre-training phase based on objective masking, and conduct in-depth performance evaluations of the original model, our extension, and various independent baselines. Experiments are performed on three different datasets in two languages (English and Swedish).
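To make the teacher-student interaction concrete, the sketch below outlines one Meta Pseudo Labels update step in PyTorch. It is a minimal first-order approximation of the MPL idea (the teacher is rewarded by the change in the student's loss on the gold-labeled batch), not the authors' implementation; the function name mpl_step, the use of hard pseudo labels, and all hyperparameters are illustrative assumptions.

```python
# A minimal sketch of one Meta Pseudo Labels teacher-student update (assumed,
# first-order variant), not the paper's exact training procedure.
import torch
import torch.nn.functional as F


def mpl_step(teacher, student, opt_t, opt_s, unlabeled_x, labeled_x, labeled_y):
    """The teacher pseudo-labels an unlabeled batch, the student trains on those
    pseudo labels, and the teacher is updated from the resulting change in the
    student's loss on the gold-labeled batch."""
    # Student loss on gold labels *before* the pseudo-label update.
    with torch.no_grad():
        loss_before = F.cross_entropy(student(labeled_x), labeled_y)

    # Teacher generates (hard) pseudo labels for the unlabeled batch.
    pseudo_y = teacher(unlabeled_x).argmax(dim=-1)

    # Student step on the pseudo-labeled data.
    student_loss = F.cross_entropy(student(unlabeled_x), pseudo_y)
    opt_s.zero_grad()
    student_loss.backward()
    opt_s.step()

    # Student loss on gold labels *after* the update; the improvement serves as
    # a reward signal for the teacher.
    with torch.no_grad():
        loss_after = F.cross_entropy(student(labeled_x), labeled_y)
    reward = (loss_before - loss_after).item()

    # Teacher step: reinforce its pseudo labels in proportion to the reward.
    teacher_loss = reward * F.cross_entropy(teacher(unlabeled_x), pseudo_y)
    opt_t.zero_grad()
    teacher_loss.backward()
    opt_t.step()
    return student_loss.item(), teacher_loss.item()
```

In the extension studied here, both networks would additionally start from an unsupervised pre-training phase with a masking objective (masked tokens are predicted from context) before this loop begins; that phase is standard masked-language-model pre-training and is omitted from the sketch.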