Surprising Effectiveness of Random Feature Embeddings in eXtreme Classification
ISSN
10514651
Date Issued
2022-01-01
Author(s)
Verma, Yashaswi
DOI
10.1109/ICPR56361.2022.9956663
Abstract
The goal of eXtreme Multi-label Learning (XML) is to automatically annotate a given data point with the most relevant subset of labels from an extremely large vocabulary of labels (e.g., a million labels). Lately, many attempts have been made to address this problem that achieve reasonable performance on benchmark datasets. In this paper, rather than coming up with an altogether new method, our objective is to present and validate a simple baseline for this task. Specifically, we investigate a global, structure-preserving feature embedding technique based on random projections, whose learning phase is independent of both the training samples and the label vocabulary. Further, we show how an ensemble of multiple such learners can be used to achieve a further boost in prediction accuracy with only a linear increase in training and prediction time. Experiments on three public XML benchmarks show that the proposed approach obtains competitive accuracy compared with many existing methods. Additionally, it provides around a 6572× speed-up in training time and around a 14.7× reduction in model size compared to the closest competitors on the largest public dataset. We have also shared our code for reproducibility.
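The data-independent embedding described in the abstract can be illustrated with a Gaussian random projection. The sketch below is a minimal, hypothetical rendering of the idea (function names, dimensions, and the ensemble-averaging step are illustrative, not the paper's exact implementation): the projection matrix is drawn once from a fixed distribution, so no training data or label information is needed to "learn" the embedding, and an ensemble is obtained simply by using several independent seeds.

```python
import numpy as np

def random_projection(X, d_out, seed=0):
    """Embed rows of X into d_out dimensions via a Gaussian random matrix.

    The 1/sqrt(d_out) scaling approximately preserves pairwise distances
    (Johnson-Lindenstrauss style), making the embedding structure-preserving
    while remaining independent of training samples and labels.
    """
    rng = np.random.default_rng(seed)
    d_in = X.shape[1]
    R = rng.normal(0.0, 1.0 / np.sqrt(d_out), size=(d_in, d_out))
    return X @ R

# Toy usage: embed 5 points from 1000-d into 64-d.
X = np.random.default_rng(42).normal(size=(5, 1000))
Z = random_projection(X, d_out=64)

# An ensemble of such embeddings costs only linearly more: one projection
# (and downstream learner) per seed, with predictions averaged at the end.
ensemble = [random_projection(X, d_out=64, seed=s) for s in range(3)]
```

Because each member of the ensemble differs only in its random seed, both training and prediction time grow linearly with the ensemble size, matching the trade-off stated in the abstract.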