Leveraging joint incremental learning objective with data ensemble for class incremental learning
ISSN
0893-6080
Date Issued
2023-04-01
Author(s)
Mazumder, Pratik
Karim, Mohammed Asad
Joshi, Indu
Singh, Pravendra
DOI
10.1016/j.neunet.2023.01.017
Abstract
A class-incremental learning problem is characterized by training data becoming available in a phase-by-phase manner. Deep learning models suffer from catastrophic forgetting of the classes in the older phases as they are trained on the classes introduced in the new phase. In this work, we show that a change in the orientation of an image has a considerable effect on the model's prediction accuracy, which in turn reveals that different orientations of the same image undergo different rates of catastrophic forgetting, a novel finding. Based on this, we propose a data-ensemble approach that combines the predictions for the different orientations of an image to help the model retain information about the previously seen classes and thereby reduce the rate of forgetting in the model predictions. However, the data-ensemble approach cannot be used directly if the model is trained with traditional techniques. Therefore, we also propose a novel training approach using a joint-incremental learning objective (JILO) that jointly trains the network with two incremental learning objectives, i.e., the class-incremental learning objective and our proposed data-incremental learning objective. We empirically demonstrate that JILO is vital to the data-ensemble approach. We apply our proposed approach to state-of-the-art class-incremental learning methods and empirically show that it significantly improves their performance. In particular, our approach improves the performance of the state-of-the-art method (AANets) on the CIFAR-100 dataset by absolute margins of 3.30%, 4.28%, 3.55%, and 4.03% for the number of phases P = 50, 25, 10, and 5, respectively, which establishes the efficacy of the proposed work.
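To make the data-ensemble idea above concrete, the following is a minimal PyTorch sketch that averages a model's softmax predictions over the four 90-degree rotations of each test image. The orientation set (90-degree rotations), the averaging rule, and the names model and images are assumptions chosen for illustration; the abstract does not specify the paper's exact orientations or combination scheme, and the paper's method additionally relies on JILO training, which is not shown here.

import torch
import torch.nn.functional as F

def data_ensemble_predict(model, images):
    """Average class predictions over four orientations of each image.

    images: a batch of shape (N, C, H, W).
    Returns averaged class probabilities of shape (N, num_classes).
    """
    model.eval()
    probs = None
    with torch.no_grad():
        for k in range(4):  # 0, 90, 180, and 270 degree rotations
            rotated = torch.rot90(images, k, dims=[2, 3])  # rotate in the spatial plane
            logits = model(rotated)
            p = F.softmax(logits, dim=1)
            probs = p if probs is None else probs + p
    return probs / 4  # mean probability over the four orientations

In this sketch, averaging in probability space (after softmax) is one reasonable choice; averaging logits before the softmax is an equally plausible alternative, and the abstract does not indicate which the authors use.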