Single-Head Lifelong Learning Based on Distilling Knowledge
Issued Date
2022-01-01
Resource Type
eISSN
21693536
Scopus ID
2-s2.0-85125744030
Journal Title
IEEE Access
Volume
10
Start Page
35469
End Page
35478
Rights Holder(s)
SCOPUS
Bibliographic Citation
IEEE Access Vol.10 (2022) , 35469-35478
Suggested Citation
Wang Y.H., Lin C.Y., Thaipisutikul T., Shih T.K. Single-Head Lifelong Learning Based on Distilling Knowledge. IEEE Access Vol.10 (2022), 35469-35478. doi:10.1109/ACCESS.2022.3155451 Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/84403
Abstract
Within the machine learning field, the main purpose of lifelong learning, also known as continuous learning, is to enable neural networks to learn continuously, as humans do. Lifelong learning accumulates the knowledge learned from previous tasks and transfers it to support the neural network in future tasks. This technique not only avoids the catastrophic forgetting of previous tasks when training new tasks, but also makes the model more robust to temporal evolution. Motivated by recent advances in lifelong learning, this paper presents a novel feature-based knowledge distillation method that differs from existing knowledge distillation methods in lifelong learning. Specifically, our proposed method utilizes the features from intermediate layers and compresses them in a unique way that involves global average pooling and fully connected layers. We then use the output of this branch network to deliver information from previous tasks to the model in the future. Extensive experiments show that our proposed model consistently outperforms the state-of-the-art baselines in accuracy by at least two percentage points under different experimental settings.
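The feature-compression branch described in the abstract can be illustrated in outline. The sketch below is a minimal, hypothetical reconstruction (function names, shapes, and the L2 distillation loss are assumptions, not taken from the paper): an intermediate feature map is reduced by global average pooling, projected by a fully connected layer, and the branch outputs of the old and new models are compared to carry knowledge forward.

```python
import numpy as np

def global_avg_pool(feat):
    # feat: (C, H, W) intermediate feature map -> (C,) channel descriptor
    return feat.mean(axis=(1, 2))

def branch_forward(feat, W, b):
    # Compress pooled features with a fully connected layer: (C,) -> (D,)
    return W @ global_avg_pool(feat) + b

def distillation_loss(student_out, teacher_out):
    # L2 distance between branch outputs of the new (student) and
    # previous-task (teacher) models -- a common feature-distillation loss,
    # assumed here for illustration.
    return float(np.mean((student_out - teacher_out) ** 2))

# Example: a 2-channel 4x4 feature map and an identity projection.
feat = np.ones((2, 4, 4))
W, b = np.eye(2), np.zeros(2)
out = branch_forward(feat, W, b)          # -> array([1., 1.])
loss = distillation_loss(out, np.zeros(2))
```

During training on a new task, this loss term would be added to the task loss so the branch output stays close to that of the frozen previous-task model.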