Curriculum Contrastive Learning for Aspect-based Sentiment Analysis
Issued Date
2025-01-01
Resource Type
ISSN
1520-6149
Scopus ID
2-s2.0-105009597576
Journal Title
ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Rights Holder(s)
SCOPUS
Bibliographic Citation
ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings (2025)
Suggested Citation
Jian Z., Wu D., Zeng X., Yao J., Wang M., Wu Q. Curriculum Contrastive Learning for Aspect-based Sentiment Analysis. ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings (2025). doi:10.1109/ICASSP49660.2025.10890648. Retrieved from: https://repository.li.mahidol.ac.th/handle/123456789/111158
Title
Curriculum Contrastive Learning for Aspect-based Sentiment Analysis
Author's Affiliation
Corresponding Author(s)
Other Contributor(s)
Abstract
Pre-trained Language Models (PLMs) have achieved remarkable performance across a wide range of Natural Language Processing (NLP) tasks, including Aspect-based Sentiment Analysis (ABSA). Accordingly, numerous PLM-based ABSA models have been proposed, primarily focusing on module design to exploit the inherent connections between aspects and contexts. However, the core factor driving these performance gains, the PLM's powerful semantic understanding capability, has not been fully considered, raising the question of how to further unlock its potential for downstream tasks. To this end, we introduce a novel training strategy, called CCL, which integrates the strengths of Curriculum Learning (CurL) and Contrastive Learning (ConL) to facilitate the learning of robust feature representations. For the ABSA task, we use aspect similarities to devise the CurL strategy, grouping samples with similar aspects into batches. This provides ConL with related samples within each batch, allowing it to learn more robust representations. The superiority of CCL is demonstrated through extensive experiments on two public ABSA datasets, and ablation studies validate the effectiveness of combining CurL and ConL in enhancing aspect understanding.
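The abstract's core mechanism, grouping samples into batches by aspect similarity so that a contrastive objective sees related samples together, can be illustrated with a minimal sketch. This is NOT the authors' implementation: the greedy nearest-neighbour batching, the use of a supervised contrastive loss (Khosla et al.), and all function names (`similarity_batches`, `supcon_loss`) are illustrative assumptions made here, not details taken from the paper.

```python
import numpy as np

def cosine_sim(a, b):
    """Pairwise cosine similarity between rows of a and rows of b."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

def similarity_batches(aspect_emb, batch_size):
    """Greedily group samples whose aspect embeddings are most similar.

    Each batch is seeded with an unassigned sample and filled with its
    nearest neighbours in aspect-embedding space (hypothetical CurL step).
    """
    sim = cosine_sim(aspect_emb, aspect_emb)
    unassigned = set(range(len(aspect_emb)))
    batches = []
    while unassigned:
        seed = unassigned.pop()
        # rank the remaining samples by similarity to the seed's aspect
        ranked = sorted(unassigned, key=lambda j: -sim[seed, j])
        batch = [seed] + ranked[:batch_size - 1]
        unassigned -= set(batch)
        batches.append(batch)
    return batches

def supcon_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over one batch (ConL step).

    Samples sharing a label are pulled together; all other in-batch
    samples act as negatives.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    logits = f @ f.T / temperature
    n = len(labels)
    not_self = ~np.eye(n, dtype=bool)
    losses = []
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue  # no positive pair for this anchor
        denom = np.log(np.sum(np.exp(logits[i][not_self[i]])))
        losses.append(-np.mean([logits[i, j] - denom for j in positives]))
    return float(np.mean(losses))
```

Because similar aspects land in the same batch, the contrastive loss receives harder, more informative negatives than it would from random batching, which is the intuition the abstract attributes to combining CurL with ConL.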
