Optimizing BERT for Sentiment Classification of Amazon Product Reviews: A Study on Class Imbalance and Misclassification Analysis

dc.contributor.author: Korsakpaisarn O.
dc.contributor.author: Noraset T.
dc.contributor.author: Lapamnuaypol J.
dc.contributor.author: Jin'no K.
dc.contributor.correspondence: Korsakpaisarn O.
dc.contributor.other: Mahidol University
dc.date.accessioned: 2026-03-20T18:19:04Z
dc.date.available: 2026-03-20T18:19:04Z
dc.date.issued: 2025-01-01
dc.description.abstract: This paper presents a custom BERT-based sentiment classification pipeline for Amazon product reviews. Reviews are grouped into three sentiment classes: bad (1-2 stars), normal (3 stars), and good (4-5 stars). Two practical challenges are addressed: the semantic ambiguity of the neutral ('normal') class and skewed class distributions that bias prediction toward majority categories. In particular, the underrepresentation of neutral reviews often yields models that favor the dominant classes, limiting effectiveness in real applications. To mitigate these issues, stratified sampling and class-weighted cross-entropy are applied while fine-tuning bert-base-uncased. A moderate emphasis on the normal class increases its recall and F1 without significantly lowering overall accuracy, according to multi-run trials across four weight settings.
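The abstract names class-weighted cross-entropy as the mitigation for the underrepresented neutral class. A minimal sketch of one common weighting scheme, inverse class frequency (w_c = N / (K * n_c)), is shown below; the star ratings and the three-way mapping follow the abstract, while the toy data and the specific weighting formula are illustrative assumptions, not the paper's exact configuration.

```python
from collections import Counter

# Hypothetical star ratings for a handful of reviews (1-5 stars); the
# real dataset in the paper is Amazon product reviews.
stars = [5, 4, 1, 3, 5, 2, 4, 5, 3, 4, 5, 1]

# Map ratings to the paper's three classes: bad (1-2), normal (3), good (4-5).
def to_label(s):
    return "bad" if s <= 2 else ("normal" if s == 3 else "good")

labels = [to_label(s) for s in stars]
counts = Counter(labels)
n, k = len(labels), len(counts)

# Inverse-frequency class weights, one common choice for weighted
# cross-entropy: w_c = N / (K * n_c). Rarer classes get larger weights,
# so the minority 'normal' class contributes more to the loss.
weights = {c: n / (k * counts[c]) for c in counts}
```

In a PyTorch fine-tuning loop these weights would typically be passed as the `weight` tensor of `torch.nn.CrossEntropyLoss`; the paper additionally tunes the emphasis across four weight settings rather than using a single fixed formula.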
dc.identifier.citation: 2025 20th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP 2025) (2025)
dc.identifier.doi: 10.1109/iSAI-NLP66160.2025.11320736
dc.identifier.scopus: 2-s2.0-105032731380
dc.identifier.uri: https://repository.li.mahidol.ac.th/handle/123456789/115793
dc.rights.holder: SCOPUS
dc.subject: Computer Science
dc.subject: Engineering
dc.title: Optimizing BERT for Sentiment Classification of Amazon Product Reviews: A Study on Class Imbalance and Misclassification Analysis
dc.type: Conference Paper
mu.datasource.scopus: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=105032731380&origin=inward
oaire.citation.title: 2025 20th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP 2025)
oairecerif.author.affiliation: Mahidol University
oairecerif.author.affiliation: Tokyo City University
