Detection of translucent flesh disorder and automatic grading of mangosteens in multi-view images
Issued Date
2025-01-01
ISSN
09410643
eISSN
14333058
Scopus ID
2-s2.0-105000893478
Journal Title
Neural Computing and Applications
Rights Holder(s)
SCOPUS
Bibliographic Citation
Neural Computing and Applications (2025)
Suggested Citation
Kusakunniran W., Imaromkul T., Aukkapinyo K., Thongkanchorn K., Somsong P., Tiyayon P. Detection of translucent flesh disorder and automatic grading of mangosteens in multi-view images. Neural Computing and Applications (2025). doi:10.1007/s00521-025-11165-x Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/108598
Abstract
In this paper, convolutional neural network (CNN)-based solutions are developed for grading assessment and flesh-disorder detection of mangosteens in images. The grading is set to three classes corresponding to three quality levels based on the local market where the data were collected. In addition, this work focuses on three flesh disorders/statuses: translucent flesh disorder, gamboge, and rot. Three types of solutions are attempted. The first relies on well-known CNN architectures with transfer learning and data augmentation. The second is built on a detection model, YOLOv8. The third is a newly designed architecture that incorporates the human expert knowledge used for manual grading and detection. Multiple views of each mangosteen must be considered simultaneously for disorder detection: the four side views should be examined together before looking at the top and bottom views. This is a very difficult task even for human experts. The proposed solutions are trained and evaluated on a self-collected dataset of 206 mangosteens, each captured under six views (i.e., top view, bottom view, and four side views). The proposed solutions achieve perfect 100% accuracy for grading and up to 78% AUC for disorder detection.
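The abstract does not specify how the six views are combined inside the model, so the following is only an illustrative sketch, not the authors' architecture: a shared backbone (here a stub linear layer standing in for a CNN) is applied to each of the six views, the per-view features are averaged (late fusion), and a small head predicts one of the three grade classes. All function names, weights, and dimensions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(view, backbone_w):
    # Stand-in for a CNN backbone: one linear map followed by ReLU.
    # In the paper's setting this would be a real CNN over an image.
    return np.maximum(view @ backbone_w, 0.0)

def grade_mangosteen(views, backbone_w, head_w):
    # Shared backbone applied to each view, features averaged across
    # views (late fusion), then a 3-class classification head.
    feats = np.stack([extract_features(v, backbone_w) for v in views])
    fused = feats.mean(axis=0)
    logits = fused @ head_w
    return int(np.argmax(logits))

# Toy dimensions: each "view" is a 16-dim vector standing in for an image.
backbone_w = rng.standard_normal((16, 8))
head_w = rng.standard_normal((8, 3))
views = [rng.standard_normal(16) for _ in range(6)]  # top, bottom, 4 sides
grade = grade_mangosteen(views, backbone_w, head_w)
print(grade)
```

Averaging features is one simple way to honor the requirement that all views be considered simultaneously; alternatives such as concatenation or attention over views would also fit the description in the abstract.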