Please use this identifier to cite or link to this item: http://repository.li.mahidol.ac.th/dspace/handle/123456789/45614
Title: Machine Learning Methods for Assessing Freshness in Hydroponic Produce
Authors: Konlakorn Wongpatikaseree
Narit Hnoohom
Sumeth Yuenyong
Mahidol University
Keywords: Computer Science;Medicine
Issue Date: 2-Jul-2018
Citation: 2018 International Joint Symposium on Artificial Intelligence and Natural Language Processing, iSAI-NLP 2018 - Proceedings. (2018)
Abstract: © 2018 IEEE. Smart farms are increasing in both number and the level of technology used. Image processing has been applied to hydroponic farms to detect plant disease, but assessing the freshness of vegetables has received far less attention. In this work we applied image processing and machine learning to the task of distinguishing fresh from withered vegetables. We compared three classical machine learning classifiers (decision tree, Naive Bayes, and multi-layer perceptron) with one type of deep neural network. Manual feature extraction was performed for the classical classifiers, while the deep neural network took the raw images as input. We collected the data by taking one image of the vegetables every 10 minutes for one week at a time. We labeled images from days 1 and 2 as fresh and images from day 3 onward as withered. Experimental results show that the best model for this task was the decision tree, with a test accuracy of 98.12%. The deep neural network did not perform as well as expected; we hypothesize that it overfitted the training data, since its training accuracy was as high as or higher than that of the other classifiers.
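The sketch below illustrates the kind of comparison the abstract describes: hand-extracted features fed to the three classical classifiers, with test accuracy reported for each. It is not the authors' code; the feature choice (mean HSV colour per image) and the directory layout (data/fresh, data/withered) are assumptions made only for illustration.

```python
# Minimal sketch (assumed setup, not the paper's pipeline): compare decision tree,
# Naive Bayes, and MLP on simple hand-extracted colour features.
import glob
import numpy as np
from PIL import Image
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

def extract_features(path):
    """Mean H, S, V of the image -- a stand-in for the paper's manual feature extraction."""
    hsv = np.asarray(Image.open(path).convert("HSV"), dtype=np.float32)
    return hsv.reshape(-1, 3).mean(axis=0)

# Hypothetical directory layout: one folder per class.
X, y = [], []
for label, folder in enumerate(["data/fresh", "data/withered"]):  # 0 = fresh, 1 = withered
    for path in glob.glob(f"{folder}/*.jpg"):
        X.append(extract_features(path))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

classifiers = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "naive Bayes": GaussianNB(),
    "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    print(name, accuracy_score(y_test, clf.predict(X_test)))
```

In the same spirit, the deep-network baseline in the paper would skip extract_features entirely and train on the raw images; with a small dataset that extra capacity can overfit, which is the explanation the authors offer for its weaker test accuracy.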
URI: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85065089779&origin=inward
http://repository.li.mahidol.ac.th/dspace/handle/123456789/45614
Appears in Collections: Scopus 2018

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.