Diagnostic Performance of Artificial Intelligence for Interpreting Thyroid Cancer in Ultrasound images
Issued Date
2023-01-01
ISSN
19478208
eISSN
19478216
Scopus ID
2-s2.0-85168370617
Journal Title
International Journal of Knowledge and Systems Science
Volume
13
Issue
1
Rights Holder(s)
SCOPUS
Bibliographic Citation
International Journal of Knowledge and Systems Science Vol.13 No.1 (2023)
Suggested Citation
Arunrukthavon P., Songsaeng D., Keatmanee C., Klabwong S., Ekpanyapong M., Dailey M.N. Diagnostic Performance of Artificial Intelligence for Interpreting Thyroid Cancer in Ultrasound images. International Journal of Knowledge and Systems Science Vol.13 No.1 (2023). doi:10.4018/IJKSS.309431. Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/88834
Title
Diagnostic Performance of Artificial Intelligence for Interpreting Thyroid Cancer in Ultrasound images
Abstract
Thyroid ultrasonography is mainly used for the detection and characterization of thyroid nodules. However, it has limitations, since diagnostic performance remains highly subjective and depends on radiologist experience. Artificial intelligence (AI) was therefore expected to improve the diagnostic performance of thyroid ultrasound. To evaluate the diagnostic performance of AI for differentiating malignant from benign thyroid nodules, and to compare it with that of an experienced radiologist and a third-year diagnostic radiology resident, 648 patients with 650 thyroid nodules were enrolled; all underwent ultrasound-guided fine-needle aspiration (FNA) biopsy and had a decisive diagnosis from FNA cytology at Siriraj Hospital between January 2014 and June 2020. Although specificity and accuracy were slightly higher for the AI than for the experienced radiologist and the resident (specificity 78.85% vs. 67.31% vs. 69.23%; accuracy 78.46% vs. 70.77% vs. 70.77%, respectively), the AI showed diagnostic sensitivity and specificity comparable to those of the experienced radiologist and the resident (p = 0.187-0.855).
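The abstract reports sensitivity, specificity, and accuracy for each reader. As a minimal sketch of how such metrics are derived from a binary confusion matrix (the counts below are illustrative, not the study's data):

```python
# Hypothetical sketch: deriving the diagnostic metrics named in the abstract
# (sensitivity, specificity, accuracy) from binary confusion-matrix counts.
# The example counts are made up for illustration only.

def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int):
    """Return (sensitivity, specificity, accuracy) as percentages.

    tp/fn: malignant nodules correctly / incorrectly classified
    tn/fp: benign nodules correctly / incorrectly classified
    """
    sensitivity = 100.0 * tp / (tp + fn)              # true-positive rate
    specificity = 100.0 * tn / (tn + fp)              # true-negative rate
    accuracy = 100.0 * (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Example with made-up counts
sens, spec, acc = diagnostic_metrics(tp=110, fp=22, tn=400, fn=30)
print(f"sensitivity={sens:.2f}%  specificity={spec:.2f}%  accuracy={acc:.2f}%")
```

Comparing such proportions between readers, as the study does, is typically done with a test for paired binary outcomes (e.g. McNemar's test), which yields p-values like the 0.187-0.855 range quoted above.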