HU to RGB transformation with automatic windows selection for intracranial hemorrhage classification using ncCT
Issued Date
2025-08-01
Resource Type
eISSN
19326203
Scopus ID
2-s2.0-105012735256
Journal Title
Plos One
Volume
20
Issue
8 August
Rights Holder(s)
SCOPUS
Bibliographic Citation
Plos One Vol.20 No.8 August (2025)
Suggested Citation
Songsaeng D., Supratak A., Chantangphol P., Sarumpakul S., Kaothanthong N. HU to RGB transformation with automatic windows selection for intracranial hemorrhage classification using ncCT. Plos One Vol.20 No.8 August (2025). doi:10.1371/journal.pone.0327871 Retrieved from: https://repository.li.mahidol.ac.th/handle/123456789/111645
Title
HU to RGB transformation with automatic windows selection for intracranial hemorrhage classification using ncCT
Corresponding Author(s)
Other Contributor(s)
Abstract
This work focuses on preprocessing for classifying five categories of Intracranial Hemorrhage (ICH) using non-contrast computed tomography (ncCT). It involves assigning suitable values to the window-width (WW) and window-level (WL) parameters to map Hounsfield Units (HU) in ncCT to displayable color components such as RGB. However, clear visualization is hindered by brain component variations, individual patient conditions, and the time elapsed since stroke onset. This paper introduces a preprocessing technique called HU to RGB Transformation (HRT), aimed at enhancing the visualization of hemorrhage on ncCT scans. HRT dynamically selects optimal WW and WL values from predefined settings to accentuate hemorrhage visibility. Furthermore, it leverages multiple brain components, including cerebrospinal fluid and white-and-gray matter, to further refine the delineation of hemorrhagic regions. Experimental results from a deep neural network-based image classification model are used to evaluate the effectiveness of the proposed method. Serving as an image preprocessing step, the method demonstrates strong capability in classifying five distinct types of Intracranial Hemorrhage and normal slices, achieving an average sensitivity of 89.35% and an average specificity of 96.03%. Moreover, direct assessment of HRT-preprocessed images improves type classification accuracy by residents, with a sensitivity of 97.39% and a specificity of 96.19%. These results surpass those obtained from reading DICOM files, which achieved 93.31% sensitivity and 94.81% specificity.
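The abstract's core preprocessing idea, mapping HU values through WW/WL windows into RGB channels, rests on the standard CT windowing formula: values below WL − WW/2 clamp to black, values above WL + WW/2 clamp to white, and values in between scale linearly. The sketch below illustrates that standard transform with three commonly cited window presets (brain, subdural, bone) stacked as RGB channels; the specific presets and function are illustrative assumptions, not the paper's automatically selected HRT windows.

```python
import numpy as np

def window_hu(hu, wl, ww):
    """Map Hounsfield Units to [0, 255] with a window level (WL) and width (WW).

    Values below WL - WW/2 clamp to 0; values above WL + WW/2 clamp to 255;
    values in between are scaled linearly.
    """
    lo, hi = wl - ww / 2.0, wl + ww / 2.0
    channel = np.clip(hu, lo, hi)
    return ((channel - lo) / (hi - lo) * 255.0).astype(np.uint8)

# Toy 1x5 "slice" of HU values: air, water, gray matter, acute blood, bone.
hu_slice = np.array([[-1000, 0, 40, 80, 300]], dtype=np.float32)

# Stack three hypothetical window presets into an RGB-like image.
rgb = np.stack([
    window_hu(hu_slice, wl=40, ww=80),     # brain window
    window_hu(hu_slice, wl=80, ww=200),    # subdural window
    window_hu(hu_slice, wl=600, ww=2800),  # bone window
], axis=-1)
```

The paper's HRT method differs in that it selects the WW/WL pair dynamically per scan from predefined settings rather than using fixed presets like those above.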
