HACR-Net: An efficient hybrid attention network for MRI image super-resolution
| dc.contributor.author | Muhammad A. | |
| dc.contributor.author | Hajian A. | |
| dc.contributor.author | Achakulvisut T. | |
| dc.contributor.author | Aramvith S. | |
| dc.contributor.correspondence | Muhammad A. | |
| dc.contributor.other | Mahidol University | |
| dc.date.accessioned | 2026-04-14T18:23:51Z | |
| dc.date.available | 2026-04-14T18:23:51Z | |
| dc.date.issued | 2026-04-01 | |
| dc.description.abstract | High-resolution Magnetic Resonance Imaging (MRI) plays an important role in clinical diagnosis and pathological assessment due to its non-invasive nature and lack of ionizing radiation. However, acquiring high-resolution MRI is often constrained by hardware limitations and prolonged scanning times. Super-resolution (SR) techniques have therefore been introduced to reconstruct high-resolution images from low-resolution inputs. Despite these advances, existing methods often struggle to extract shallow features effectively, model complex contextual dependencies, and preserve fine anatomical details. To address these limitations, we propose a Hybrid Attention and Channel Retention Network (HACR-Net) for MRI image SR. HACR-Net incorporates a Hybrid Attention Module (HAM) that mitigates information loss during shallow feature extraction by jointly leveraging channel and spatial attention, enhancing informative features and preserving spatially significant regions. A Multiscale Feature Aggregation Block (MFAB) captures global structure, local texture, and high-frequency details. Complementing the MFAB, a Channel Retention Attention Block (CRAB) improves the recovery of fine contextual detail through a bottleneck design that maintains a wider channel width and reduces information loss during feature compression. Extensive experiments on two benchmark datasets, IXI and BraTS2018, demonstrate that HACR-Net achieves high-performance reconstruction with only 1.67M parameters and 81.3G FLOPs, offering significant reductions in model size and computational cost compared with existing methods. | |
| dc.identifier.citation | PLOS ONE Vol.21 No.4 (April 2026) | |
| dc.identifier.doi | 10.1371/journal.pone.0345637 | |
| dc.identifier.eissn | 1932-6203 | |
| dc.identifier.scopus | 2-s2.0-105035244483 | |
| dc.identifier.uri | https://repository.li.mahidol.ac.th/handle/123456789/116202 | |
| dc.rights.holder | SCOPUS | |
| dc.subject | Multidisciplinary | |
| dc.title | HACR-Net: An efficient hybrid attention network for MRI image super-resolution | |
| dc.type | Article | |
| mu.datasource.scopus | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=105035244483&origin=inward | |
| oaire.citation.issue | 4 April | |
| oaire.citation.title | PLOS ONE | |
| oaire.citation.volume | 21 | |
| oairecerif.author.affiliation | Mahidol University | |
| oairecerif.author.affiliation | Chulalongkorn University |
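The abstract describes a Hybrid Attention Module that jointly applies channel and spatial attention to shallow features. The paper's actual HAM design is not given in this record; the following is only a minimal NumPy sketch of the general idea, assuming a CBAM-style ordering (channel gating followed by spatial gating) and illustrative names and weight shapes that are not from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: feature map of shape (C, H, W).
    # Global average pooling -> small two-layer MLP -> per-channel gate in (0, 1).
    pooled = x.mean(axis=(1, 2))                        # (C,)
    gate = sigmoid(w2 @ np.maximum(w1 @ pooled, 0.0))   # (C,)
    return x * gate[:, None, None]

def spatial_attention(x):
    # Pool across channels, then gate each spatial location.
    # A real module would pass the pooled maps through a conv layer;
    # the sum below is a simple stand-in for that learned mapping.
    avg = x.mean(axis=0)                                # (H, W)
    mx = x.max(axis=0)                                  # (H, W)
    gate = sigmoid(avg + mx)                            # (H, W), in (0, 1)
    return x * gate[None, :, :]

def hybrid_attention(x, w1, w2):
    # Sequentially re-weight channels, then spatial positions.
    return spatial_attention(channel_attention(x, w1, w2))
```

Because both gates lie in (0, 1), the hybrid module can only attenuate, never amplify, each feature response; the learned weights decide which channels and regions are preserved.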
