Impact of GAN-based lesion-focused medical image super-resolution on the robustness of radiomic features.
Published version
Peer-reviewed
Abstract
Robust machine learning models based on radiomic features might allow for accurate diagnosis, prognosis, and medical decision-making. Unfortunately, the lack of standardized radiomic feature extraction has hampered their clinical use. Since radiomic features tend to be affected by low voxel statistics in regions of interest, increasing the sample size would improve their robustness in clinical studies. Therefore, we propose a Generative Adversarial Network (GAN)-based lesion-focused framework for Computed Tomography (CT) image Super-Resolution (SR); for the lesion (i.e., cancer) patch-focused training, we incorporate Spatial Pyramid Pooling (SPP) into GAN-Constrained by the Identical, Residual, and Cycle Learning Ensemble (GAN-CIRCLE). At 2× SR, the proposed model achieved better perceptual quality with less blurring than the other considered state-of-the-art SR methods, while producing comparable results at 4× SR. We also evaluated the robustness of our model's radiomic features with respect to quantization on a different lung cancer CT dataset using Principal Component Analysis (PCA). Intriguingly, the most important radiomic features in our PCA-based analysis were the most robust features extracted from the GAN-super-resolved images. These achievements pave the way for the application of GAN-based image super-resolution techniques in radiomics studies for robust biomarker discovery.
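As a minimal sketch of the architectural idea summarised in the abstract (inserting a Spatial Pyramid Pooling block into a GAN-CIRCLE-style generator so that lesion patches of different sizes map to a fixed-length feature vector), the following PyTorch module is illustrative only and not the authors' released code; the class name SpatialPyramidPooling, the pyramid levels (1, 2, 4), and the choice of adaptive max pooling are assumptions.

# Illustrative SPP block (assumed design, not the paper's implementation):
# pool the feature map at several grid sizes and concatenate the results,
# so variable-size lesion patches yield a fixed-length descriptor.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialPyramidPooling(nn.Module):
    def __init__(self, levels=(1, 2, 4)):
        super().__init__()
        self.levels = levels  # output grid sizes: 1x1, 2x2, 4x4

    def forward(self, x):
        # x: (batch, channels, H, W); H and W may differ between patches
        batch_size = x.size(0)
        pooled = []
        for level in self.levels:
            # Adaptive pooling returns a level x level grid regardless of H, W
            out = F.adaptive_max_pool2d(x, output_size=(level, level))
            pooled.append(out.view(batch_size, -1))
        # Fixed length: channels * sum(level**2 for level in levels)
        return torch.cat(pooled, dim=1)

if __name__ == "__main__":
    spp = SpatialPyramidPooling(levels=(1, 2, 4))
    for size in (32, 48, 64):  # lesion patches of different spatial sizes
        feats = spp(torch.randn(2, 16, size, size))
        print(size, feats.shape)  # always (2, 16 * (1 + 4 + 16)) = (2, 336)

Running the demo with 32-, 48-, and 64-pixel patches prints the same descriptor length (channels × 21) each time, which is what allows variable-size lesion patches to feed a fixed downstream layer in a patch-focused training scheme.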
Journal Title
Scientific Reports
Journal ISSN
2045-2322
Sponsorship
Cancer Research UK (C96/A25177)
National Institute for Health and Care Research (IS-BRC-1215-20014)
EPSRC (EP/T017961/1)
Cancer Research UK (C197/A28667)