Show simple item record

dc.contributor.author  Le, Elizabeth P. V.
dc.contributor.author  Rundo, Leonardo
dc.contributor.author  Tarkin, Jason M.
dc.contributor.author  Evans, Nicholas R.
dc.contributor.author  Chowdhury, Mohammed M.
dc.contributor.author  Coughlin, Patrick A.
dc.contributor.author  Pavey, Holly
dc.contributor.author  Wall, Chris
dc.contributor.author  Zaccagna, Fulvio
dc.contributor.author  Gallagher, Ferdia A.
dc.contributor.author  Huang, Yuan
dc.contributor.author  Sriranjan, Rouchelle
dc.contributor.author  Le, Anthony
dc.contributor.author  Weir-McCall, Jonathan R.
dc.contributor.author  Roberts, Michael
dc.contributor.author  Gilbert, Fiona J.
dc.contributor.author  Warburton, Elizabeth A.
dc.contributor.author  Schönlieb, Carola-Bibiane
dc.contributor.author  Sala, Evis
dc.contributor.author  Rudd, James H. F.
dc.description  Funder: School of Clinical Medicine, University of Cambridge; doi:
dc.description  Funder: Frank Edward Elmore Fund
dc.description  Funder: National Institute for Health Research (NIHR) Imperial Biomedical Research Centre
dc.description  Funder: British Heart Foundation Cambridge Centre of Research Excellence
dc.description  Funder: Royal College of Surgeons of England; doi:
dc.description  Funder: Cancer Research UK; doi:
dc.description  Funder: AstraZeneca Oncology R
dc.description  Funder: National Institute for Health Research; doi:
dc.description  Funder: Leverhulme Trust; doi:
dc.description  Funder: Cantab Capital Institute for the Mathematics of Information
dc.description  Funder: Alan Turing Institute; doi:
dc.description  Funder: NIHR Cambridge Biomedical Research Centre
dc.description  Funder: Higher Education Funding Council for England; doi:
dc.description.abstract  Abstract: Radiomics, the quantitative extraction of features from radiological images, can improve disease diagnosis and prognostication. However, radiomic features are susceptible to variability in image acquisition and segmentation. Ideally, only features robust to these variations would be incorporated into predictive models, to ensure good generalisability. We extracted 93 radiomic features from carotid artery computed tomography angiograms of 41 patients with cerebrovascular events. We tested feature robustness to region-of-interest perturbations, image pre-processing settings and quantisation methods, using both single- and multi-slice approaches. We assessed the ability of the most robust features to identify culprit and non-culprit arteries using several machine learning algorithms, and report the average area under the curve (AUC) from five-fold cross-validation. The multi-slice approach produced more robust radiomic features than the single-slice approach (67 vs. 61). The optimal image quantisation method used bin widths of 25 or 30. Incorporating our top 10 non-redundant robust radiomic features into an ElasticNet model achieved an AUC of 0.73 and an accuracy of 69% (compared with carotid calcification alone: AUC 0.44, accuracy 46%). Our results provide key information for introducing carotid CT radiomics into clinical practice. If validated prospectively, our robust carotid radiomic set could improve stroke prediction and help target therapies to those at highest risk.
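The evaluation described in the abstract, five-fold cross-validation of an ElasticNet-penalised classifier scored by mean AUC, can be sketched as follows. This is a minimal illustration only: the synthetic data stands in for the paper's 10 robust radiomic features, and scikit-learn with an `l1_ratio` of 0.5 is an assumed implementation choice, not the authors' actual pipeline.

```python
# Sketch of five-fold cross-validated ElasticNet classification with AUC
# scoring. Synthetic data replaces the real carotid radiomic features;
# hyperparameters (l1_ratio, C) are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# 41 "patients" x 10 "features", binary culprit / non-culprit label (synthetic)
X, y = make_classification(n_samples=41, n_features=10, n_informative=5,
                           random_state=0)

# An ElasticNet penalty mixes L1 and L2; 'saga' is the solver supporting it
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=5000)

# Average area under the ROC curve across five stratified folds
aucs = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"mean AUC over 5 folds: {aucs.mean():.2f}")
```

The fold-wise AUCs also give a sense of variance, which matters with only 41 subjects; the paper's reported 0.73 is an average of exactly this kind of score.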
dc.publisher  Nature Publishing Group UK
dc.rights  Attribution 4.0 International (CC BY 4.0)
dc.title  Assessing robustness of carotid artery CT angiography radiomics in the identification of culprit lesions in cerebrovascular events
prism.publicationName  Scientific Reports
pubs.funder-project-id  Medical Research Council (1966157)
pubs.funder-project-id  The Mark Foundation for Cancer Research and Cancer Research UK (CRUK) Cambridge Centre (C9685/A25177, C9685/A25177)
pubs.funder-project-id  Wellcome Trust (211100/Z/18/Z)
pubs.funder-project-id  The Dunhill Medical Trust (RTF44/0114)
pubs.funder-project-id  British Heart Foundation (FS/16/29/31957)
pubs.funder-project-id  EPSRC (EP/S026045/1 and EP/T003553/1, EP/N014588/1)
pubs.funder-project-id  Wellcome Innovator Award (RG98755)
pubs.funder-project-id  Horizon 2020 (No. 777826 NoMADS and No. 691070 CHiPS)
