Show simple item record

dc.contributor.author: Hauk, Olaf
dc.contributor.author: Stenroos, Matti
dc.contributor.author: Treder, Matthias S
dc.date.accessioned: 2022-06-21T23:31:14Z
dc.date.available: 2022-06-21T23:31:14Z
dc.date.issued: 2022-07-15
dc.identifier.issn: 1053-8119
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/338283
dc.description.abstract: The spatial resolution of EEG/MEG source estimates, often described in terms of source leakage in the context of the inverse problem, constrains the inferences that can be drawn from EEG/MEG source estimation results. Software packages for EEG/MEG data analysis offer a large choice of source estimation methods but give experimental researchers few tools for evaluating and comparing those methods. Here, we describe a framework and tools for objective and intuitive resolution analysis of EEG/MEG source estimation based on linear systems analysis, and apply them to the most widely used distributed source estimation methods, such as L2-minimum-norm estimation (L2-MNE) and linearly constrained minimum variance (LCMV) beamformers. Within this framework it is possible to define resolution metrics that quantify meaningful aspects of source estimation results (such as localization accuracy in terms of peak localization error, PLE, and spatial extent in terms of spatial deviation, SD), are relevant to the task at hand, and can easily be visualized. At the core of this framework is the resolution matrix, which describes the potential leakage from and into point sources (point-spread and cross-talk functions, or PSFs and CTFs, respectively). Importantly, for linear methods these functions generalize to multiple sources and complex source distributions. This paper provides a tutorial-style introduction to linear EEG/MEG source estimation and resolution analysis aimed at experimental (rather than methods-oriented) researchers. We use this framework to demonstrate how L2-MNE-type as well as LCMV beamforming methods can be evaluated in practice using software tools that have only recently become available for routine use. Our methods comparison includes PLE and SD for a larger number of methods than previous similar studies, including unweighted, depth-weighted and normalized L2-MNE methods (dSPM, sLORETA, eLORETA) and two LCMV beamformers. The results demonstrate that some methods can achieve low and even zero PLE for PSFs. However, their SD, as well as both PLE and SD for CTFs, are far less optimal for all methods, particularly for deep cortical areas. We hope this paper will encourage EEG/MEG researchers to apply this approach to their own tasks.
dc.format.medium: Print-Electronic
dc.publisher: Elsevier BV
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: Beamforming
dc.subject: Cross-talk function
dc.subject: Inverse problem
dc.subject: Localization error
dc.subject: Minimum-norm estimation
dc.subject: Point-spread function
dc.subject: Resolution matrix
dc.subject: Spatial deviation
dc.subject: Spatial filter
dc.subject: Algorithms
dc.subject: Brain
dc.subject: Brain Mapping
dc.subject: Electroencephalography
dc.subject: Humans
dc.subject: Magnetoencephalography
dc.subject: Software
dc.title: Towards an objective evaluation of EEG/MEG source estimation methods - The linear approach.
dc.type: Article
dc.publisher.department: MRC Cognition and Brain Sciences Unit
dc.date.updated: 2022-06-21T08:35:14Z
prism.number: 119177
prism.publicationDate: 2022
prism.publicationName: Neuroimage
prism.startingPage: 119177
prism.volume: 255
dc.identifier.doi: 10.17863/CAM.85691
dcterms.dateAccepted: 2022-03-31
rioxxterms.versionofrecord: 10.1016/j.neuroimage.2022.119177
rioxxterms.version: VoR
dc.contributor.orcid: Hauk, Olaf [0000-0003-0817-6054]
dc.identifier.eissn: 1095-9572
rioxxterms.type: Journal Article/Review
cam.issuedOnline: 2022-04-04
cam.depositDate: 2022-06-21
pubs.licence-identifier: apollo-deposit-licence-2-1
pubs.licence-display-name: Apollo Repository Deposit Licence Agreement
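The abstract describes resolution analysis built around the resolution matrix R = W L, where L is the forward (leadfield) matrix and W a linear inverse operator: columns of R are point-spread functions (PSFs) and rows are cross-talk functions (CTFs), from which metrics such as peak localization error (PLE) and spatial deviation (SD) can be computed. A minimal numpy sketch of this idea, not the paper's implementation: the leadfield here is random, the inverse is a generic Tikhonov-regularized L2-MNE, and the SD formula follows one common amplitude-weighted definition; all dimensions and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sens, n_src = 64, 500                        # sensors, candidate source locations

# Illustrative geometry and forward model (a real leadfield would come from a head model).
pos = rng.uniform(-0.1, 0.1, size=(n_src, 3))  # source coordinates in metres
L = rng.standard_normal((n_sens, n_src))       # forward (gain) matrix

# Tikhonov-regularized L2-minimum-norm inverse operator W (regularization is illustrative).
lam = 0.1 * np.trace(L @ L.T) / n_sens
W = L.T @ np.linalg.inv(L @ L.T + lam * np.eye(n_sens))

# Resolution matrix: column j is the PSF of source j, row j its CTF.
R = W @ L

j = 123                                        # an arbitrary source index
psf = R[:, j]                                  # point-spread function of source j
ctf = R[j, :]                                  # cross-talk function of source j

# Peak localization error (PLE): distance from the true source to the PSF peak.
ple = np.linalg.norm(pos[np.argmax(np.abs(psf))] - pos[j])

# Spatial deviation (SD): squared-amplitude-weighted RMS distance from the true source.
d = np.linalg.norm(pos - pos[j], axis=1)
sd = np.sqrt(np.sum(psf**2 * d**2) / np.sum(psf**2))
print(f"PLE = {ple:.4f} m, SD = {sd:.4f} m")
```

Because the estimator is linear, the same R characterizes leakage for any source distribution, which is why the paper can compare methods (L2-MNE variants, LCMV beamformers) by swapping only the operator W.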

