Multimodal Integration of M/EEG and f/MRI Data in SPM12.

Published version
Peer-reviewed

Type

Article

Authors

Henson, Richard N 
Abdulrahman, Hunar 
Flandin, Guillaume 
Litvak, Vladimir 

Abstract

We describe the steps involved in analysis of multi-modal, multi-subject human neuroimaging data using the SPM12 free and open source software (https://www.fil.ion.ucl.ac.uk/spm/) and a publicly available dataset organized according to the Brain Imaging Data Structure (BIDS) format (https://openneuro.org/datasets/ds000117/). The dataset contains electroencephalographic (EEG), magnetoencephalographic (MEG), and functional and structural magnetic resonance imaging (MRI) data from 16 subjects who undertook multiple runs of a simple task performed on a large number of famous, unfamiliar and scrambled faces. We demonstrate: (1) batching and scripting of preprocessing of multiple runs/subjects of combined MEG and EEG data, (2) creation of trial-averaged evoked responses, (3) source reconstruction of the power (induced and evoked) across trials within a time-frequency window around the "N/M170" evoked component, using structural MRI for forward modeling and simultaneous inversion (fusion) of MEG and EEG data, (4) group-based optimisation of spatial priors during M/EEG source reconstruction using fMRI data on the same paradigm, and (5) statistical mapping across subjects of cortical source power increases for faces vs. scrambled faces.
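
The analyses themselves are scripted as SPM12 (MATLAB) batch jobs run through spm_jobman. As a minimal, illustrative sketch of step (2), trial-averaged evoked responses, the equivalent computation in Python with the open-source MNE-Python library on the same ds000117 data might look as follows; the file path, trigger channel, and event codes are assumptions to verify against the dataset documentation, and the filter settings are for illustration only.

    import mne

    # Load one run of one subject's MEG/EEG data (path is an assumption;
    # adapt to where ds000117 is stored locally).
    raw = mne.io.read_raw_fif(
        "sub-01/ses-meg/meg/sub-01_ses-meg_task-facerecognition_run-01_meg.fif",
        preload=True)

    # Band-pass filter the continuous data (cut-offs chosen for illustration).
    raw.filter(l_freq=1.0, h_freq=40.0)

    # Read stimulus triggers; STI101 is assumed to be the composite trigger
    # channel on the Elekta/Neuromag acquisition system used for this dataset.
    events = mne.find_events(raw, stim_channel="STI101", min_duration=0.003)

    # Assumed event codes for initial presentations of each condition; check
    # the dataset documentation for the full coding scheme.
    event_id = {"famous": 5, "unfamiliar": 13, "scrambled": 17}

    # Epoch around stimulus onset with a pre-stimulus baseline.
    epochs = mne.Epochs(raw, events, event_id=event_id,
                        tmin=-0.2, tmax=0.8, baseline=(None, 0), preload=True)

    # Trial-averaged evoked responses for the faces-vs-scrambled contrast,
    # e.g. to inspect the N/M170 component.
    evoked_faces = epochs[["famous", "unfamiliar"]].average()
    evoked_scrambled = epochs["scrambled"].average()
    evoked_faces.plot_joint()  # topographies plus sensor time courses

In the paper itself, this step, like the rest of the preprocessing, is expressed as an SPM12 batch and looped over runs and subjects from a MATLAB script.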

Keywords

EEG, MEG, SPM, fMRI, faces, fusion, inversion, multimodal

Journal Title

Frontiers in Neuroscience

Journal ISSN

1662-4548 (print)
1662-453X (electronic)

Volume

13

Publisher

Frontiers Media

Sponsorship

MRC (Unknown)
Medical Research Council (MC_UU_00005/8)
Medical Research Council (MR/K005464/1)
This work was supported by an MRC programme grant to RH (SUAG/010 RG91365). GF and VL are supported by core funding from the Wellcome Trust (203147/Z/16/Z). The work was also carried out as part of the UK MEG community, supported by Medical Research Council grant MR/K005464/1.