dc.contributor.author: Krugliak, Alexandra
dc.contributor.author: Clarke, Alex
dc.date.accessioned: 2022-01-27T00:30:29Z
dc.date.available: 2022-01-27T00:30:29Z
dc.date.issued: 2022-02-10
dc.identifier.issn: 2045-2322
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/332951
dc.description.abstract: Our visual environment impacts multiple aspects of cognition, including perception, attention and memory, yet most studies traditionally remove or control the external environment. As a result, we have a limited understanding of neurocognitive processes beyond the controlled lab environment. Here, we aim to study neural processes in real-world environments, while also maintaining a degree of control over perception. To achieve this, we combined mobile EEG (mEEG) and augmented reality (AR), which allows us to place virtual objects into the real world. We validated this AR and mEEG approach using a well-characterised cognitive response: the face inversion effect. Participants viewed upright and inverted faces in three EEG tasks: (1) a lab-based computer task, (2) walking through an indoor environment while seeing face photographs, and (3) walking through an indoor environment while seeing virtual faces. We find greater low-frequency EEG activity for inverted compared to upright faces in all experimental tasks, demonstrating that cognitively relevant signals can be extracted from mEEG and AR paradigms. This was established both in an epoch-based analysis aligned to face events and in a GLM-based approach that incorporates continuous EEG signals and face perception states. Together, this research helps pave the way to exploring neurocognitive processes in real-world environments while maintaining experimental control using AR.
dc.publisher: Nature Publishing Group
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: Adult
dc.subject: Augmented Reality
dc.subject: Cognition
dc.subject: Electroencephalography
dc.subject: Environment
dc.subject: Facial Recognition
dc.subject: Female
dc.subject: Humans
dc.subject: Male
dc.subject: Neurosciences
dc.subject: Walking
dc.subject: Young Adult
dc.title: Towards real-world neuroscience using mobile EEG and augmented reality.
dc.type: Article
dc.publisher.department: Department of Psychology
dc.date.updated: 2022-01-26T11:00:38Z
prism.publicationName: Sci Rep
dc.identifier.doi: 10.17863/CAM.80376
dcterms.dateAccepted: 2022-01-25
rioxxterms.versionofrecord: 10.1038/s41598-022-06296-3
rioxxterms.version: VoR
dc.contributor.orcid: Clarke, Alex [0000-0001-7768-5229]
dc.identifier.eissn: 2045-2322
rioxxterms.type: Journal Article/Review
pubs.funder-project-id: Wellcome Trust (211200/Z/18/Z)
cam.issuedOnline: 2022-02-10
cam.orpheus.success: Thu Feb 24 18:06:42 GMT 2022 - Embargo updated
cam.orpheus.success: VoR added.
cam.orpheus.counter: 1
cam.depositDate: 2022-01-26
pubs.licence-identifier: apollo-deposit-licence-2-1
pubs.licence-display-name: Apollo Repository Deposit Licence Agreement


