EarSet: A Multi-Modal Dataset for Studying the Impact of Head and Facial Movements on In-Ear PPG Signals.
Published version
Peer-reviewed
Abstract
Photoplethysmography (PPG) is a simple yet powerful technique for studying blood volume changes by measuring variations in light intensity. However, PPG is severely affected by motion artifacts, which undermine its reliability. This problem is particularly pressing in earables, since head movements and facial expressions cause skin and tissue displacements around and inside the ear. Understanding such artifacts is fundamental to the success of earables for accurate cardiovascular health monitoring. However, the lack of in-ear PPG datasets prevents the research community from tackling this challenge. In this work, we report on the design of an ear tip featuring a 3-channel PPG sensor and a co-located 6-axis motion sensor. This enables sensing PPG data at multiple wavelengths, along with the corresponding motion signature, from both ears. Leveraging our device, we collected a multi-modal dataset from 30 participants performing 16 natural motions, including both head/face and full-body movements. This unique dataset will greatly support research towards making in-ear vital-sign sensing more accurate and robust, thus unlocking the full potential of the next generation of PPG-equipped earables.
Journal ISSN
2052-4463