Neonatal pose estimation in the unaltered clinical environment with fusion of RGB, depth and IR images.
Published version
Peer-reviewed
Abstract
Visual monitoring of pre-term infants in intensive care is critical to ensuring proper development and treatment. Camera systems have been explored for this purpose, with human pose estimation having applications in monitoring position, motion, behaviour and vital signs. Validation across the full range of clinical visual scenarios is necessary to prove real-life utility. We conducted a clinical study to collect RGB, depth and infra-red video from 24 participants with no modifications to clinical care. We propose and train image fusion pose estimation algorithms for locating the torso key-points. Our best-performing approach, a late fusion method, achieves an average precision score of 0.811. Chest covering and side lying decrease the object key-point similarity score by 0.15 and 0.10 respectively, while occurring 50% and 44% of the time. The baby's positioning and covering support their development and comfort, and these scenarios should therefore be considered when validating visual monitoring algorithms.
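For reference, the object key-point similarity (OKS) mentioned in the abstract is the standard COCO key-point metric: a per-object score that decays with the distance between predicted and ground-truth key-points, normalised by object scale and a per-key-point constant, with average precision obtained by thresholding OKS. The Python sketch below illustrates that standard metric only; it is not the authors' evaluation code, and the per-key-point falloff constants appropriate for an infant torso model are an assumption left unspecified here.

```python
import numpy as np

def object_keypoint_similarity(pred, gt, visibility, area, kappa):
    """COCO-style object key-point similarity (OKS) for a single object.

    pred, gt   : (K, 2) arrays of predicted / ground-truth (x, y) key-points
    visibility : (K,) array, > 0 where the ground-truth key-point is labelled
    area       : object area used as the squared scale term s**2
    kappa      : (K,) per-key-point falloff constants (assumed, dataset-specific)
    """
    d2 = np.sum((pred - gt) ** 2, axis=1)          # squared pixel distances
    labelled = visibility > 0
    if not labelled.any():
        return 0.0
    # Gaussian falloff normalised by object scale and per-key-point constant
    ks = np.exp(-d2 / (2.0 * area * kappa ** 2))
    return float(ks[labelled].mean())
```

Under the usual COCO convention, a detection counts as correct at threshold t when its OKS exceeds t, and the summary average precision is taken over thresholds from 0.50 to 0.95 in steps of 0.05; the 0.15 and 0.10 drops reported for chest covering and side lying refer to this OKS quantity.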
Description
Acknowledgements: We are grateful to all participants, and their parents, for taking part in the study. We also thank the staff on the neonatal unit for their support in undertaking the study in the busy NICU. This work is funded by the Rosetrees Trust, Stoneygate Trust, and Isaac Newton Trust, with funding support from the Cambridge Centre for Data-Driven Discovery and Accelerate Programme for Scientific Discovery, made possible by a donation from Schmidt Futures, and Trinity College, University of Cambridge. This work was performed using resources provided by the Cambridge Service for Data Driven Discovery (CSD3) operated by the University of Cambridge Research Computing Service (www.csd3.cam.ac.uk), provided by Dell EMC and Intel using Tier-2 funding from the Engineering and Physical Sciences Research Council (capital grant EP/T022159/1), and DiRAC funding from the Science and Technology Facilities Council (www.dirac.ac.uk).
Funder: Isaac Newton Trust; doi: https://doi.org/10.13039/501100004815
Funder: Cambridge Centre for Data Driven Discovery
Funder: Trinity College, University of Cambridge; doi: https://doi.org/10.13039/501100000727
Journal ISSN
2398-6352