Evaluating Spatial Hearing Using a Dual-Task Approach in a Virtual-Acoustics Environment.
Authors
Salorio-Corbetto, Marina
Williges, Ben
Lamping, Wiebke
Picinali, Lorenzo
Vickers, Deborah
Publication Date
2022
Journal Title
Front Neurosci
ISSN
1662-4548
Publisher
Frontiers Media SA
Volume
16
Language
en
Type
Article
This Version
VoR
Citation
Salorio-Corbetto, M., Williges, B., Lamping, W., Picinali, L., & Vickers, D. (2022). Evaluating Spatial Hearing Using a Dual-Task Approach in a Virtual-Acoustics Environment. Front Neurosci, 16. https://doi.org/10.3389/fnins.2022.787153
Abstract
Spatial hearing is critical for communication in everyday sound-rich environments, so it is important to understand how well users of bilateral hearing devices function in these conditions. The purpose of this work was to evaluate a Virtual Acoustics (VA) version of the Spatial Speech in Noise (SSiN) test, the SSiN-VA. This implementation uses relatively inexpensive equipment and can be performed outside the clinic, allowing for regular monitoring of spatial-hearing performance. The SSiN-VA simultaneously assesses speech discrimination and relative localization with changing source locations in the presence of noise. The use of simultaneous tasks increases the cognitive load to better represent the difficulties faced by listeners in noisy real-world environments. Current clinical assessments may require costly equipment with a large footprint; consequently, spatial-hearing assessments may not be conducted at all. Additionally, as patients take greater control of their healthcare outcomes and a greater number of clinical appointments are conducted remotely, outcome measures that allow patients to carry out assessments at home are becoming more relevant. The SSiN-VA was implemented using the 3D Tune-In Toolkit, simulating seven loudspeaker locations spaced at 30° intervals with azimuths between -90° and +90°, and rendered for headphone playback using binaural spatialization. Twelve normal-hearing participants were assessed to evaluate whether SSiN-VA produced patterns of responses for relative localization and speech discrimination as a function of azimuth similar to those previously obtained using loudspeaker arrays. Additionally, the effects of the signal-to-noise ratio (SNR), the direction of the shift from target to reference, and the target phonetic contrast on performance were investigated.
SSiN-VA led to similar patterns of performance as a function of spatial location compared to loudspeaker setups for both relative localization and speech discrimination. Performance for relative localization was significantly better at the highest SNR than at the lowest SNR tested, and a target shift to the right was associated with an increased likelihood of a correct response. For word discrimination, there was an interaction between SNR and word group. Overall, these outcomes support the use of virtual audio for speech discrimination and relative localization testing in noise.
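The abstract's test geometry and relative-localization task can be sketched as follows. This is an illustrative sketch only, not the authors' implementation or the 3D Tune-In Toolkit API: the function names and trial logic are hypothetical, grounded in the description of seven virtual sources at 30° intervals from -90° to +90° and a target that shifts left or right relative to a reference location.

```python
import random

def virtual_azimuths(start=-90, stop=90, step=30):
    """The seven virtual loudspeaker azimuths described in the abstract:
    30° intervals spanning -90° to +90°."""
    return list(range(start, stop + 1, step))

def relative_localization_trial(rng=random):
    """One hypothetical relative-localization trial (illustrative, not the
    published protocol): pick a reference location, shift the target one
    position to the left or right, and return the correct response, which
    is the direction of the shift."""
    az = virtual_azimuths()
    ref_idx = rng.randrange(1, len(az) - 1)  # avoid edges so both shifts exist
    shift = rng.choice([-1, 1])              # -1 = leftward, +1 = rightward
    target_idx = ref_idx + shift
    direction = "right" if shift > 0 else "left"
    return az[ref_idx], az[target_idx], direction
```

A listener's response on such a trial would be scored correct when it matches the returned direction; the speech-discrimination task would run concurrently on the same stimuli.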
Keywords
Neuroscience, spatial hearing, bilateral cochlear implants, binaural performance, dual task, remote testing, speech in noise, functional testing
Relationships
Is supplemented by: https://doi.org/10.17863/CAM.76227
Identifiers
External DOI: https://doi.org/10.3389/fnins.2022.787153
This record's URL: https://www.repository.cam.ac.uk/handle/1810/335419
Rights
Licence:
http://creativecommons.org/licenses/by/4.0/