Exploring Embodied Resources in Gaze in Human-Robot Collaborative Environments
Published version
Peer-reviewed
Repository URI
Repository DOI
Change log
Authors
Abstract
Among the various types of embodied resources in humans, gaze, beginning with mutual gaze, plays a major role in embodied cognition. Beyond establishing relationships during interactions, gaze also conveys information about the level of engagement in a dyadic interaction. Gaze and gaze-related behaviors, such as gaze aversion, can therefore serve as cues for making decisions about an interaction. This holds true for humans and robots during human-robot interaction (HRI) as well, so proactive robots could evaluate human gaze as a parameter for achieving situation awareness. In this work, we present the outcomes of several experiments that evaluate gaze behavior in nonverbal human-human interactions and the other behaviors it initiates during dyadic interactions. We also examine the possibility of evaluating situations using such behavioral responses of individuals as cues. We further compared the gaze behavior of humans during HRI with that during human-human interaction (HHI), considering the presence and aversion of gaze as the gaze behaviors of interest in this study. The results of these experiments indicate interesting tendencies in verbal and nonverbal human behavior when initiating an interaction, in both HHI and HRI. The gaze-related behavioral patterns observed during the study were analyzed using statistical methods, and critical observations are highlighted. Finally, the potential of analyzing gaze behavior as a means of displaying messages to the outside world during HRI is discussed.
Description
Keywords
Journal Title
Conference Name
Journal ISSN
1757-899X