Perception of perspective in augmented reality head-up displays
Published version
Peer-reviewed
Abstract
Augmented Reality (AR) is emerging fast with a wide range of applications, including automotive AR Head-Up Displays (AR HUD). As a result, there is a growing need to understand human depth perception in AR. Here, we discuss two user studies on depth perception, in particular the perspective cue. The first experiment compares the perception of the perspective depth cue (1) in the physical world, (2) on a flat screen, and (3) on an AR HUD. Our AR HUD setup provided a two-dimensional, vertically oriented virtual image projected at a fixed distance. In each setting, participants were asked to estimate the size of a perspective angle. We found that the perception of angle sizes on the AR HUD differs from perception in the physical world, but not from perception on a flat screen. The underestimation of the physical world's angle size compared to the AR HUD and screen setups might explain the egocentric depth underestimation phenomenon in virtual environments. In the second experiment, we compared perception for different graphical representations of angles that are relevant for practical applications. Graphical alterations of angles displayed on a screen resulted in more variation between individuals' angle-size estimations. Furthermore, the majority of participants tended to underestimate the observed angle size in most conditions. Our results suggest that perspective angles on a vertically oriented, fixed-depth AR HUD mimic the perception of a screen more closely than the perception of the 3D environment. On-screen graphical alteration does not reduce the underestimation in the majority of cases.
Journal ISSN: 1095-9300