
Perception of perspective in augmented reality head-up displays

Published version
Peer-reviewed

Type

Article

Abstract

Augmented Reality (AR) is emerging rapidly with a wide range of applications, including automotive AR Head-Up Displays (AR HUD). As a result, there is a growing need to understand human perception of depth in AR. Here, we discuss two user studies on depth perception, in particular the perspective cue. The first experiment compares the perception of the perspective depth cue (1) in the physical world, (2) on a flat screen, and (3) on an AR HUD. Our AR HUD setup provided a two-dimensional vertically oriented virtual image projected at a fixed distance. In each setting, participants were asked to estimate the size of a perspective angle. We found that the perception of angle sizes on AR HUD differs from perception in the physical world, but not from a flat screen. The underestimation of the physical world's angle size compared to the AR HUD and screen setups might explain the egocentric depth underestimation phenomenon in virtual environments. In the second experiment, we compared perception for different graphical representations of angles that are relevant for practical applications. Graphical alterations of angles displayed on a screen resulted in more variation among individuals' angle size estimations. Furthermore, the majority of participants tended to underestimate the observed angle size in most conditions. Our results suggest that perspective angles on a vertically oriented fixed-depth AR HUD display mimic the perception of a screen more accurately than the perception of the 3D environment. On-screen graphical alteration does not help to improve the underestimation in the majority of cases.

Description

Keywords

Augmented reality, Head-up display, Depth perception, Perspective cue

Journal Title

International Journal of Human-Computer Studies

Conference Name

Journal ISSN

1071-5819
1095-9300

Volume Title

155

Publisher

Elsevier BV