Prospective Immersive Human-Machine Interface for Future Vehicles: Multiple Zones Turn the Full Windscreen into a Head-Up Display
Accepted version
Peer-reviewed
Authors
Abstract
The physical bottleneck of optical design and the complexity associated with the human visual system (HVS) limit the true potential of the head-up display (HUD) in vehicles. Providing visual information feedback across the full windscreen is desirable for driving safety, and implementing it to enhance the driving experience is one of the most significant challenges. We present an immersive augmented reality (AR) HUD concept to support future vehicle design, following a human-centric design approach. The limited field of view of contemporary optical solutions can be overcome by tiling multiple display elements to project images over the entire windscreen and so create an immersive experience. The design takes important human factors into account and improves the operator’s driving experience. Furthermore, the images are “distributed,” meaning that the physical interface generates them via multiple optical apertures/image sources and presents them according to HVS requirements. These configurations are tested in a laboratory environment with a replica of a real car interior and in a prototype vehicle fitted with distributed multiple HUD units. The proposed concept of an immersive human–machine interface (HMI) can be further extended in various forms to other parts of the vehicle interior, including surfaces and free space, which we envisage will take place in future car designs.
Journal ISSN: 1556-6080