Tony Borroz at Wired wrote a description of a new GM design that effectively turns the windshield of your car into a monitor. When combined with night vision cameras and various object recognition algorithms, this system could effectively highlight the edges of the road at night or in foggy conditions.
This form of augmented reality can be really helpful, but it can also introduce problems. Most people intuitively understand that they can’t look away from where they’re driving for more than a couple of seconds without putting themselves at risk. You can glance down at your radio or your speedometer, but you need to look back at the road quickly. That intuition is a good one, and it motivates the idea of the head-up display: project information onto the windshield so that you don’t have to look away from where you’re going to check your speed, change your radio station, and so on.
GM’s approach will make it easier for drivers to stay on the road under less-than-ideal conditions, and that’s a great thing. Research on head-up displays in flight simulators shows that providing a virtual indication of where pilots should go—a simulated “highway in the sky”—allows for more precise navigation. It effectively augments reality in a way that makes guidance more accurate.
The problem, though, is that enhancing your ability to keep your eyes facing forward and to stay on the road takes attention away from another aspect of driving (or flying): your ability to detect unexpected events (Wickens & Alexander, 2009). Looking isn’t the same as seeing, and if your attention is focused on your guidance displays, it’s not focused on the outside world.
In a remarkable demonstration of just how much you can miss, NASA researcher Richard Haines had commercial airline pilots with thousands of hours of flying experience use a high-quality flight simulator with a head-up display. During one sequence, they came in for a landing under somewhat foggy conditions. They broke through the cloud ceiling and spotted the runway, landing the plane as they usually would. Two of them never saw the other plane that was sitting on their runway and landed right through it. The jet on the runway filled much of their cockpit display.
The pilots suffered from inattentional blindness—unexpected events don’t grab attention, even if you’re looking right at them. And inattentional blindness is counterintuitive: most people assume that looking guarantees seeing. GM’s approach to augmenting reality will help you stay on the road, but don’t assume that enhancing your ability to stay on the road will also help you detect unexpected objects. If you devote all of your attention to the augmented roadway navigation aids, your “situation awareness” is reduced. That narrowing of attention helps explain how a driver could “blindly” follow the friendly but flawed directions of their GPS onto a pedestrian walkway and into a cherry tree.
Sources:
Wickens, C. D., & Alexander, A. L. (2009). Attentional tunneling and task management in synthetic vision displays. International Journal of Aviation Psychology, 19, 182–199.
Haines, R. F. (1989). A breakdown in simultaneous information processing. In G. Obrecht & L. W. Stark (Eds.), Presbyopia Research: From Molecular Biology to Visual Adaptation. New York: Plenum Press.
Very interesting article. I wonder if the inattentional blindness occurs because the augmented reality looks too different from the scene it’s augmenting. Instead of emphasising aspects of the field of view, it overlays a new image on top, which is what you focus on because it’s so much clearer.
Maybe reality should be augmented in such a way as to show a more natural looking scene that blends with reality?