A standard approach to safety engineering is to try to define all of the potential risks in advance and to design protocols that, if followed precisely, will avoid all of the known hazards. Such safety-by-protocol is great in principle, but it has a critical failing: the illusion of knowledge. The approach assumes that we can know and anticipate all of the potential risks.
Here’s one example of why that approach doesn’t work (I’m hoping it was a faked scene for a comedy program, but I’m not sure). Watch it all the way until the end:
Moreover, it makes no sense to build safety protocols to address all of the really remote and rare cases — we’re not great at maintaining vigilance for things that almost never happen, and doing so would divert attention from what happens all the time. We want doctors to look for the most common diagnoses rather than the one-in-a-million ones, we want safety engineers to stave off the most likely problems, and we want security personnel to look for the most frequent risks. That means we might miss some of the rare cases, but we’re going to miss those anyway…
People are great pattern detectors, and we’re built to notice what happens most often, not what happens rarely. Even when we are actively looking for rare events, we often miss them (see work by Wolfe and colleagues). Signs admonishing us to watch for motorcycles might be useful for the few moments after we see them, but our expectations quickly reset to what we typically see: cars. For the same reason, if I warned you to watch for gorillas, and hours later showed you a video of people passing basketballs, the warning would be unlikely to increase your chances of spotting the unexpected gorilla.
I frequently see blog posts, columns, and advertisements by consultants who use the gorilla video as a way to promote their wares, promising that their workshop, training, or presentation will help you spot all the opportunities/risks you’re missing — the metaphorical gorillas in your midst. Be wary. No form of training can magically let you notice everything. If you’re devoting all your resources to spotting rare, unexpected events, you’re going to do less well in dealing with all of the problems you face almost daily. Moreover, the very nature of rare events means that you’ll miss some of them — you can’t possibly conceive of all of the one-in-a-million possibilities in advance. Taleb’s black swans are inherently unpredictable.
There are good and bad ways to deal with these limits on our ability to notice the unexpected. The bad way is to try to build each rare event we experience into our safety protocols so that we can spot it the next time it occurs. Yes, you can be prepared to notice the gorilla the next time you try to count people passing basketballs, but you might miss something else as a result:
Anticipating rare events that have already happened is why we now have to take off our shoes at airport security — a rare event (the shoe bomber) that couldn’t have been anticipated in advance led to a silly protocol designed to avoid that same risk again. It gives the appearance of safety while ignoring the fact that the next threat is likely to be equally unexpected (fortunately, the next rare event, the underwear bomber, didn’t lead to the same sort of protocol change…).
Even if there are no easy ways to anticipate all the unexpected events, knowing your limits allows you to take steps to increase the odds that you will notice some of them. You’re more likely to notice the gorilla in the basketball game if you’re not focusing attention on counting passes, presumably because more of your attention is available to pick up other aspects of the scene. Similarly, a passenger in a car should be more likely to spot unexpected events on the road because they aren’t engaged in driving (that’s why you should never complain when a back-seat driver tells you to watch out). When driving, you can turn your phone off and put it in the back seat so you won’t be tempted to use up valuable resources that might help you spot the child running into the street. If you are manning a security checkpoint, it’s a good idea to have someone whose only task is to watch the scene for anything out of the ordinary. If you’re designing a product or protocol, don’t assume that you can anticipate all possible risks. Instead, assume that you can’t, and make sure people are as aware of their limits as possible. That won’t let you anticipate everything, but knowing that you can’t anticipate everything at least gives you the chance to maximize your odds of noticing what matters.
Wolfe, J. M., Horowitz, T. S., & Kenner, N. M. (2005). Rare items often missed in visual searches. Nature, 435(7041), 439-440. PMID: 15917795

Simons, D. J. (2010). Monkeying around with the gorillas in our midst: Familiarity with an inattentional-blindness task does not improve the detection of unexpected events. i-Perception, 1(1), 3-6. DOI: 10.1068/i0386

Simons, D. J., & Chabris, C. F. (1999). Gorillas in our midst: Sustained inattentional blindness for dynamic events. Perception, 28(9), 1059-1074. DOI: 10.1068/p2952