Illusions occur because we unconsciously believe our assumptions, biases and intuitions are accurate. When this belief is left unchecked (and usually is), it leads to endless errors in thinking, reasoning, decisions and judgment.
To Err is Human
We’ve all made critical decisions that turned out to be disastrous.
I feel as though I have more experience than most in this area, so after making a string of bad decisions, I was determined to find out if decision-making was something we could control or if it was largely determined by the vagaries and unpredictability of life.
Almost all bad decisions result from confusing “feeling” right with “being” right
We tend to think that bad decisions are the province of impulsivity, inexperience and youth. Not so!
No one is immune from making bad decisions, and experience, I.Q., and education offer no protection.
In fact, the daily news is a running report of bad decisions made by the rich, informed, and powerful just as much as the rest of us.
After years of research, I’ve found that we can make critical decisions with absolute certainty. We can know if our decision is good or bad when we make it; we don’t have to wait for the outcome.
Amos Tversky and Daniel Kahneman are arguably two of the most influential people of our time because they virtually revolutionized how we think … about how we think.
We’re often in error, but rarely in doubt ~ Daniel Kahneman
In 2011, Nobel Laureate Daniel Kahneman published the international bestseller, Thinking, Fast and Slow. In this fascinating book, he shares several personal stories about his experiences with judgments and decisions. The story below perfectly illustrates the challenges we create whenever we try to navigate our world solely through the lens of our perception and interpretation.
The Illusion of Validity … a story
Many decades ago, Kahneman served in the Psychology Branch of the Israel Defense Forces. On occasion, he was tasked with evaluating candidates for officer training. One of the tests, termed “The Leaderless Group Challenge,” was conducted on an obstacle field. Eight soldiers — strangers to each other and with all insignia removed — were instructed to lift a long log from the ground and haul it to a wall about six feet high. Their challenge was getting everyone to the other side of the wall without any participants or the log touching the wall. If that happened, they had to start all over again.
There were several ways to accomplish the task — the most common was to have four soldiers hold the log at an angle, like a giant fishing rod, while the other soldiers, one at a time, would shimmy up the log as it hung over the wall, and then, if all went well, drop to the other side.
When four soldiers were successfully on the other side, the log was tilted over the wall so that the remaining soldiers could jump up to the overhanging log and shimmy down the log to the other side … thus completing the task.
Kahneman and his colleagues observed the men in action. They noted who would attempt to lead but was rebuffed, who cooperated, who was stubborn, arrogant, patient, persistent and so on. In other words, they saw the whole gamut of human traits, emotions and attitudes, and because of this, they were pretty confident that each man’s true nature had been revealed.
Armed with these observations and the first-hand revelation of soldiers in action, Kahneman and his colleagues decided who they felt were the best candidates for officer training. Whenever they agreed that a particular soldier was an obvious choice, they became convinced of their assessment beyond all reasonable doubt. The apparent leader was the soldier who took over when the group was in trouble and led the team over the wall. What else do you “wanna” know?
There was absolutely no doubt that he would be equally effective as an officer in battle.
Here’s where it gets interesting
Every few months, Kahneman and his colleagues were given feedback on their assessments from the commanders of the officer training school. Much to their surprise and embarrassment, the soldiers they had so confidently recommended as candidates had proved to be no better than hit-and-miss. According to the officers of the training school, their assessments, opinions, and forecasts were only slightly better than a blind guess!
Here’s the crux of the story
Whenever Kahneman and the other evaluators received the dismal results of their assessments, they were naturally discouraged … but that only lasted as long as it took to bring a new batch of candidates to the wall. As soon as the new group was under observation, Kahneman and his colleagues were once again “convinced” that, this time, their observations and recommendations of individual soldiers clearly revealed the best candidates.
Blind to Evidence
As Kahneman noted after many years of retrospection, the global evidence of their inability to predict the best soldiers for officer training failed to weaken their confidence. Oh, sure, they acknowledged that their track record was less than perfect overall, but that uncertainty was fleeting. Just as soon as they returned to the wall and watched the soldiers in action, they again became convinced that this time, their evaluations of “this” or “that” candidate were spot on … especially when everyone arrived at the same conclusion.
As Kahneman reflected, “this was a clear instance of the representativeness heuristic in action.” They observed one hour of a soldier’s behaviour — in an artificial situation — and because of that, they felt they knew he would not only rise to the challenges of officer training but also be a bona fide leader in the heat of battle.
A physical analogy
Look at the picture below for a “physical” analogy of what happened.
Which one of these three trucks is the largest?
The obvious and intuitive answer would be to say the one on the left, but if you take a ruler and measure these three trucks, you’ll see they’re all the same size.
A powerful visual illusion dominates your impression of the size of these trucks. Your perceptual system — of its own volition, with no “conscious” say from you in the matter — has automatically substituted a 3-dimensional interpretation for a 2-dimensional image, creating the illusion of different sizes. Now here’s the thing: even though you know these trucks are all the same size, you still see them as different sizes.
In this instance, this “substitution” causes no harm, but as we’ve just seen in Kahneman’s case, we make cognitive substitutions all the time, only we don’t realize it.
The wall in Kahneman’s story is a perfect metaphor. The soldiers were able to scale the physical wall, but Kahneman and his colleagues were unable to climb the cognitive wall of their biases, assumptions, intuitions, perceptions and false beliefs.
This story perfectly exemplifies the challenge of sorting our biases, assumptions, and ancient survival reactions from objective reality, especially when making those dozen or so decisions that shape our lives.
“Certainty” is often illusory
Even when we “know” we’re mired in cognitive traps and delusions, unless we have a process to de-bias our thinking and judgments — we often forge ahead despite ourselves.
Scientific studies have repeatedly shown that we’re not rational creatures making rational decisions. We don’t base our beliefs on an impartial evaluation of “evidence” and information. Instead, we unconsciously wrestle the available information into submission.
We force-fit “facts” to how we think things should be, how we would like them to be … and that human tendency — IF LEFT UNCHECKED — will lead to a lifetime of decisional error.