
WEBC02 05/30/2017 17:25:15 Page 36

cheating until outsiders finally caught on. Desperate maneuvers to hide the truth and delay the inevitable made the day of reckoning more catastrophic.


COPING WITH AMBIGUITY AND COMPLEXITY

Organizations try to cope with a complicated and uncertain world by making it simpler. One approach to simplification is to develop better systems and technology to collect and process data. Another is to break complex issues into smaller chunks and assign slices to specialized individuals or units. Still another approach is to hire or develop professionals with sophisticated expertise in handling thorny problems. These and other methods are helpful but not always sufficient. Despite the best efforts, as we have seen, surprising—and sometimes appalling—events still happen. We need better ways to anticipate problems and wrestle with them once they arrive.

Making Sense of What’s Going On

Some events are so clear and unambiguous that it is easy for people to agree on what is going on. Determining whether a train is on schedule, a plane landed safely, or a clock is keeping accurate time is fairly straightforward. But most of the important issues confronting leaders are not so clear cut. Will a reorganization work? Was a meeting successful? Why did a consensual decision backfire? In trying to make sense of complicated and ambiguous situations, humans are often in over their heads, their brains too taxed to decode all the complexity around them. At best managers can hope to achieve “bounded rationality,” which Foss and Webber (2016) describe in terms of three dimensions:

1. Processing capacity: Limits of time, memory, attention, and computing speed mean that the brain can only process a fraction of the information that might be relevant in a given situation.

2. Cognitive economizing: Cognitive limits force human decision makers to use cognitive short-cuts—rules of thumb, mental models, or frames—in order to cut complexity and messiness down to manageable size.

3. Cognitive biases: Humans tend to interpret incoming information to confirm their existing beliefs, expectations, and values. They often welcome confirming information while ignoring or rejecting disconfirming signals (Foss and Webber, 2016).

Benson (2016) frames cognitive biases in terms of four broad tendencies that create a self-reinforcing cycle (see Exhibit 2.3). To cope with information overload, we filter out

36 Reframing Organizations



most data and see only what seems important and consistent with our current mind-set. That gives us an incomplete picture, but we fill in the gaps and make everything fit with our current beliefs. Then, in order to act quickly instead of getting lost in thought, we favor the easy and obvious over the complex or difficult. We then code our experience into memory by discarding specifics and retaining generalities or by using a few specifics to represent a larger whole. This reinforces our current mental models, which then shape how we process experience in the future.

To a greater or lesser degree, we all use these cognitive short-cuts. In the early days of his presidency, Donald Trump’s tweet storms and off-the-cuff communications provided prominent examples. In March, 2017, he tweeted that his predecessor, Barack Obama, was a “bad (or sick) guy” for tapping Trump’s phones prior to the election. Trump apparently based this claim on an article from the right-wing website Breitbart. Since the charge aligned with Trump’s world view, he figured it must be true and continued to insist he was right even after investigators concluded it never happened.

Exhibit 2.3. Cognitive Biases.

Challenge: Too much data to process
Solution: Filter out everything except what we see as important and consistent with our current beliefs
Risk: Miss things that are important or could help us learn

Challenge: Tough to make sense of a confusing, ambiguous world
Solution: Fill in gaps; make things fit with our existing stories and mental models
Risk: Create and perpetuate false beliefs and narratives

Challenge: Need to act quickly
Solution: Jump to conclusions—favor the simple and obvious over the messy and complex
Risk: Quick decisions and actions lead to mistakes and get us in trouble

Challenge: Memory overload
Solution: Discard specifics to form generalities, or use a few specifics to represent the whole
Risk: Error and bias in memory reinforce current mind-sets and biases in information processing

Source: Adapted from Benson, 2016.

Simple Ideas, Complex Organizations 37


Decisions, whether snap judgments or careful calculations, work only if we have adequately sized up the situation. As one highly placed female executive reported to us, “I thought I’d covered all the bases, but then I suddenly realized that the rest of my team were playing football.”

Managers regularly face an unending barrage of puzzles or “messes.” To act without creating more trouble, they must first grasp an accurate picture of what is happening. Then they must move to a deeper level, asking, “What is really going on here?” When this step is omitted, managers too often form superficial analyses and pounce on the solutions nearest at hand or most in vogue. Market share declining? Try strategic planning. Customer complaints? Put in a quality program. Profits down? Time to reengineer or downsize.

A better alternative is to think, to probe more deeply into what is really going on, and to develop an accurate diagnosis. The process is more intuitive than analytic: “[It] is in fact a cognitive process, faster than we recognize and far different from the step-by-step thinking we rely on so willingly. We think conscious thought is somehow better, when in fact, intuition is soaring flight compared to the plodding of logic” (DeBecker, 1997, p. 28).

The ability to size up a situation quickly is at the heart of leadership. Admiral Carlisle Trost, former Chief of Naval Operations, once remarked, “The first responsibility of a leader is to figure out what is going on . . . That is never easy to do because situations are rarely black or white, they are a pale shade of gray . . . they are seldom neatly packaged.”

It all adds up to a simple truth that is easy to overlook. The world we perceive is, for the most part, the image we construct in our minds. Ellen Langer, the author of Mindfulness (1989), captures this viewpoint succinctly: “What we have learned to look for in situations determines mostly what we see” (Langer, 2009, p. 33). The ideas or theories we hold determine whether a given situation is foggy or clear, mildly interesting or momentous, a paralyzing disaster, or a genuine learning experience. Personal theories are essential because of a basic fact about human perception: in any situation, there is simply too much happening for us to attend to everything. To help us understand what is going on and what to do next, well-grounded, deeply ingrained personal theories offer two advantages: they tell us what is important and what is safe to ignore, and they group scattered bits of information into manageable patterns. Mental models shape reality.

Research in neuroscience has called into question the old adage, “Seeing is believing.” It has been challenged by its converse: “Believing is seeing.” The brain constructs its own images of reality and then projects them onto the external world (Eagleman, 2011). “Mental models are deeply held internal images of how the world works, images that limit us to familiar ways of thinking and acting. Very often, we are not consciously aware of our mental models or the effects they have on our behavior” (Senge, 1990, p. 8). Reality is therefore what



each of us believes it to be. Shermer (2012) tells us that “beliefs come first, explanations for beliefs follow.” Once we form beliefs, we search for ways to explain and defend them. Today’s experience becomes tomorrow’s fortified theology.

In November, 2014, two police officers in Cleveland received a radio report of a “black male sitting on a swing pulling a gun out of his pants and pointing it at people” in a city park (Holloway, 2015). Arriving at the site, one officer spotted the suspect and saw him reach for his gun. The officer immediately shot and killed the suspect. The officer might have responded differently if the radio report had included two additional details: the caller who made the initial report had said that the suspect might be a juvenile, and the gun was probably fake. The gun was a toy replica of a Colt semiautomatic pistol. The victim, Tamir Rice, was 12 years old but, at 195 pounds, might have looked like an adult at a quick glance.

Perception and judgment involve matching situational cues with previously learned mental models. In this case, the perceptual data were hard to read, and expectations were prejudiced by a key missing clue—the radio operator had never mentioned the possibility of a child with a toy. The officer was expecting a dangerous gunman, and that is what he saw.

Impact of Mental Models

Changing old patterns and mind-sets is difficult. It is also risky; it can lead to analysis paralysis, confusion, and erosion of confidence. This dilemma persists even if we see no flaws in our current thinking, because our theories are often self-sealing: they block us from recognizing our errors. Extensive research documents the many ways in which individuals spin reality to protect existing beliefs (see, for example, Garland, 1990; Kühberger, 1995; Staw and Hoang, 1995). In one corporate disaster after another, executives insist that they were not responsible but were the unfortunate victims of circumstances.

Extensive research on the “framing effect” (Kahneman and Tversky, 1979) shows how powerful subtle cues can be. Relatively modest changes in how a problem or decision is framed can have a dramatic impact on how people respond (Shu and Adams, 1995; Gigerenzer, Hoffrage, and Kleinbölting, 1991). One study found that doctors responded more favorably to a treatment with “a one-month survival rate of 90 percent” than to one with “a 10 percent mortality rate in the first month,” even though the two are statistically identical (Kahneman, 2011).

Many of us sometimes recognize that our mental models or maps influence how we interpret the world. It is less widely understood that what we expect often determines what we get. Rosenthal and Jacobson (1968) studied schoolteachers who were told that certain students in their classes were “spurters”—students who were “about to bloom.” The so-called spurters, who had been randomly selected, achieved above-average gains on

