Canada Kicks Ass
Being powerful like brain damage: Why systems fail

BeaverFever @ Sun Apr 01, 2018 8:34 pm

How having power is like having brain damage: a new book explains why systems fail

By Chris Clearfield and Andras Tilcsik
Excerpted from Meltdown

Sun., April 1, 2018

Modern-day failures — from the catastrophic, like a train crash, to the more amusing, like the mix-up of a Best Picture Oscar ballot — have much to do with our increasingly complex systems. In their new book Meltdown, Chris Clearfield and Andras Tilcsik examine what these types of disasters have in common and what we can do to avoid them in the future.

It’s difficult to be a dissenter. We often feel the need to go along with what others in our group think, and neuroscience shows that this desire for conformity isn’t just the result of peer pressure. It is wired into our brains.

In one experiment, scientists used functional magnetic resonance imaging (fMRI) to see how our brains react when we hold an opinion that deviates from our group’s consensus. It turns out that two things happen when we go against the grain. First, a brain region involved in error detection becomes very active. The nervous system notices a mistake and triggers an error message. It’s as though your brain is saying: Hey, you’re doing something wrong! You need to make changes! At the same time, an area of the brain that anticipates rewards slows down. Your brain says: Don’t expect that you’ll be rewarded! This won’t work out well for you!

“We show that a deviation from the group opinion is regarded by the brain as a punishment,” said the study’s lead author, Vasily Klucharev. And the error message combined with a dampened reward signal produces a brain impulse indicating that we should adjust our opinion to match the consensus. Interestingly, this process occurs even if there is no reason for us to expect any punishment from the group. As Klucharev put it, “This is likely an automatic process in which people form their own opinion, hear the group view, and then quickly shift their opinion to make it more compliant with the group view.”

This played out in a fascinating study led by Emory University neuroscientist Greg Berns. Participants looked at pairs of three-dimensional objects, each shown from a different angle, and had to tell if the objects in each pair were identical or different. Each participant was put in a group of five volunteers, but the other four people in the group were actually actors who worked for the researchers. Sometimes the actors answered correctly, but sometimes they all gave the wrong answer. Then it was the real participant’s turn to answer, and the researchers used a brain scanner to capture the moment.

Andras Tilcsik, co-author of Meltdown: Why Our Systems Fail and What We Can Do About It. (David Chang Photography)
People went along with the group on the wrong answer more than 40 per cent of the time. That’s not too surprising — many experiments demonstrate people’s willingness to conform. What is interesting is what the brain scanner showed. When people agreed with their peers’ incorrect answers, there was little change in activity in the areas associated with conscious decision-making. Instead, the regions devoted to vision and spatial perception lit up. It’s not that people were consciously lying to fit in. It seems that the prevailing opinion actually changed their perceptions. If everyone else said the two objects were different, a participant might have started to notice differences even if the objects were identical. Our tendency to conform can literally change what we see.

And when people went against the group, there was a surge in activity in brain regions involved in the processing of emotionally charged events. This was the emotional cost of standing up for one’s beliefs; the researchers called it “the pain of independence.”

When we shift our opinions to conform, we’re not lying. We may not even be conscious that we’re giving in to others. What’s happening is something much deeper, something unconscious and uncalculated: our brain lets us avoid the pain of standing alone.

These results are alarming because dissent is a precious commodity in modern organizations. In a complex, tightly coupled system, it’s easy for people to miss important threats, and even seemingly small mistakes can have huge consequences. So speaking up when we notice a problem can make a big difference.

But dissent makes no difference if no one listens. And listening to a dissenting voice can be as hard as speaking up.

It turns out that the effect of being challenged — of having your opinions rejected or questioned — isn’t just psychological. Research shows that there is a real, physical impact on the body. Your heart beats faster and your blood pressure rises. Your blood vessels narrow as if to limit the bleeding that might result from an injury in an impending fight. Your skin turns pale, and your stress level skyrockets. It’s the same reaction you would have if you were walking in the jungle and suddenly spotted a tiger.

This primal fight-or-flight response makes it hard to listen. And, according to an experiment conducted at the University of Wisconsin–Madison, things get even worse when we are in a position of authority.

In the study, three strangers sat around a table in a lab and discussed a long list of issues, like an alcohol ban on campus or the need for mandatory graduation exams. It got boring quickly. Luckily, after 30 minutes, a research assistant came in with a plate of chocolate chip cookies, a nice relief from the task. What participants didn’t know was that the plate of cookies was part of the experiment. In fact, it was the most crucial part.

Half an hour earlier, just before the session started, the researchers had randomly picked one of the three strangers and told the group that that person would serve as an evaluator of sorts. The role came with no real power; it just involved assigning “experimental points” to the other two people in the group based on their contributions to the discussion. These points had no substantive meaning at all. They didn’t affect participants’ compensation or their chances of being invited back for future studies. And because the study results were anonymous, no one outside the lab would even know how many points anybody received.

It was a fleeting, trivial sense of power. The evaluators knew they had been chosen by sheer luck and not because of their skills or experience. They knew their evaluations carried no real consequences.

Yet, when the plate of cookies arrived, they behaved very differently from the others. There weren’t enough cookies on the plate for everyone to have seconds, and the evaluators were more likely than their peers to take an extra cookie. It took just a little taste of being the boss to make people feel entitled to a scarce resource.

“Everybody takes one cookie,” explained Dacher Keltner, one of the researchers who ran the study. But who takes the first extra cookie? “It’s our person in the position of power who reaches out, grabs the cookie, and says, ‘That’s mine.’ ”

Chris Clearfield, co-author of Meltdown: Why Our Systems Fail and What We Can Do About It. (E. Leitzell)
When the researchers later watched the videotapes of the study, they were struck not only by how much the evaluators ate but also by how they ate. The evaluators displayed signs of “disinhibited eating,” psychology-speak for eating like an animal. They were much more likely than the other participants to chew with their mouths open, and they scattered more crumbs, both on their faces and on the table.

The cookie study is a simple experiment, but it has big implications. It suggests that even the faintest sense of power — being in charge of something clearly inconsequential — can corrupt. And it’s just one of many studies drawing the same conclusion. Research shows that when people are in a position of power, or even just have a sense of power, they are more likely to misunderstand and dismiss others’ opinions, more likely to interrupt others and speak out of turn during discussions, and less willing to accept advice — even from experts.

In fact, having power is a bit like having brain damage. As Keltner put it, “people with power tend to behave like patients who have damaged their brain’s orbitofrontal lobes,” a condition that can cause insensitive and overly impulsive behaviour.

When we are in charge, we ignore the perspectives of others. This is a dangerous tendency because more authority does not necessarily equal better insights. A complex system might reveal clues that a failure looms, but those warning signs don’t respect hierarchy. They often reveal themselves to folks on the ground rather than to higher-ups in the corner office.

Robert, a large, muscular man in his 60s, arrived for a routine checkup at his dentist’s office in downtown Toronto. Robert had always preferred an 8 a.m. appointment and was never late. And he always looked healthy and full of energy when he walked into the waiting room and greeted Donna, the office’s longtime receptionist.

But when Donna saw him that morning, something didn’t feel right to her. His face was red, and he was sweating. She sat him down and asked if he was OK. “Yeah, I’m fine,” he told her. “I just didn’t sleep well. I had indigestion. And my back hurts a little.” He had looked up his symptoms online, but he didn’t want to bother his doctor.

It sounded innocent enough, but Donna had a strange feeling that something was amiss. Though the dentist, Dr. Richard Speers, was in the middle of performing a procedure on another patient, she went in to see him. “Dick, Robert is here, and something just doesn’t feel right to me. Can you come out and take a look at him?”

“I’m really busy right now,” Speers replied.

“I really think you should see him,” Donna insisted. “Something isn’t right.”

“But I’m in the middle of this,” said Speers.

“Dick, I want you to see him.”

Speers gave in. He had always trained his staff — his dental assistant, his hygienist, and even his receptionist — to speak up when something didn’t feel right. He thought they might catch something he would miss.

He took off his gloves and went to the waiting room. He asked Robert a few questions: Had he taken any Tums for his indigestion? Did it help? Was there any pain in his left arm? Any discomfort between his shoulder blades? Robert had taken Tums, but that didn’t help. He did feel some pain in his left arm, around his wrist. And, yes, he had upper back pain.

“Is there a history of heart disease in your family?” Speers asked.

“Yes, my father and brother both died of a heart attack,” Robert replied.

“How old were they?”

“They were both my age.”

Without delay, Speers sent him to the cardiac centre of Toronto General Hospital, just down the street. Robert was 18 hours into a heart attack. Triple bypass surgery saved his life.

In his free time, Dr. Speers is a pilot and aviation enthusiast, and he’s been on a mission to teach dentists safety lessons from the airline industry. The biggest lesson he learned from pilots was to get people lower down in the hierarchy to speak up and to get higher-ups to listen.

Since the 1970s, a series of fatal accidents has forced changes in the airline industry. In the bad old days, the captain was the infallible king of the cockpit, not to be challenged by anyone. First officers usually kept their concerns to themselves, and even when they did speak, they would only hint at problems. Organizational researcher Karl Weick described their attitude like this: “I am puzzled by what is going on, but I assume that no one else is, especially because they have more experience, more seniority, higher rank.”

But as the industry grew, aircraft, air traffic control and airport operations became too complex for this approach to work. The captain was king, but the king was often wrong. There were too many moving parts and they were too intricately connected for one person to notice and understand everything.

Captains and first officers usually alternate flying the airplane. The flying pilot manipulates the primary controls. The non-flying pilot talks on the radio, runs through checklists, and is expected to challenge the flying pilot’s mistakes. About half of the time, the captain is the flying pilot, and the first officer is the non-flying pilot. In the other half, the roles are switched. So, if who was at the controls made no difference to safety, roughly 50 per cent of accidents should happen when the captain is flying the plane, and 50 per cent when the first officer is. Right?

In 1994, the U.S. National Transportation Safety Board (NTSB) published a study of accidents caused by flight crew mistakes between 1978 and 1990. The study reported a staggering finding. Nearly three-quarters of major accidents occurred during the captain’s turn to fly. Passengers were safer when the less experienced pilot was flying the plane.

Of course, it’s not that captains were poor pilots. But when the captain was the flying pilot, he (and most often it was a “he”) was harder to challenge. His mistakes went unchecked. In fact, the report found that the most common error during major accidents was the failure of first officers to question the captain’s poor decisions. In the reverse situation, when the first officer was flying the plane, the system worked well. The captain raised concerns, pointed out mistakes, and helped the flying pilot understand complex situations. But this dynamic worked only in one direction.

All this changed with a training program known as Crew Resource Management, or CRM. The program revolutionized the culture not just of the cockpit but also of the whole industry. It reframed safety as a team issue and put all crew members — from the captain to the first officer to the cabin crew — on more equal footing. It was no longer disrespectful to question the decisions of a superior — it was required. CRM taught crew members the language of dissent.

Parts of CRM sound obvious, even outright silly. An important part of the training, for example, focuses on a five-step process that first officers can use to raise a concern:

1. Start by getting the captain’s attention. (“Hey, Mike.”)

2. Express your concern. (“I’m worried that the thunderstorm has moved over the airport.”)

3. State the problem as you see it. (“We might get some dangerous wind shear.”)

4. Propose a solution. (“Let’s hold until the storm is clear of the airport.”)

5. Get an explicit agreement. (“Does that sound good to you, Mike?”)

These steps sound barely more sophisticated than what we might teach a child about how to ask for help. Yet they were rarely followed before CRM came along. First officers would state a fact (“the thunderstorm has moved over the airport”) but would hesitate to get the captain’s attention and express how concerned they were, let alone propose a solution. So even when they tried to express a grave concern, it often sounded more like a casual observation.

CRM was a huge success. Since it took hold in U.S. commercial aviation, the overall rate of accidents involving flight crew mistakes has declined sharply. And whether the flying pilot is the captain or the first officer no longer matters. In the 1990s, just half of the accidents — rather than three-quarters — happened when it was the captain’s turn to fly.

The program works because it gives everyone, from baggage handlers to pilots, a sense of purpose. The message is that every single person can make an important contribution to safety, and everyone’s views are important. And, as Daniel Pink explains in his book Drive, this approach — giving people a sense of purpose and autonomy — is often the most effective way to motivate them.

The ideas behind CRM have also spread to other fields struggling with increasingly complex operations, like firefighting and medicine. In a 2014 article in the Journal of the Canadian Dental Association, Dr. Speers and his co-author, dentistry professor Chris McCulloch, described what Crew Resource Management would look like in the dental office.

“Dentists need to minimize hierarchy in their operatories by creating an atmosphere in which all personnel feel comfortable speaking up when they suspect a problem,” they wrote. “A team member may see something the dentist is oblivious to, such as undetected caries [cavities] or a tooth that is about to receive inappropriate treatment. Dental team members should be encouraged to cross-check each other’s actions, offer assistance when needed, and address errors in a non-judgmental fashion.”

But learning to embrace dissent is hard. When Crew Resource Management was introduced, many pilots thought it was useless psychobabble. They called it “charm school” and felt it was an absurd attempt to teach them how to be warm and fuzzy. But as more and more accident investigations revealed how failures to speak up and listen led to disasters, attitudes began to shift. Charm school for pilots has become one of the most powerful safety interventions ever designed.

Excerpted from Meltdown: Why Our Systems Fail and What We Can Do About It by Chris Clearfield and András Tilcsik. Copyright © 2018 by Chris Clearfield and András Tilcsik. Published by Allen Lane Canada, a division of Penguin Random House Canada Limited. Reproduced by arrangement with the Publisher. All rights reserved.


https://www.thestar.com/news/insight/20 ... -fail.html
