Thursday, December 17, 2015

Are you Afraid to Speak up?

During the Ohio State-Michigan football game, a flag was thrown for what the referee indicated was Michigan offside. Jim Harbaugh, Michigan's coach, complained to the referees that the Ohio State center had moved the ball, drawing Michigan offside. If true, the penalty should have been illegal procedure against Ohio State. The refs conferred and agreed with Michigan's coach.


This would be unremarkable except that I reviewed the play, in slow motion (thank you DVR), and did not see the center move. To me, it was clearly Michigan offside. Since the center didn’t move, it’s hard to believe that the referees saw the center move. Nevertheless, they sided with the Michigan coach.


Fortunately (I live in Ohio), the penalty was insignificant in determining the game's outcome. But what if Harbaugh's successful complaint had changed the result? It's not far-fetched, and according to Sunstein and Hastie, authors of Wiser: Getting Beyond Groupthink to Make Groups Smarter, it happens all too often.


What causes people to hide information and opinions? Research suggests that informational signals and social pressure influence what people say in a group. Informational signals are the observations or opinions expressed by other people within a group (unless the source has previously been viewed as unreliable). The phenomenon is so strong that most people will ignore their own observations and adopt those of prior public statements. Even worse, there tends to be a cascade effect in which more and more people support the expressed opinion, even if they privately hold an opposing point of view.


In an organization, altering one's opinion to go along with that of others can be disastrous. Letting a group go down the wrong path to avoid potential conflict can lead to bad investments, unrealistic performance targets, and commitments that are not achievable. In healthcare, it can lead to inefficiency, unnecessary tests and treatments, and avoidable costs.


The problem occurs in management activities more often than we'd like to acknowledge. Have you ever sat in a meeting to review performance data and observed a stalled project that continues to underperform month after month? The data show that no progress has been made on a performance improvement project for four months. Each month the project team offers an explanation (usually without supporting data) suggesting that efforts have been underway but unforeseen problems have prevented objective evidence of improvement. The reviewing group accepts the story and moves on to the next item on the agenda.


When I investigate a problem, I review the meeting minutes and then interview participants to learn more about what occurred during the meeting (minutes tend to be very terse). When I ask participants about the repeated reports of limited progress, they admit that they had concerns but didn't speak up. When I ask why they didn't speak up, they cannot provide an explanation. Unconsciously, they relied on prior information or worried about their relationships with others. This led to decisions that permitted limited progress to continue indefinitely. The project team's explanation for the undesirable data month after month set the pattern for the meeting, and everyone else, to avoid conflict, accepted the story.


But some individuals who reviewed the data outside the group meeting, even with the project team's written explanation included, would surely have become concerned about the project's progress. Someone must have considered the two critical questions: “do they really understand the problems?” and “can't something be done to get this project on track?” Yet when the group met to discuss the data, these same individuals kept quiet and allowed the group to decide to support the current project direction.


Such behavior is incompatible with high reliability.


One characteristic of high reliability organizations (HROs) is “sensitivity to operations,” which requires observing and acting on data that conflict with expectations at an early stage. Sensitivity to operations would be useless if members of the group altered their observations based on the expressed opinions or observations of others. In an HRO, information must be shared, validated, and digested before the group decides what to do with it. If individuals are intimidated or self-censor, withholding information or failing to raise conflicting points of view, the group is blind to important information that might lead to a better solution.


To be sensitive to operations, aspiring HROs must foster an open communication environment that permits uncensored sharing and analysis of information, yielding a broader and more diverse perspective on the issues at hand.


In healthcare, this resembles the patient safety culture movement, which encourages people to speak up about concerns, though its emphasis is on bedside care. A healthcare HRO embraces patient safety and encourages speaking up to protect patients, but it also recognizes that failing to speak up influences how groups decide and the quality of the decisions they reach. In healthcare organizations, this behavior affects management meetings, project teams, quality committees, and even board meetings.

To cultivate high reliability, healthcare organizations will need to evaluate how well they tolerate conflict, because getting people to raise differing opinions requires cultivating that tolerance. A good place to start is by looking at how review meetings address lagging performance. Natural behavior can inadvertently lead to ineffective group processes that undermine decision-making. Because these processes are natural, leaders who fail to tend to them will inevitably get disappointing performance.
