BP Oil Spill: Explanation from BP Executive

Shortly after the Deepwater Horizon blowout set off last summer’s infamous oil spill, one BP executive responded to a technical explanation with the following e-mail:

“???????????????????????????????????????????????????????????????????????????????
????????????????????????????????????????????????????????????????????????????????
????????????????????????????????????????????????????????????????????????????????
????????????????????????????????????????????????????????????????????????????????
????????????????????????”

It may have been one of the most intelligent things anyone at BP said in response to the emergency. The technical explanation in question was a nonsense description of a “bladder effect” that does not exist and had not contributed to the anomalous test results found earlier. If that bogus explanation had been passed along, at the time of the test, to people who would have recognized it as nonsense, those results would likely have been examined more carefully and the disaster might have been averted. But this was neither the first nor the last time that poor judgment and decision making plagued the drilling site.

Psychologists tend to be somewhat forgiving of human error. Many kinds of mistakes are an inevitable, if occasional, byproduct of our mental strengths. We generally recommend system-level fixes: processes in which work is double-checked, so that these occasional lapses can be caught and compensated for. But the lapses around the Deepwater Horizon piled up in such profusion, and were caught so rarely, that they suggest real and unacceptable problems.

On the drilling platform, no one seemed to know who was in charge. BP had recently reorganized, separating engineering and operations leadership, and the recently released National Commission Report reveals ongoing conflicts over the division of responsibilities between the two sides. In addition, BP workers found themselves at odds with Transocean, the rig’s owner, from which BP leased the platform. And Halliburton contractors received criticism from BP supervisors, but little clear guidance.

This combination of poor communication and unclear authority is a classic case of diffusion of responsibility. When we find ourselves in ambiguous situations—ones where there might or might not be danger—we look to those in charge, or to the crowd around us, for clues about what to do. Unfortunately, if no one is in charge and no one else knows what’s going on, this doesn’t work. Everyone may look around, see that no one else is reacting, and decide that the problem isn’t real. So it was at BP. People who had concerns didn’t know whom to tell. No one who heard those concerns felt empowered to act on them.

The other major psychological cause of the disaster was that no one believed it could happen. We tend to estimate the likelihood that something will happen based on whether it has happened before—so rare events always catch us by surprise. Worse, BP employees and consortium members were, in effect, being paid not to believe. We are all vulnerable to wishful thinking. Without being deliberately deceptive or malicious, we’re biased to seek out evidence that life will turn out as we wish. If preparing now for a disaster that may come later costs us something today, we find ways to believe that such preparation is not necessary.

BP drilling policies were all aimed at efficiency and speed. Every regular task on the Deepwater Horizon was compared, each day, to its record time—and bonuses were handed out for breaking those records. The official risk register, intended to help identify possible problems with the well, covered only time and money costs. All these factors motivated workers to find ways around safety tests, to assume that someone else had taken care of them, or to convince themselves that problems did not exist.

It’s difficult—although not impossible—to question the assumptions one is being paid to make. A responsible organization, therefore, has to be very careful about which assumptions it pays for. Otherwise, the resulting costs can run far too high.
