The oil spouting from the bottom of the Gulf of Mexico is unquestionably the result of technological failure and potentially the cause of an environmental disaster greater than any the world has seen. Such an accident, even though an extremely rare event, is something that everyone - not least the folks at British Petroleum - wishes had never happened. But the risk of an accident is part of the price we pay for the conveniences of a sophisticated modern society that is highly dependent on science and engineering, endeavours that are incompletely understood and sometimes confused with each other.
We certainly owe a lot to science. It tells us - with the help of devices that are products of engineering - whether it will rain next week; when to watch out for a tornado; where it is best to drill for oil; and where the ocean currents might carry it when it spills. Geologists, employing advanced instruments and techniques, have identified the abundant fields in the Gulf, the North Sea and the Middle East. But knowing where the oil is likely located does not bring it to the surface. That is where engineering takes the lead. It is engineers who design the offshore platforms and drilling rigs and the robotic equipment that are used to probe for and pump oil from a mile or more beneath the water's surface.
THERE'S NO PERFECT SYSTEM
The surprising thing about such remote operations should not be that they sometimes fail, but that they fail so infrequently. There never will be a perfect system for oil recovery or any other technological endeavour. The equipment and procedures employed are the creations of human beings and, as such, are as fallible as their creators. Engineers know this, and that is why the Deepwater Horizon oil rig installation had multiple backup safety features. Unfortunately, in this case that was not enough. There were multiple failures. What happened in the Gulf of Mexico is an illustration of Murphy's Law, which states that anything that can go wrong with a technological system eventually will.
In the wake of the Gulf spill, it is likely that there will be calls for further redundancy in oil-well safety. That may or may not be an improvement, for more complicated devices can sometimes mean less reliability - there is more that can go wrong, especially with the interaction of their parts. With technology, more and bigger are not necessarily better.
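The point that more parts can mean less reliability can be made concrete with a toy calculation - a sketch only, assuming each part fails independently and that the parts are connected in series, so that any one failure brings the system down:

```python
# Toy illustration: in a purely series system, every added part is one
# more thing that can fail, so overall reliability falls as parts are added.
# Assumes each part works with probability 0.99, independently of the others.

def series_reliability(part_reliability: float, n_parts: int) -> float:
    """Probability that all n independent, series-connected parts work."""
    return part_reliability ** n_parts

for n in (1, 10, 100):
    print(n, round(series_reliability(0.99, n), 3))
# 1 part:   0.99
# 10 parts: 0.904
# 100 parts: 0.366
```

True redundancy - parallel backups rather than extra series links - can raise reliability, but only if the backups fail independently; interactions among the added parts can erase the gain, which is the hazard the paragraph above describes.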
In 1981, the Kansas City Hyatt Regency's elevated walkways collapsed and killed more than 100 partygoers because the single support rod of the original design was replaced with two. The double O-ring design of the booster rockets on the space shuttle Challenger was thought to be an improvement over earlier rocket designs, which were very reliable but had only a single O-ring. The change to a double-ring design was supposed to make a reliable system even more reliable. It was more difficult, however, to make sure that the double O-rings were seated properly, and that contributed to the gas leak that initiated the explosion that destroyed the Challenger. And recall that the reaction of British Petroleum engineers to their failed attempt to contain the oil leak with a 100-tonne concrete box was next to employ a smaller, two-tonne box - not a larger one.
Technological improvements can be counterintuitive, and full of surprises themselves. We tend to build upon successful experience, thinking that adding features will result in a better product, device or method. But videocassette recorders with too many bells and whistles became the butt of jokes about owners who could not figure out even how to set the time on them. Complicating anything with more parts or steps introduces more places and ways in which it can fail.
A space shuttle has millions of parts. Before the shuttle missions began, NASA managers predicted that they would be 99.999-per-cent reliable. Engineers, who were more familiar with the machines, predicted only a 99-per-cent success rate. After two dozen successful missions, the Challenger accident showed the real figure as of that time to be 96 per cent - 24 successes in 25 flights. No matter what the technology, even our best estimates of its success can be overly optimistic, not to mention coloured by our own perspective.
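The gap between those estimates is plain arithmetic - a sketch comparing the observed shuttle record (24 successes in 25 flights as of Challenger) with the two prior predictions:

```python
# Observed shuttle success rate at the time of the Challenger accident:
# 24 successful missions out of 25 flights.
successes, flights = 24, 25
observed = successes / flights
print(observed)  # 0.96

# The prior estimates, for comparison.
management_estimate = 0.99999  # one failure in 100,000 flights
engineer_estimate = 0.99       # one failure in 100 flights
print(management_estimate - observed)  # managers were off by ~4 percentage points
```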
As the oil leaked and spread in the Gulf, a volcano erupted in Iceland and the resulting ash clouds polluted the atmosphere and grounded airliners in Europe. This was a natural disaster, of course, and there were few serious calls for capping the crater. We have little choice but to live with natural risks to the environment; we expend so much effort and money on lowering technological risks because we know that they were created by us in the first place and we believe that we should be able to control them. This is what has made the ongoing oil leak after the explosion, fire and sinking of the Deepwater Horizon platform so difficult to accept.
We do not want to admit that there is little we seem to be able to do to stop the oil leaking out of the failed cutoff valve beneath the Gulf waters. Even scientists cannot tell us exactly how long the oil will spew from the leaks if they are not plugged or otherwise fixed. President Barack Obama has announced that the buck stops with him, but even countless bucks thrown at the problem may not be able to stop the oil from continuing to wash up on the Gulf Coast and beyond.
According to one report, the White House had "lent" a Nobel laureate in physics, Secretary of Energy Steven Chu, to the BP-led effort to activate the stuck device that was designed to prevent the kind of blowout that set off the disaster. The device was supposed to stop the flow of oil in the event of an accident like the one that occurred on the Deepwater Horizon drilling rig. Mr. Chu, in turn, reportedly sent a team consisting of senior officials from U.S. national laboratories to Houston, where oil-leak operations are based. The team is said to have used supercomputers to identify the nature of the problem beneath the sea but seems not to have found a silver bullet to solve it.
ACTING LIKE ENGINEERS
Scientists may identify a problem and may study it, but they will not solve it unless they act like engineers. After the Apollo 13 astronauts announced to mission control, "Houston, we have a problem," it was not NASA scientists before whom a box of assorted stuff available on the spacecraft was thrown upon a table. Had it been, the scientists might have first catalogued and classified the contents, perhaps thinking that would make the problem easier to solve. Scientists tend to study what is, whether it be a geological formation suspected of containing natural resources or a collection of moons orbiting a distant planet. They want to understand what they observe, but not necessarily to change it.
Engineers, in contrast, seek to change the world, or at least the parts of it they and their fellow humans inhabit and exploit with devices of their own making. They want to develop the technologies that bring about a world that is fertile, and comfortable, and convenient, and safe. When engineers are presented with a problem, they welcome all the knowledge about it that scientists or anyone else can provide, but they also want to solve the problem, even with incomplete understanding. This often means coming up with a solution using what is available in the time allotted.
Engineers are inventors. They come up with the darndest things, even if scientists have had neither foresight of them nor insight into how they might work. The Wright brothers, frustrated with the lack of scientific theory about wings and propellers, did their own research on their way to achieving powered flight. Nineteenth-century steamship designers, such as Isambard Kingdom Brunel, did not believe the scientists who maintained that no ship could ever carry enough coal to power it across the ocean. And Whitcomb Judson did not wait for a scientific theory of closure before inventing the slide-fastener device for his shoes, which evolved into the zipper.
It will be an engineering achievement, accomplished with or without a theory of how the blowout preventer might be unstuck, or how the broken pipe may be capped, that will stop or sharply curtail the oil leaking into the Gulf waters. When the leak is stopped, there will be plenty of time to analyze the parts and processes that malfunctioned, in order to understand how they should be modified to reduce the risk of a future accident. We can continue to analyze the problem we now have, but we can't wait for scientific discoveries before we act to solve it. That is why engineers are trying what might sometimes seem to be long-shot schemes. At the moment, it is better to try than to simply ask why.
Henry Petroski is the Aleksandar S. Vesic Professor of Civil Engineering and a professor of history at Duke University. His latest book is The Essential Engineer: Why Science Alone Will Not Solve Our Global Problems.