Inviting Disaster: Lessons From the Edge of Technology

List Price: $15.95
Your Price: $10.85


Rating: 5 stars
Summary: Excellent General Introduction to Systems Safety
Review: 'Inviting Disaster' is a compelling and easy-to-read book. It is an introduction to accident theory for generalists, and is as interesting (perhaps more so) to nontechnical people as it is to engineers and the like. James Chiles discusses several major accidents (Challenger, Three Mile Island, Ocean Ranger, etc.) in well-executed chapters with substantial background from precursor accidents and incidents. One reviewer seems to believe that this is a flaw, but I disagree. The reviewer seems to believe, for instance, that the R101 (a dirigible, not a blimp, as the reviewer wrongly states) is totally irrelevant to Challenger. In fact, the R101 was the Challenger of its day, and the social, managerial, and technological pressures that ultimately led to the R101 disaster led to Challenger as well. Chiles ties this theme together in a seamless manner in chapter after chapter.

This book is not a rigorous technical analysis of the individual disasters, with the engineering and math associated with formal inquiries and technical (AAIB, NTSB, etc.) investigations. What it does better than any of the technical inquiries ever could, though, is make a clear and compelling case for the problems that led to each of the accidents covered, treating man-machine interface issues with particular grace.

I have long been associated with the more technical aspects of accident investigation and safety systems, and while there are more technical accounts available for all of these accidents, if you are looking for an entry-level (but complete) overview of accidents and systems safety, you can't go wrong with this book.

Rating: 4 stars
Summary: How little things can cause great disasters
Review: Accidents and disasters are often caused by simple, random events or by a change in a normal sequence of actions, any one of which could affect the outcome. Had the path of the Air France Concorde been slightly different, had the piece of titanium not fallen off a DC-10, had the plane left a tad earlier or later, had a sealant been used in the fuel tanks, or had any other seemingly unimportant event gone differently, the plane's tire would not have struck the titanium, a piece of tire would not have opened a substantial leak in the plane's fuel tank, and the passengers and crew would still be alive today.
Another related book worth reading is Normal Accidents by Charles Perrow. Perrow had studied several major accidents and concluded that some forms of technology are more open to chains of failure and that adding more safety systems can actually lead to an increased likelihood of an accident because of the increase in complexity. The systems become so "tightly coupled" that a failure in any part of the system almost inevitably leads to a chain of unmanageable and uncontrollable events.
Chiles goes Perrow one further and makes recommendations as to how training and people can prevent accidents by breaking one of the links in the chain. It requires that individuals throughout the organization be empowered to call decisions into question or to halt actions they believe to be of concern. He observed several industries, such as air traffic control centers and aircraft carriers (not to mention helicopter repair of high-tension lines!), which have impressive safety records despite a high level of coupling and danger.
It's a fascinating book that examines why disasters happened and what lessons can be gleaned from those tragedies. For example, the explosion of the steamboat Sultana killed hundreds at a time (1865) when Americans were seemingly inured to disasters of all kinds ("between 1816 and 1848, 233 explosions on American steamboats had killed more than two thousand people"). Steamboats were constantly being destroyed by boiler explosions, and, despite industry objections, the federal government had issued all sorts of controls and inspections. In the case of the Sultana, the captain was in a hurry; he wanted to pack as many prisoners (released from Andersonville prison) on board as possible (being paid [$] per soldier and [$] per officer). The ship was badly overloaded, which contributed to the boiler explosion because when the ship turned, its top-heaviness caused the water level in the boiler to shift beyond safe levels. In addition, rather than have a crack in one boiler properly fixed, the captain had insisted on a patch that normally would have been fine, except that it was slightly thinner than the boilerplate on the rest of the boiler. Even that would have been OK, except that no one thought to change the setting on the emergency blowout valve to reflect the thinner metal of the repair. So a sequence of decisions that individually would have been unimportant combined to kill far more, on a percentage basis, than the 9/11 attacks.
It is possible to conduct accident-free operations, but Chiles says that it means changing normal operational culture and mindset. For example, challenging authority becomes crucial in preventing aircraft crashes and other jobs where people have to work as a team. The airlines have recognized this and no longer is there a pilot in command; the term now is pilot flying the plane with each pilot required to question the judgment of the other pilot if he/she thinks the pilot flying has made an unsafe move or decision.
I learned about the extraordinary safety record of companies that use helicopters to make repairs on high-tension electrical lines while the current is still on. That would certainly loosen my sphincter. The pilot hovers the craft within feet of the conductive lines while the electrician leans out on a platform, hooks a device to the line that makes the craft and everyone on it conduct up to 200,000 volts (they even have to wear conductive clothing), and makes repairs to the line. They have never had an accident in twenty-five years of doing this. Safety is paramount, they anticipate the unexpected, and everyone is an equal partner in the team, expected to point out conditions that might be unsafe. "A good system, and operators with good `crew resource management' skills, can tolerate mistakes and malfunctions amazingly well. Some call it luck, but it's really a matter of resilience and redundancy." Failing to have this resiliency can have tragic consequences. On December 29, 1972, an L-1011 crashed on approach to Miami because a light bulb indicating whether the landing gear was down had burned out, and the entire four-man crew became involved in changing the bulb. They did not notice that someone had bumped the throttle lever, releasing the autopilot that was supposed to keep them at two thousand feet, and the air traffic controller who noticed the deviation in altitude did not yell at them to pull up, not wanting to annoy the crew, but simply asked if everything was coming along. The plane crashed, killing everyone on board.
Another key element is that people must be clear in speaking and writing, "even if doing so necessitates asking people to repeat what you told them. . . We know that people will try to avoid making trouble, particularly any trouble visible to outsiders, even though they are convinced that catastrophe is near." Chiles cites numerous instances where committed individuals went outside normal channels to get additional perspectives or assistance and prevented catastrophe. Those individuals always knew the leadership would back up their independent decisions even if they were wrong.
I have just scratched the surface. This book should be recommended reading for everyone.

Rating: 5 stars
Summary: An up-close and human look at some infamous foul-ups
Review: If you want to know why the Concorde crashed or how things got so fouled up at Chernobyl or what went wrong at Three Mile Island, this very readable book is a good place to start. Chiles gives us diagrams, step-by-step chronologies, and a very human narrative to illuminate these and scores of other technological disasters in a way that makes it excruciatingly clear that most of them could have been prevented.

What these disasters have in common is human error, of course, but Chiles reveals that there were also foreshadowings and warnings of the horrors to come in the form of cracks, sagging roofs, parts that didn't quite fit, maintenance shortcuts taken, capacity limits reached, etc., that should have tipped off those in the know that something terrible was about to happen. Additionally, virtually all of the disasters happened because more than one thing went wrong.

Among the horror stories told in detail are:

The harrowing tale of the sinking of the drill rig Ocean Ranger in a North Atlantic gale in 1982, a disaster caused in part because somebody forgot to close the shutters on portlight windows;

The Challenger space shuttle blow-up, which Chiles compares with the crash of the British hydrogen-filled dirigible R.101 in 1930. Both were "megaprojects born out of great national aspirations" and both went forward "despite specific, written warnings of danger." (p. 67);

The Hubble Space Telescope fiasco, in which a mirror is incorrectly ground, thereby partially "blinding" the telescope, a multi-billion dollar error that could have been prevented with just a little testing. In this chapter (subtitled "Testing Is Such a Bother") Chiles shows how disasters happen because proper tests are simply not performed;

An out-of-control police van that killed parade watchers in Minneapolis in 1998 when an off-duty police officer, not completely in the driver's seat, inexplicably gunned the engine instead of hitting the brakes. This accident was in part caused by an alteration to "Circuit 511," which controls both the brake lights and (unbeknownst to the mechanics) an electric shift lock on the vehicle. Chiles notes that "The odds of pedal error go up when drivers are elderly, and also when drivers turn around in the seat to back their cars up." (p. 242);

The toxic gas release at the Union Carbide plant in Bhopal, India in 1984--"the worst chemical disaster of all time"--that killed thousands of people. Chiles calls this a case of "Robbing The Pillar," a reference to the practice in coal mines of mining the coal pillars holding up the walls of the mines.

This is a book for the engineer in your soul, a treatise for the worry-wart on your shoulder, a recounting of responsibility for the accountant in your heart, and cautionary tales for the fearmonger in the pit of your stomach. Chiles is gentle in focusing blame, but he does indeed name names and point fingers. He also gives us a prescription for preventing future disasters. In addition to the need to perform regular maintenance, follow safety procedures to the letter, etc., he suggests how we might prevent "cognitive lock," the blinding sense that we've all experienced, that insists that THIS is the problem and not something else, or that such and such is what needs to be done, when in reality something else will work. He also advises that near misses ought to be reported and not swept under the rug (p. 202), that "redline running" is dangerous, and that under pressure we are sometimes apt to do the wrong thing, and therefore procedures to follow during a crisis should be spelled out in advance.

Rating: 4 stars
Summary: Interesting Reading But Not Technical
Review: If you were expecting to find a technical understanding of how best to improve a plant, don't buy this book. If you want a qualitative understanding of why disasters occur, this is the book. For a quantitative, engineer's perspective, refer to "Managing Risk and Reliability of Process Plants," by Mark Tweeddale. I found this book very insightful and easy to read. After reading this book, I was encouraged to go on to more technical texts. After reading this book I decided to make it a career goal NOT to be one of the engineers who designed an oil platform where the controls could be shorted out by sea water with the fill-valves open on failure. Dumb!

Rating: 5 stars
Summary: Inviting Disaster: Lessons from the edge of technology
Review: Inviting Disaster is a must-read not only for those of us working in the field, but for every member of the public. The lessons to be learned apply to all of us in our daily lives. Each of us packs our children into automobiles, two tons of steel, and propels them down highways at speeds up to 80 mph. We use electricity in kitchens and bathrooms, which has the power to instantly electrocute. We operate lawn mowers and power equipment that can easily kill or maim. We climb trees and ladders to heights from which a slip can be deadly. The potential for personal disaster in these situations is no less significant than it has been for astronauts, pilots, architects, and submariners. We all need to appreciate the power and responsibility we have and the need to consider the implications of our actions or lack of action. Perhaps if the managers of the agents whose suspicions were aroused before 9/11 had done so, the WTC would still be standing. Perhaps if those agents had been similarly concerned, they might have followed the potentially career-limiting but heroic steps of some of Mr. Chiles's subjects, and 2,000 people would be returning to their families. Perhaps if we have a designated driver at holiday celebrations, we will return home to our families or allow others to return to theirs. Similar things can be said regarding recent financial disasters. This book is important for all of us.

Rating: 3 stars
Summary: How did this get to publication??
Review: Let me first mention how poorly written this book has to be for me to actually sit down and type up a commentary. This is without a doubt the worst-written book I've read this year (and I read an enormous quantity of non-fiction each week).

The book is divided into chapters, each of which ostensibly focuses on one major engineering catastrophe. But instead of fleshing out the problems associated with the cardinal disaster of each chapter, the "author" jumps to another seemingly unrelated engineering disaster.

In the chapter on the Three Mile Island disaster, instead of saying something to the effect of, "the valve failed... and was critical to the build-up of disaster," the author jumps back to the invention of the safety valve. This would not be terribly unusual except for the extent to which the vignette is unrelated to the chapter as a whole. Towards the end of the first 1/3 of the book, you can almost sense when the author's attention span is going to fail and you are going to be thrust at Mach 3 into the wall of a distracting incidental anecdote several hundred years preceding the topic at hand. I am not saying that the reader would not like background information on things, but the organization, or lack thereof, is in a word "maddening." I am not kidding. I promised I'd give the book a fair chance, but I got too mad at the poor writing and lack of editing to continue.

In a chapter dedicated to the Challenger disaster, the author juxtaposes the disaster of a blimp earlier in the century. This would not be a problem, had the author discussed the blimp disaster and then compared it with the Challenger debacle. Unfortunately, the author's misguided writing style presents both incidents simultaneously. One paragraph is about the old airship, the next about the Challenger, then one about the blimp, then one about the Challenger... It seems like nothing now, but you will go nuts reading it. You never get your bearings, and by the end of the chapter, should you be determined enough to actually continue, you really don't come away with any more understanding of it than you would from watching a news clip about the decline in the population of birds of prey in Northern Minnesota.

Then, there are the cryptic sentences which, at best, are serious editorial lapses and, at worst, a sad commentary on the state of education in this country. Because I thought I might have suffered a temporary stroke and lost the ability to read, I started to mark these sentences as they flew by. I hoped that I could test later on whether it was halting prose or whether I had suffered an aneurysm. Page 52: "Come inside the beige paneled control room at TMI-2, eight seconds after the valves slammed shut an block off the steam making pipes." **What???** Then the one on page 167: "The flight attendants spelled each other at the job of hanging onto the captain, though they believed the man was probably dead by now." What am I missing? The whole book reads like this. How did this clunker get into publication? If any of my students had handed me this piece, I'd have failed them on the spot.

Rating: 5 stars
Summary: You Don't Need to be an Engineer to Love this Book
Review: Let's face it: our lives are dominated by technology. Luckily we rarely have to give it all that much thought. However, there are those times when technology fails us. Planes crash, disasters like Bhopal happen, and let's not forget Three Mile Island and the Space Shuttle too. What happened? How about similar disasters and problems throughout history, and little-known disasters such as oil rig blowouts? How did our fail-safe world fail?

The book Inviting Disaster attempts to answer those questions and does so in an entertaining and informative manner. Written by an engineer who understands how to communicate with everyone, this book is a fun read for anyone with an interest in this topic.

I have absolutely no experience in anything mechanical, but came away after reading this book much better informed. In addition to the mechanical explanations, Chiles provides very informative glimpses into history. He's a good engineer, a good historian, and a good writer.

Rating: 4 stars
Summary: If you don't believe in Dilbert
Review: Think that the managers depicted by Dilbert author Scott Adams don't exist in real life? That the mistakes attributed to the Bumbling Pharmaceutical Company in my text, "Manager's Guide to Design and Conduct of Clinical Trials," could never have occurred? Or that the best way to stop terrorists is with a trillion-dollar anti-missile system, not a central computer for the INS? You've got to read this book; denial (just being a team player) won't help a lemming, and it won't help you.

Rating: 5 stars
Summary: Understanding system safety
Review: This book is a fascinating introduction to the topic of systems safety and why large-scale technology fails. Covering systems as diverse as space exploration, offshore oil drilling, aircraft, chemical processing plants, and the nuclear power industry, it is full of interesting examples of how small, apparently insignificant factors can coincide and bring down the entire system, often with great loss of life and huge financial cost. As our technologies become more complex and widespread, we must become more sophisticated in our approach to maintaining their safety, and learning from past disasters is the first step in this process. This book provides a string of interesting past-disaster lessons. Although not as theoretically in-depth as Perrow's "Normal Accidents" or Sagan's "Limits of Safety", the book is very readable and will be of interest to the novice as well as those who have read Perrow and Sagan, simply because of its colourful and diverse examples and interesting discussion.

Rating: 5 stars
Summary: A must read!!!
Review: This is a very interesting and quick read. It starts out terrific, with a riveting telling of the collapse of the WTC towers, a North Sea oil rig, and Three Mile Island. But somewhere halfway into the book it starts to get muddled. The stories mix and match, and some paragraphs don't make sense no matter how many times you read them. He also begins lecturing on systems failure as much as (or more than) telling stories. This guy knows how to explain things, but I get the feeling the second half of the book was rushed to completion and/or an editor went insane. A strong point is the wealth of good examples of near-misses and times when things went right in bad circumstances.
Lots of good stories and lessons here, but structured poorly, so some parts you have to fight through.



© 2004, ReviewFocus or its affiliates