Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: Of Lasting Value, Relevant to Today's Technical Maze Review: I read this book when it was assigned in the 1980's as a mainstream text for graduate courses in public policy and public administration, and I still use it. It is relevant, for example, to the matter of whether we should try to use nuclear bombs on Iraq--most Americans do not realize that there has never (ever) been an operational test of a US nuclear missile from a working missile silo. Everything has been tested by the vendors or by operational test authorities that have a proven track record of falsifying test results or making the tests so unrealistic as to be meaningless.
This book is also relevant to the world of software. As the Y2K panic suggested, the "maze" of software upon which vital national life-support systems depend--including financial, power, communications, and transportation software--has become very obscure as well as vulnerable. Had those creating this software been more conscious of the warnings and suggestions that the author provides in this book, America as well as other nations would be much less vulnerable to terrorism and other "acts of man" for which our insurance industry has not planned. I agree with another reviewer who notes that this book is long overdue for a reprint--it should be updated. I recommend it "as is," but believe an updated version would be 20% more valuable.
Rating: ![4 stars](http://www.reviewfocus.com/images/stars-4-0.gif) Summary: Fascinating insight into what goes wrong in complex systems Review: Being a software developer, I am very interested in large systems and why they go bad. The thing that struck me most about this book was that many systems we rely on every day are so complex it is amazing that they ever work at all.
If you have read and enjoyed the book "Systemantics" by John Gall, then you will very likely enjoy this one. Although this book is less lighthearted than Systemantics, the subject is very similar. The more complex a system is, the higher the chance of it failing. The thing that seems counterintuitive but holds true is that any attempt to make a complex system safer adds to its complexity, which makes it more likely to fail. It is the paradox of the information age we now live in.
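As a back-of-the-envelope way to see why this holds, here is a toy Python sketch (my own illustration, not from either book): if a system only works when all of its parts work, then every component you bolt on, including safety devices that can themselves fail, drags the overall odds down.

```python
# Toy illustration (my own, not from the book): model the system as a
# chain of independent parts that must ALL work. Adding parts -- even
# "safety" devices that can themselves fail -- lowers overall reliability.
def series_reliability(p_part: float, n_parts: int) -> float:
    """Probability the whole system works if each of n_parts works with probability p_part."""
    return p_part ** n_parts

for n in (10, 50, 100):
    print(f"{n:3d} parts at 99.9% each -> system works "
          f"{series_reliability(0.999, n):.1%} of the time")
# 10 parts -> 99.0%, 50 parts -> 95.1%, 100 parts -> 90.5%
```

Real systems are not chains of independent parts--the unexpected interactions are exactly Perrow's point--but even this crude arithmetic shows the direction of the effect.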
From his humorous telling of a nuclear disaster avoidance test which ended up in an infinite loop of operators doing the same few actions over and over to some much more serious accounts of chemical plant and maritime disasters, Perrow opens the reader's eyes to the complexity and dangers of the systems that we surround ourselves with.
While this is not "light" reading, it is very informative and, I feel, timely. The only reason I did not give five stars is because it occasionally drifts off topic.
This is not a "Luddite" book warning us of the evils of technology. Instead it is a well-written review of what we have learned about system failures and a cautionary text on what we (as system designers and users) should keep in mind when deciding to rely on new technologies.
Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: A new way of looking at risk Review: I am in the medical field, and the recent attention to medical errors in this country makes this a must-read for all those who offer simplistic fixes - there is complexity in most of our modern systems, and its hazards are hard to detect. I found the descriptions of the decision making going on in the various incidents especially enlightening - they have certainly stimulated me to seek out other works in this field. I would like to find a comparable work applying this approach to health care.
Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: The only theoretical account of accidents and their causes. Review: I very much enjoyed reading this book. It is simple to read, yet profound, and can be used for many purposes. I teach courses in the field of engineering systems, and one unit is dedicated to Perrow's approach. It is a must if you are interested in complex engineering systems!
Rating: ![3 stars](http://www.reviewfocus.com/images/stars-3-0.gif) Summary: Living With High-Risk Conclusions Review: I have been mulling over this review for a while now, and am still undecided on the correct rating to award this book. On the one hand, Perrow offers some genuine insight into systems safety, but he frequently does not understand the technicalities of the systems (or occasionally their operators) well enough to make informed decisions and recommendations. In more egregious cases he comes to conclusions that are guaranteed to reduce safety (as when he argues that supertankers should be run by committee, and that the usefulness of the Captain is no more) or that are merely the cherished liberal opinions of an Ivy League sociologist (he teaches at Yale), as when he argues for unilateral nuclear disarmament, government-guaranteed income plans, and heroin maintenance (distribution) plans for addicts "to reduce crime." In the case of disarmament, remember this was written during the early 1980s while the Soviet Union was still a huge threat... complete nuclear disarmament would have resulted in fewer US nuclear accidents, but would NOT have made us safer, as we would have been totally vulnerable to intentional nuclear attack. He has great personal animosity toward Ronald Reagan, and makes inflammatory statements in the mining section that mining safety regulations would surely be weakened by Reagan, causing many more accidents and deaths. Later in the same section, though, he concludes that mining is inherently dangerous and no amount of regulation can make it safe. So which is it? Any of this is, at very best, folly, but regardless of political bent (he is a self-avowed "leftist liberal") it has absolutely no place in a book ostensibly about safety systems. As such, I think portions of this book show what is so wrong in American academia today: even genuinely excellent research can easily be spoiled when the conclusions are known before the research is started. This is one of the many reasons that physical scientists scorn the social sciences, and it doesn't have to be this way.
Having said all that, there IS a wealth of good information and insight in this book when Perrow sticks to systems and their interactions. The book contains the finest analysis commercially available of the Three Mile Island near-disaster, and his insight about how to improve safety in nuclear plants was timely when the book was written in 1984, though many improvements have been made since then. Speaking as a commercial airline pilot, I feel his conclusions and observations about aircraft safety were generally true at the time of printing in 1984, but are now miserably out of date. (The same is true of the Air Traffic Control section.) I believe that he generally has a good layman's grasp of aviation, so I am willing to take it as a given that he has a knowledgeable layman's comprehension of the other systems discussed. As an aside, he does not always get the technicalities quite right. For instance, he constantly uses the term 'coupling' incorrectly in the engineering sense; this is particularly objectionable in the aviation system, where it has a very specific meaning to aeronautical engineers and pilots. The section on maritime accidents and safety is superbly written. Here I am not an expert, but there seems to be a high degree of correlation with the aviation section. His section on "Non Collision Course Collisions" by itself makes this book a worthwhile read.
He presents very compelling information and reasoning until the very end of the section, at which point he suggests that since ships are now so big, large ships (especially supertankers) essentially should have no Captain, but should be run by committee. This is an invalid conclusion, and he offers no evidence or substantial argument to support the idea. Clearly, it is an idea hatched in his office and not on a ship (or a plane). There always needs to be a person in a place of ultimate authority in fast-moving, dynamic systems, or the potential exists for crew members to begin working at direct odds with each other, turning a marginal situation dangerous. Ironically, in the very same part of the discussion where he concludes that there should be no Captain, he has hit upon the key to the problem. He mentions that he was pleased to see that some European shippers were now training their crews together as a team, and that he expected this to lower accident rates. He is, in fact, exactly right about that. Airlines now have to train crews in Crew Resource Management (CRM), in which each member of the crew has the right and obligation to speak up if they notice anything awry in the operation of their aircraft, and the Captain makes it a priority to listen to the input of others, as everyone has a different set of concerns and knowledge. In this way, the Captain becomes much less dictatorial and more of a final decision maker after everyone has had their say. It IS critical, though, to maintain someone in command, as there is no time to assemble a staff meeting when a ship is about to run aground or a mid-air collision is about to occur. Many other well-documented studies and books have come to this conclusion, and in the airline industry the accident rate has decreased dramatically since CRM was introduced.
Overall, if you have a desire to understand high-risk systems, this book has a lot of good information in it; however, it is woefully out of date, and for that reason among others I can only recommend it with reservations. A better and much more contemporary introductory book on the subject is 'Inviting Disaster' by James R. Chiles. Remember, this book was written over twenty years ago, and much has changed since then. There is knowledge to be gleaned here, but you have to be prepared to sort the wheat from the chaff.
Rating: ![4 stars](http://www.reviewfocus.com/images/stars-4-0.gif) Summary: Important safety reference Review: I just wanted to add my disappointment that such an important book is not available. Safety assurance in the West concentrates on probabilistic risk assessment techniques which, Perrow shows, just don't stop accidents happening. Ensuring safety demands that we look at how organisations operate at a cultural level. This is an important book, and there are several of us who battle over the single copy at our University library.
Rating: ![2 stars](http://www.reviewfocus.com/images/stars-2-0.gif) Summary: System safety is not an inexact science Review: I must say that I agree with the words of David Nightengale (Saint Paul, MN): "If you want to know about accident prevention go to the System Safety Society".
The book talks a lot about tight coupling, or dependence, of different parts of systems during different states (DEPOSE: design, equipment, procedures, operators, supplies and materials, and environment), but it all remains very qualitative. Safety and reliability engineers prefer quantitative approaches, which are abundantly available in the area of systems reliability.
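For readers wondering what those quantitative approaches look like, here is a minimal sketch of classic series/parallel reliability arithmetic in Python (my own example; the component probabilities are invented for illustration):

```python
# Minimal sketch of reliability block arithmetic (my own example; the
# numbers are made up). Series: every part must work. Parallel
# (redundant): the block fails only if ALL redundant parts fail.
def series(*probs: float) -> float:
    out = 1.0
    for p in probs:
        out *= p
    return out

def parallel(*probs: float) -> float:
    all_fail = 1.0
    for p in probs:
        all_fail *= (1.0 - p)
    return 1.0 - all_fail

# A pump (95% reliable) feeding a valve (98% reliable):
print(f"single pump:    {series(0.95, 0.98):.3f}")                  # 0.931
# Adding a redundant backup pump raises the number -- but the model says
# nothing about the new interactions the extra plumbing introduces,
# which is exactly Perrow's qualitative objection.
print(f"redundant pump: {series(parallel(0.95, 0.95), 0.98):.3f}")  # 0.978
```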
The author does have an attractive style of writing. The presentation on dam failures (the Grand Teton, Malpasset, and Vaiont dam failures) was especially interesting to read.
Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: Reprint needed Review: I specified this book as one of (the better of) two choices for supplementary reading in a university-level engineering course, and I'm dismayed that it's currently in this precarious print status. The book is an excellent--compelling and comprehensible--explanation of the inherent risk of failure of tightly-coupled complex systems, in other words, the world we have created around ourselves. Engineers particularly need this insight before being unleashed on the world, because engineering as a profession (if not vocation) has taken on the obligation to protect humankind from science and technology. If not a reprint or new edition, perhaps a new publisher is in order.
Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: Bring back Normal Accidents Review: I started teaching a doctoral seminar here in Paris on loosely coupled systems and assigned my students Normal Accidents, then had to backpedal when I found out that the book had been beamed off the planet. A conspiracy by the nuclear, marine, aviation, space, and genetic engineering industries? The book built on Perrow's years of experience with the interactions between technologies and organizations, and launched a subfield of management researchers looking at the social-technological construction of catastrophes. It'd be fun to see a second edition after Chernobyl, Challenger, Bhopal, and assorted additional catastrophes since 1984.