Secrets and Lies: Digital Security in a Networked World

List Price: $17.95
Your Price: $12.21


Rating: 5 stars
Summary: goes past the technology
Review: Bruce has rightfully earned his reputation by explaining the technology of security. In this book he goes past that, explaining that security is a system and a process, and he does it in his typical style that makes it completely understandable and actually a fun read. If you're responsible for security matters, you may not like seeing various 'social engineering' hacks exposed, but it's information that you and everyone using a computer these days need to be aware of. Once again, Bruce brings a straightforward style to bear and makes sometimes difficult subject matter clear to the reader.

Rating: 5 stars
Summary: Make your friends read this book!
Review: If you're using the internet, you need to know this. I've been following the security community for quite some time and am constantly amazed at how many of even the basic principles are almost unknown outside that community. Hopefully this book will help change that.

Rating: 5 stars
Summary: Questioning the Assumptions of Our Silicon Snake Oil System
Review: Copy of an email response sent to the author, also posted to a different newsgroup.

<snip> Thanks. If you could post this review on Amazon and B&L, that would be great.

Thanks again, Bruce Schneier

At 11:20 AM 8/30/00 -0500, you (David R. Hibbeln) wrote:

Mr. Schneier wrote one hell of a good tome. He had me going to my 7" thick Webster's International to look up a few words, which has been, of late, an all too rare occurrence.

He has come up with an excellent read that forces you to question many of the assumptions we have all made about this silicon snake oil system we love and hate. He takes off where Grace Hopper left off in her quote, "Before 1942 the world was simple, we did not have systems." By examining the basics he has come up with some great things to ponder.

This was one of those computer books you will read from cover to cover with awe!

He has every right to be proud of this book.

Now if I can only figure out how to get his message across to my SME clients...

My Opinion Only: Mr. David R. Hibbeln [...]

Rating: 5 stars
Summary: Extremely Interesting Read
Review: I was lucky enough to get one of the first books off the press, and Schneier's book is everything that he said it would be in his email. Provocative and stimulating, the subject must be read by all in technology and business positions. I'm now going to reread his original classic, Applied Cryptography.

Rating: 5 stars
Summary: What's security *really* about?
Review: This is an important book both for techies who work in computer security and for anybody who cares about the security of our digital world. It's about issues much larger than bits, bytes, or even protocols.

For me, it pulled together all the threads about physical and computer security that have been running through the various industries for years into one sensible view.

Read it. It'll help you spot idiot ideas, and, even more important, help you argue against them.

Rating: 5 stars
Summary: Classic Schneier
Review: If you're a fan of Bruce Schneier, whether it be his live presentations, his books, or Crypto-Gram, then you'll love this book. Bruce has shifted his focus somewhat away from the deep technical details he covered in "Applied Cryptography." In this book, he delves more into the hows and whys of security, and focuses heavily on the trade-offs that reality forces security people to make. This book is a must-read for anyone responsible for making security decisions.

Rating: 4 stars
Summary: Beware the Author's Motives
Review: Firstly, let me say I have a great deal of respect for Bruce Schneier. "Applied Cryptography" is a superb book. True to form, so is "Secrets and Lies". However, both books exist purely as a vehicle to advance the author's career. Isn't it interesting that when Schneier focused his business interests predominantly on cryptography-related consulting work, he was able to release two editions of "Applied Cryptography" in rapid succession? Today, however, when Bruce is in the "managed security monitoring" business, he suddenly can't find the time to update AC and instead produces "Secrets and Lies": a book that takes an unashamedly non-technical approach to giving a broad overview of the status of computer and information security. And guess what the overwhelming theme of the book is? "No system will ever be secure and all security will inevitably fail: you must take another approach." Of course this statement is true -- just not in the absolute, black-and-white sense that Schneier presents it. The purpose of this book is purely and simply to gently nudge and guide quasi-technical IT managers towards the obvious and overly simplistic conclusion that since all security is doomed to failure, the approach that must be taken is to try to handle that failure when it occurs. Bruce's recommended solution: his managed network security business.

Of course this is all fair enough, albeit slightly underhanded, and "Secrets and Lies" is a highly readable, enjoyable and (mostly) technically accurate book. It's just not the book we need! What we need is a technical book, aimed at the same people who read and loved "Applied Cryptography", which shows how various security vulnerabilities come into being and thus how they can be minimised. Such a book would bring much more benefit to the world than "Secrets and Lies". And Bruce, while you're at it, please update "Applied Cryptography"!

Rating: 4 stars
Summary: Very good, but with some caveats
Review: I finished the entirety of Bruce Schneier's book "Secrets and Lies". I thought it was excellent, but I also think it suffers from some very deep flaws.

1) While Schneier goes a long way to prove his point that open-source, non-proprietary software is, in general, more secure than closed-source, proprietary software, he fails to consider critical differences between types of open-source projects. All open source, in other words, is not created equal. There are critical distinctions between the open-source projects undertaken by ANSI or other standards-making bodies and the open-source world of projects like, say, Linux.

Under ANSI, standards are created by a consortium of business, government and industry bodies, usually employing the top people in the business. This consortium is structured like a giant software company designing a proprietary product, with all the checks and balances, redundancies, code testing, spec designs, etc. ANSI then asks for feedback from the entire user community, with the whole process from specs to product often taking years. Contrast this with the world of nobodies and semi-somebodies that often leads open-source Linux and other projects like Mozilla. Such projects are more or less led by hobbyists in an ad-hoc fashion, since the resources to do proprietary-style software development are not there.

The question is how much of open-source Linux's reputation is riding on the reputation of open-source ANSI? How often is the quality of the two confused?

2) Schneier fails to fully consider problems with his suggestion that insurance companies market liability insurance to handle the cost of security breaches. They know the risk business, he claims, and are therefore in a position to estimate the risks of such security failures. A laudable idea, except what happens if insurance companies know their business well enough not to provide any coverage at all? There is, in fact, a historical analogy: vaccines.

In 1976, an unusual epidemic of "swine flu" occurred at Fort Dix. The federal government decided to vaccinate the entire country. The Congressional Budget Office predicted that, with 45 million Americans inoculated, there would be 4,500 injury claims and 90 damage awards, totaling $2 million. Despite these statistics, insurance companies refused to participate. Amid denunciations of corporate greed, Congress decided to provide the insurance.

It turned out that the CBO was about half right. A total of 4,169 damage claims were filed. However, not 90 but more than 700 lawsuits were successful and the total bill to Congress came to $100 million, 50 times their initial estimate. Insurance companies knew their business well.
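A quick back-of-the-envelope check of these figures (a minimal sketch in Python; the numbers are the ones quoted in this review, not taken from the book) shows why "about half right" is generous on the payout side:

```python
# Swine-flu liability figures as quoted in this review (1976 program).
cbo_claims, cbo_awards, cbo_cost = 4_500, 90, 2_000_000              # CBO prediction
actual_claims, actual_awards, actual_cost = 4_169, 700, 100_000_000  # actual outcome

print(f"claims: predicted {cbo_claims}, actual {actual_claims} "
      f"({actual_claims / cbo_claims:.0%} of the estimate)")
print(f"awards: predicted {cbo_awards}, actual >{actual_awards} "
      f"(~{actual_awards / cbo_awards:.0f}x the estimate)")
print(f"cost:   predicted ${cbo_cost:,}, actual ${actual_cost:,} "
      f"({actual_cost // cbo_cost}x the estimate)")
```

The claim count was predicted almost exactly; it was the number of successful suits and the total damages, the quantities an insurer actually has to price, that blew up by roughly an order of magnitude or more.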

The point that Schneier needs to understand is the concept of "strict liability" that has replaced the older concept of "negligence." Under negligence, a plaintiff had to prove intent or fault. Under strict liability, a plaintiff does not. In effect, the theory says that damage has occurred and someone has to pay. How does a cyberspace security company insure itself under such circumstances, at least at a premium that is not the value of the entire company? It cannot, and like most of the vaccine business, such cyber-security companies would simply leave the market.

3) Equally silly are some of the analogies Schneier uses to describe the state of the software industry and his laments about the lack of institutions to enforce solutions: "Skyscraper 1.0 collapses, but we will get it right in Skyscraper Version 1.1" or "a defective automobile gets recalled, but no one recalls software" or "we have the FDA, the UL or other institutions but nothing similar for software."

A skyscraper collapsing is not an example of a security problem. It is an example of a functionality problem. A skyscraper collapsing because a plane crashed into it is an example of a security problem. A skyscraper collapsing on its own means someone did not pay enough attention in architecture school: not enough schooling in statics or finite element analysis. But no amount of schooling could anticipate a plane crashing into a building, let alone prevent a collapse...unless an architectural equivalent of the Multics operating system were erected with all the functionality problems that such a building would have.

The same is true for automobiles. A car running off the road because the brakes stop working or the accelerator sticks is an example of a functionality problem. A car running off the road because another car hits it is an example of a security problem. And no amount of engineering is going to prevent an accident (or car thefts, for that matter).

It is just as pointless to expect regulations or some third-party government body to handle this problem. Product recalls, Underwriters Laboratories and the FDA all deal with functionality problems, not security problems. Even safety issues, which could be likened to protecting valuable assets (just like security), deal primarily with functionality (recalling a car because the engine computer could shut down your engine while driving is a functionality problem while an engine computer susceptible to some device that opens your doors is a security problem; making sure a drug's side effects don't kill you is a functionality problem while making sure the packaging is tamper-evident is a security problem).

This should be obvious to Bruce, since he himself admits that security testing is impossible, so what good is some outside regulator going to do, except institutionalize low standards? Automobile crash tests are one notorious example. Car manufacturers make a big deal out of them, but what do they really test? An offset test, where half the front portion of a car is smashed against a heavy steel block, just tells us how a car would behave if smashed into a heavy steel block. Specifically, since the mass of the block is greater than the car's, the test simply measures how the car's structure reacts to the force generated by that car's own mass and acceleration. It tells us nothing about how it would react if, say, hit with a similar mass accelerated at the same rate as the approaching auto (presumably, it would do a lot worse).

Ironically, government crash-test ratings seem to operate under the same theory as the Orange Book. A Windows machine can get a C2 rating...as long as it doesn't have a floppy drive and is not networked. Similarly, a Toyota Prius can get a government five-star crash-test rating...as long as it doesn't get hit by a 4,500-pound Lincoln Town Car or a 6,000-pound Cadillac Escalade. Can the government guarantee that such cars are not going to share the streets with a Prius?

4) The most glaring problem in Schneier's book, however, is something that I call the "craft mentality." When I worked at Encyclopedia Britannica as a research analyst, I noticed that an inordinate amount of time and effort was spent by the management staff trying to preserve the quality of the research Britannica was putting into its products. Less time was spent trying to figure out how to price the products to capture the value of that research, or even trying to determine if that quality was evident or useful to the user (Articles on "Calculus", for example, were written by mathematicians and looked like they were taken out of graduate textbooks, obviously incomprehensible to the average user). Even in the face of hemorrhaging money, management still insisted on maintaining the standard...until they were replaced. In Britannica's case, research analysis was treated as a craft that needed to be preserved, even if that craft got in the way of selling encyclopedias.

Schneier's book suffers from the same problem. There appears to be an underlying need to preserve and pursue security research, security knowledge and other related academic disciplines...to preserve and pursue the basic "craft" to which security reduces. The problem is at what point does the practice of security as a craft interfere with real security? To put it another way, how is it possible to have even rudimentary risk management of cyber space if everyone, including academics, has an unlimited right to know?

We are in the situation of zero-day exploits, script-kiddies, malware, viruses and other problems precisely because of the craft mentality.

Consider the old model of submitting known vulnerabilities to CERT, which would then propagate that information to the industries involved. This process was slow and cumbersome and did not result in the security (i.e., craft) improvements that the submitting parties wanted. In the hope that it would spur security (i.e., craft) improvements, the vulnerabilities were instead announced to the world, to be done with as anyone pleased.

Plenty of reasons are given for doing this...all of them specious.

Claiming that the initial vulnerability is a problem is pointless if security vulnerabilities are ubiquitous, impossible to prevent, and even impossible to test. Improvements can be made, but true or perfect security is impossible.

Claiming that the truly bad guys already know the vulnerabilities, so it doesn't matter if everyone knows, is equally pointless. No one really knows if the bad guys know the vulnerabilities. It is merely conjectured that they probably do. And the probability of the bad guys knowing is far more secure than the certainty of the bad guys knowing once the vulnerabilities are announced to the world. (Imagine a national security agency with this attitude: all the other really bad national security agencies know, so it does not matter if everyone knows. Gee...that works.)

Claiming to be for publishing vulnerabilities while being against building exploits is pointless if public knowledge of those vulnerabilities leads to the building of the exploits. It is a distinction without a difference.

Claiming that security by obscurity is not very good security does not imply that security by transparency is any better.

Discipline needs to be brought back into security. Vulnerability announcements should go through the proper channels, should be treated like a national secret, and should carry very, very stiff penalties for violations. Research should be supervised. The spectacle of Def Con in Vegas and the hacker quarterlies needs to stop, with most if not all of those people going to jail and none of them ever being allowed near a computer again (they can all work at Subway). The law works. Digital content providers, for example, are defending their property rights with heavy-handed lawsuits, not quietly going into other lines of business as Schneier suggests.

None of this will happen if Schneier and others insist on maintaining their right to know and to spread that knowledge indiscriminately.

"Shooting the messenger" is the common analogy, but it is a false one. The problem is not that the messenger is bringing bad news. The problem is that the messenger is bringing the bad news to all of the wrong people. That needs to be brought under control.

Hopefully, Schneier will address these problems in another edition of his book.

