The Design Inference : Eliminating Chance through Small Probabilities (Cambridge Studies in Probability, Induction and Decision Theory)

List Price: $85.00
Your Price: $76.50

Rating: 2 stars
Summary: Mathematical symbols do not cover all sins
Review: Dembski's book is a landmark in the field of probability and statistical inference. He tackles the most central and difficult problem: that of specification. That is, when does an improbable event call for an appropriate causal explanation, and when does it not do so? Dembski's solution, employing the theory of computation, is both highly original and very attractive. This is certainly not the last word on the subject, but it is very nearly the first one.

Dembski's work has a wide range of applications. The most important may be the problem of detecting reliable evidence for design in the biological and physical worlds. This is, however, by no means the whole story. Researchers in cryptography, archeology, SETI and other design-theoretic fields will find Dembski's book essential.

Rating: 5 stars
Summary: Book destined to endure
Review: Despite Eli Chiprout's critical review of The Design Inference, readers can be assured that Dembski stands by his calculation and is prepared to defend it. Chiprout's chief objection seems to be that Dembski's conditional independence condition founders when human agents get into the act. Chiprout may register his complaint, but we should all note that this book and the theories it puts forth have been thoroughly vetted: it was Dembski's doctoral dissertation, it went through a grueling review process with Cambridge University Press, and the author sent preprints to probably fifty or so scholars and academics for comment. No one, and I mean NO ONE, corrected Dembski on what Chiprout suggests is an obvious oversight. Long after the dust of criticism settles, The Design Inference will surely stand as an important and enduring advancement in our understanding of the theory of Intelligent Design.

Rating: 4 stars
Summary: Best book by a creationist I have ever read
Review: I just finished a two-month reading group consisting of both supporters and critics of Dembski, so I finally feel competent to review this book.

While I am a naturalist and evolutionist, I greatly appreciate the writing of anybody who is intellectually honest and attempts to be rigorous: at least in this book, Dembski shows these traits with flying colors. 'The Design Inference' is Dembski's attempt to formalize valid inferences about design. That is, how can we validly infer, for any event E, that E is the product of intelligent design? Most people make such inferences all the time (how does the average person explain Stonehenge?). What is the logical structure of such inferences?

Despite the math, the argument structure is actually quite simple. The way to infer that E is the product of design is to run it through what Dembski calls the 'explanatory filter.' Try to explain event E according to presently known statistical regularities (e.g., Newton's laws). If event E cannot be explained by any such statistical regularity, then it passes through the explanatory filter, and is therefore the product of design.
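To make that structure concrete, here is a minimal sketch of the filter as a decision procedure. This is my own illustration, not Dembski's formal apparatus; the probability threshold and the inputs are hypothetical placeholders.

    # Illustrative sketch (my own, not Dembski's formalism) of the explanatory
    # filter as a three-node decision procedure.
    def explanatory_filter(prob_under_law, prob_under_chance, has_specified_pattern,
                           small_probability_bound=1e-40):
        # Node 1: does a known regularity (law) make the event all but inevitable?
        if prob_under_law is not None and prob_under_law > 0.999:
            return "regularity"
        # Node 2: is the event reasonably probable under some chance hypothesis?
        if prob_under_chance > small_probability_bound:
            return "chance"
        # Node 3: the event is highly improbable; if it also fits a specified
        # (detachable) pattern, the filter attributes it to design.
        if has_specified_pattern:
            return "design"
        return "chance"  # improbable but unspecified events are still ascribed to chance

    # A hugely improbable event that matches a recognizable pattern lands on "design".
    print(explanatory_filter(None, 1e-150, True))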

This argument structure is the first main weakness in Dembski's book. In employing the explanatory filter, TDI elevates an anachronistic fallacy to an imperative. Simply showing that we can't presently explain a phenomenon is not sufficient to show that it can never be explained! In the nineteenth century, the precession of Mercury in its orbit could not be explained in a well-confirmed classical worldview, but to infer design based on that would not be good science. The problems with this kind of reasoning are made clearer when we consider our early ancestors who made poor design arguments about weather patterns and illness that they couldn't explain based on physical principles.

The inferential strategy outlined above sounds rather simple, so where does all the notorious math come in? It comes in as Dembski attempts to quantitatively unpack just how to demonstrate that an event cannot be explained by a statistical regularity. For those who know some statistics, this is essentially a detailed account of how to rationally generate a rejection region in a probability distribution. The formalism emerges because Dembski's account is idiosyncratic, as he tries to show that you can generate a rejection region even *after* you have already observed the event. Most scientists would balk at this, as it would allow you to retroactively put a rejection region over the event, which, to put it simply, is cheating (imagine drawing a bull's-eye around a randomly shot arrow and saying that you hit the bull's-eye by skill).
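To see why this looks like cheating to a statistician, here is a small simulation of my own (not from the book). Every sequence is generated by fair coin flips, so chance is the correct explanation in every trial: a rejection region fixed in advance rejects chance only rarely, while a region drawn around the observed outcome afterwards would "reject" it every time.

    # Illustrative simulation (my own example, not the book's).
    import random

    random.seed(0)
    trials, n = 10_000, 100
    false_rejections = 0

    for _ in range(trials):
        flips = [random.randint(0, 1) for _ in range(n)]
        heads = sum(flips)
        # Pre-specified region, chosen before looking at the data: "60 or more heads".
        if heads >= 60:
            false_rejections += 1
        # Post-hoc region: the exact sequence just observed.  Its probability is
        # 2**-100, astronomically small, yet by construction the data always fall
        # inside it -- the bull's-eye drawn around the arrow.

    print(false_rejections / trials)   # about 0.03: the pre-specified test rarely errs
    print("the post-hoc region is 'hit' in every trial, despite probability", 2.0 ** -100)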

Dembski claims that it is perfectly appropriate to retroactively generate rejection regions if it would have been *possible* to specify the region before the event E actually occurred. For example, say you see someone shoot an arrow that hits a tree at a seemingly random location where there happens to be a worm. Later, however, you find out that the person was actually hunting worms and was wearing infrared worm-hunting goggles. In such a case, you would rightly conclude that the worm was hit because of skill rather than blind luck. More importantly, it would have been possible to predict that the arrow would land on tree-worms even if you hadn't seen it happen.

While many people in our discussion group disagreed, I think this is a reasonable way to retroactively reject a chance-based explanation. However, I do *not* think that Dembski is simply describing the rejection of a hypothesis. Rather, he is describing the replacement of one hypothesis with a more reasonable alternative (in this example, the alternative to chance is that the person is a skilled worm-hunter). This leads to what I think is the second main weakness in *The Design Inference*: the engine driving the inference is not a positive theory of design, but simply the elimination of other theories. The problem is that this does not seem to conform to how people do (or should) perform design inferences. That is, people don't run through an explanatory filter, eliminating all possible statistical explanations of something, and then end up with 'design' as the last node in an explanatory filter (or explanatory sink, as I like to call it). Rather, people have a *positive theory* of intelligent agents (i.e., things with desires, beliefs, and certain capacities) and they apply this theory (or network of theories) to explain events in the world. Design inferences are not different in kind from explanations of physical, biological, social, or psychological phenomena. It is the development of such a theory and its predictions which should be the focus for Dembski.

A final note: to those interested in the debate about creationism and evolution, caveat emptor. This book contains very little direct discussion of that issue. Rather, it does what should have been done long ago: tries to outline the inferential strategy people should be employing in this debate.

Despite the two main problems outlined above, I still recommend this book to anyone seriously interested in how we make inferences about design, in particular those interested in the creation-evolution debate. While the book does no damage whatsoever to the evolutionist (partly because, as mentioned above, it does not directly address that debate), it at least makes for stimulating, thought-provoking reading. Most importantly, it will direct the creationists to be more rigorous in their arguments about design.

Rating: 1 stars
Summary: Eliminating reason through pseudo-mathematical bubble.
Review: P[gibb(DB)] = 1, where DB stands for Dembski's book, gibb(T) denotes that text T is gibberish, and P[E] is the probability of E.

Rating: 5 stars
Summary: Brilliant
Review: The main idea of Dembski, in concise form, is that a combination of a very small probability with a recognizable pattern points to design as opposed to chance. Dembski also maintains that not every pattern will do, but only the so-called "detachable" ones. Analyzing what he means by detachable reveals that determining whether a pattern is usable is, in fact, a subjective decision based on one's background. This makes his entire procedure for identifying design purely subjective. Furthermore, Dembski claims that while his explanatory filter may produce false negatives, it is safely ensured against false positives. That statement can easily be shown to be false. False positives can occur in at least two situations. One is when the pattern is illusory (examples are many). The second is when the pattern is real but meaningless or irrelevant (examples are also many). Finally, whereas Dembski presents his intelligent design theory packaged as a mathematically substantiated scientific discourse, his actual agenda is evident from his frank statement: "As Christians we know that naturalism is wrong." If you already know, why bother to offer alleged proofs and invent that explanatory filter, which is essentially a combination of platitudes wrapped in a mathematical mantle? Pursuing an agenda with the answer known in advance is very far from a scientific approach and befits a preacher rather than a researcher.
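To illustrate how easily an illusory pattern produces a false positive, consider a toy simulation (my own, not from the book): a pattern that is very improbable at any fixed location, such as eight identical coin flips in a row, still turns up somewhere in roughly half of all purely random 200-flip sequences, so spotting it after the fact is weak evidence against chance.

    # Illustrative simulation (my own example, not from the book): an
    # improbable-looking pattern found after the fact in genuinely random data.
    import random

    random.seed(1)

    def has_run_of_eight(n=200):
        flips = [random.randint(0, 1) for _ in range(n)]
        # True if any eight consecutive flips are all heads or all tails.
        return any(len(set(flips[i:i + 8])) == 1 for i in range(n - 7))

    trials = 10_000
    hits = sum(has_run_of_eight() for _ in range(trials))
    print(hits / trials)   # roughly 0.5: half of the random sequences contain the "pattern"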

Rating: 1 stars
Summary: Consistent inconsistency
Review: This book has been highly acclaimed by Dembski's cohorts as a revolutionary breakthrough on a par with the work of Newton. It is saturated with mathematical symbols, creating an impression of inordinate sophistication. However, an elementary analysis of Dembski's opus reveals that whereas he is indeed well versed in many fields of knowledge and is a man of many talents, his book is actually full of inconsistencies. His treatment of probability makes no sense whatsoever. For example, his alleged definitions of probability and likelihood are pure tautology. According to his definition, whatever has a larger likelihood is more likely to occur. True. Also, whatever is larger has a larger size, but does such a platitude require a special, scientific-sounding definition? The same applies to his quasi-definitions of complexity and difficulty. Wherever there are interesting things in this book, they are not new (for example, the axiomatization of probability, which is just a variation on Kolmogorov's classical approach without a reference to the source). Wherever there is something new in this book, more often than not it is logically deficient (for example, the procedure Dembski suggests for the first and the second nodes of his explanatory filter). His final design inference is based on a set of arbitrary hypotheses. A completely useless book.

Rating: 5 stars
Summary: Unmasking ideological design behind Darwinism
Review: William Dembski's "The Design Inference" helps us to infer the "ideological design" that lies behind Darwinism. To my mind, there is no doubt that Darwinism still exists because it has been able to protect itself from competition and cross-examination. Darwinism is afraid to expose itself to the same logic it defends, that of "the survival of the fittest"; it knows it is not fit to survive a free and open encounter with alternative explanations. Men like Richard Dawkins and Stephen Jay Gould are increasingly aware that Darwinism depends for its survival on the proof that a succession of highly improbable events, corresponding to highly complex patterns, can reasonably be described as the result of mere chance. They are increasingly aware that they risk becoming the laughing stock of academia, "the last ones to know". Being unable to distinguish between casuality and causality, chance and design, they would never be successful detectives, forensic experts, SETI scientists, or antitrust and intellectual-property lawyers. The only place where their modes of thought and inference seem acceptable is in the field of naturalist science, and even there only as "desperate resisters".

By clarifying concepts such as design, information, intelligence, and pattern through the concept of complex specified information, William Dembski puts Richard Dawkins in a position that I doubt he can hold for long. Dembski allows one to understand the "clever" way Dawkins uses Occam's razor: "as long as I can speculate about naturalist explanations for reality, I will keep ignoring all the evidence of intelligent design, no matter how strong and convincing it may be". For Dawkins, it is clear that the "appearance of design" (Dawkins) plus the "high improbability of design" (Fred Hoyle) can only be seen as a refutation of a "design inference" (William Dembski). As a law professor, I find this (ideo)logic totally unacceptable, and I am glad to see, by reading Dembski, that I am not the only one. It is increasingly clear that Darwinism, a complex set of naturalistic "memes", fears becoming a standard example of "extinction by competition". But the game will soon be over.


