Algebra of Probable Inference

List Price: $25.00
Your Price: $25.00

Rating: 4 stars
Summary: Cox understood Keynes better than Ramsey and de Finetti but
Review: Cox understood Keynes's logical approach to probability better than either Ramsey or de Finetti, but he ultimately fails to improve substantially on Keynes's work because he overlooks the fundamental nature of probability in Keynes's general system. For Keynes, probabilities are generally not precise, single-number answers. They are indeterminate, inexact, imprecise, or indefinite estimates represented by intervals, and an interval requires two numbers to specify the probability relation. Single-number answers require symmetry of evidence or the Principle of Indifference (POI). Contrary to Cox, the POI is not a latent frequentist principle.

Cox also fails to recognize that Keynes's primary axiomatic treatment of addition and multiplication, on pages 135-138 of A Treatise on Probability (1921), is an attempt to provide an axiomatic treatment that holds for BOTH precise and imprecise probabilities. In chapter 8 of the TP, Keynes provides an axiomatic treatment of the relative-frequency interpretation of probability, while in chapter 15 he provides the additional axioms needed to treat probabilities as precise estimates (single numbers) only.

Finally, Cox has overlooked Keynes's extension of his discussion of the logical concept of the weight of the arguments, in chapter 6, to its mathematical counterpart in chapter 26, which Keynes called the weight of the evidence. Keynes then defined an index for the weight of the evidence and incorporated it into his decision rule, the conventional coefficient of weight and risk, c. Keynes's approach is superior to the maximum-entropy approach, since the latter applies to statistical evidence alone. (See Cox, ft. 17, pp. 104-105.)

Rating: 5 stars
Summary: The best introduction to logical probability theory
Review: That Gian-Carlo Rota and I both admired this book largely
explains why I have my present position at MIT. But I
cannot write book reviews the way Rota did.

Why should the conventional sum and product rules of
probability hold when probabilities are assigned, not
to *events* that are *random* according to their
relative frequencies of occurrence, nor to subsets of
populations as proportions of the whole, but rather
to *propositions* that are *uncertain* according to the
degree to which the evidence supports them? The tenet
that the same rules should apply to such "degrees of
belief," whether they are "subjective" probabilities or
"logical" probabilities, is the essence of Bayesianism.
The relative merits of Bayesian and frequentist methods of
statistical inference have been debated for decades. But
seldom is the question with which I started this paragraph
addressed. Several answers to that question have been
proposed. Richard Cox's book embodies one of them.
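
For concreteness, the two conventional rules in question can be stated as follows (my notation, with H the conditioning hypothesis):

```latex
% Product rule: the probability of a conjunction
P(A \wedge B \mid H) = P(A \mid H)\, P(B \mid A \wedge H)
% Sum rule: a proposition and its negation exhaust the possibilities
P(A \mid H) + P(\lnot A \mid H) = 1
```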

Many writers on foundations of statistical inference are
callously imprecise about the kind of topic dealt with
in this book. Cox is their antipode, writing not only
clearly, but supremely efficiently, beautifully, perhaps
sometimes poetically, about functional differential
equations and about delicate philosophical questions.

Cox also deals with the relationship between entropy and
distributive lattices. Shannon entropy is to distributive
lattices as probability is to Boolean algebras. I do not
think Cox was familiar with standard work on lattice theory.
He never uses the word "lattice," nor other standard
lattice-theory nomenclature.

Rating: 5 stars
Summary: Like a ten-pound textbook, in only 130 pages
Review: This book contains the fundamental argument justifying Laplace's original theory of probability. Laplace justified his theory by basic intuitive considerations, which left it open to attack on philosophical grounds. Here, R. T. Cox shows how Laplace's theory is the logical consequence of two very simple, almost unavoidable axioms.

Cox begins the book by discussing his axioms, and then expressing them as functional equations. The solution of these functional equations develops the theory to the point at which Laplace began his own development.
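
To sketch what such a functional equation looks like (a paraphrase in my own notation, not Cox's exact presentation): if the plausibility of a conjunction is assumed to depend only on the plausibilities of its parts, then the associativity of conjunction constrains the combining function F:

```latex
% Assumption: plausibility of a conjunction depends only on the parts
(A \wedge B \mid H) = F\big[(A \mid H),\, (B \mid A \wedge H)\big]
% Since (A \wedge B) \wedge C = A \wedge (B \wedge C), F must satisfy
F\big[F(x, y),\, z\big] = F\big[x,\, F(y, z)\big]
% The regular solutions can be rescaled so that F(x, y) = x y,
% which is exactly the product rule of probability.
```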

(In general, the probability of a proposition is conditional on the truth of some other proposition. An item of particular interest here is that while most Bayesian expositions call this a priori true proposition "prior information", Cox calls this proposition the "hypothesis". This term seems to me to be more sensible, because we are rarely absolutely certain about our prior information. We take our "prior information" to be true, not because we are certain it is true, but as a conjectural point of departure for the subsequent calculation.)

Cox continues the development of the theory by relating the notion of probability to information entropy. He gives a definition for systems of propositions and shows how entropy is related to the uncertainty as to which of the propositions in the defining set of the system is true. (By hypothesis, at least one proposition in the defining set is true.)
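
As a small illustration (my own example, not taken from the book): for a system whose defining set consists of mutually exclusive, exhaustive propositions with probabilities p_i, the entropy is -Σ p_i log p_i, and it measures exactly the uncertainty about which proposition is true.

```python
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a system of mutually exclusive,
    exhaustive propositions with probabilities `probs`."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(-p * log2(p) for p in probs if p > 0)

# Uncertainty is greatest when no proposition in the defining set is favored...
print(entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0
# ...and vanishes when one proposition is certain.
print(entropy([1.0, 0.0, 0.0, 0.0]))      # -> 0.0
```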

Cox finishes the book with a section on expectation. He shows here how the theory he has developed encompasses all of the standard results of expectations found in other theories of probability.
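
The standard results he recovers can be sampled in miniature (my own toy example): the expectation of a quantity over a system of propositions is its probability-weighted average, and it is linear.

```python
def expectation(probs, values):
    """Probability-weighted average of `values` over a system of
    mutually exclusive, exhaustive propositions with probabilities `probs`."""
    assert len(probs) == len(values)
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * v for p, v in zip(probs, values))

# A single coin toss as a two-proposition system (0 = tails, 1 = heads):
print(expectation([0.5, 0.5], [0, 1]))  # -> 0.5
# Linearity, a standard result: E[2X + 1] = 2 E[X] + 1
print(expectation([0.5, 0.5], [1, 3]))  # -> 2.0
```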

This book looks deceptively thin, but packs the punch of a ten-pound textbook. It requires multiple passes (or, perhaps, one pass, closely read) in order to get all of the information out of it. It is definitely an exposition of an algebra, that is to say, an abstract symbolic method of calculation. Sometimes Cox gives concrete examples to illustrate the abstract reasoning, and sometimes he doesn't. Where he doesn't, the reader is left to puzzle out the concrete consequences of the abstract reasoning. I'm not sure if this is good or bad, but I'm leaning towards good, even though it does make my brain hurt.

Rating: 5 stars
Summary: Degrees of belief as an extension of Boolean logic
Review: This is a great, great book that I'm absolutely ecstatic to see back in print. I was introduced to it when I was in graduate school (mathematics) and rooming in the house of a physics professor who swore by Richard Threlkeld Cox's account of subjective probability. I haven't had a copy of it in my hand for nearly twenty years; I happened across this page today and ordered it at once. So pardon me while I gush:

What Cox accomplishes in this deceptively slim volume is amazing. He places Bayesian probability theory on an axiomatic foundation, as a natural extension of Boolean logic, identifying probabilities with degrees of subjective belief in propositions rather than directly with frequencies of events (though he also argues that the subjectivist interpretation accords with the frequentist interpretation whenever the latter makes sense at all).

Essentially, he shows that the ordinary laws of probability theory are normative laws of thought that apply to degrees of belief in propositions, and that we have to conform to them if we want to think consistently.
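
That consistency requirement has a familiar concrete face: expanding the product rule in either order and equating the two gives Bayes' theorem. A toy numerical check (my own figures, not from the book):

```python
# The product rule P(A and B | H) = P(A|H) P(B|A,H) can be expanded in
# either order; equating the two expansions yields Bayes' theorem.
p_a = 0.01             # P(A | H): prior degree of belief in proposition A
p_b_given_a = 0.9      # P(B | A, H): evidence B is likely if A is true
p_b_given_not_a = 0.05 # P(B | not-A, H): B is unlikely otherwise

# Sum rule over the two ways B can occur gives P(B | H):
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem, rearranged from the product rule:
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # -> 0.1538
```

Even strong evidence lifts a small prior only modestly, which is the kind of quantitative discipline the normative laws impose.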

If you like math and logic books, you'll find this one eminently readable; I haven't seen it in years and yet I still remember the stunning clarity of Cox's rigorous exposition.

This is the book that originally sold me on Bayesianism. If you have any interest in this subject at all, grab this one while it's available.



© 2004, ReviewFocus or its affiliates