Rating: ![3 stars](http://www.reviewfocus.com/images/stars-3-0.gif) Summary: I ended up returning it... Review: After reading the reviews I was really looking forward to reading this book, but ended up a bit disappointed. The editorial review and introduction led me to believe that, while there was difficult material in the book, I would be able to make my way through most of it. I came at the book with a computer science background (and no prior neural network experience) and found the material rather difficult to follow. The statistics and math needed to really follow the book were more than I expected. This doesn't mean the book is bad. After skimming through it a couple of times I really believe that the other reviewers are right -- this is a great resource on neural networks. However, just be sure you have the appropriate background to really get the most out of it. If you are looking for an introductory book on neural nets, or are a little rusty on your statistics and math, I would recommend looking elsewhere.
Rating: ![2 stars](http://www.reviewfocus.com/images/stars-2-0.gif) Summary: Didn't get anything out of it. Review: After sitting down several times and attempting to learn something from Ripley's Pattern Recognition book, I came away frustrated each time. I wish Ripley were a better author. From his writing you can see he knows a lot about neural networks, but he cannot relate it to the reader. The text is VERY heavy in mathematical formulas (about 1/3 page of pure math per page). Another third of the book is references to other papers (there are 35 pages of references; Ripley cites about 1000 different papers). That doesn't leave a lot left over for the reader. I suggest this book only for those already familiar enough with neural nets to have their papers cited by Ripley. One thing did surprise me: there is one page with color, describing clustering (I think). I almost died laughing. I showed it to other stat friends familiar with Ripley and we chuckled.
Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: The inner workings of a neural net Review: I concur with the other reviewers. This book requires the reader to be proficient in many different disciplines. It is extremely difficult to digest if you lack an in-depth background in statistics (Bayesian theory, etc.), calculus, and advanced algebra. Many sections of this book were used as part of Ripley's graduate courses taught at Cambridge, where he is still a professor of applied statistics. Where I part company with many of the reviewers is that I will not penalize this book for going over my head at times. It is intended for graduate students in statistics or computer science. The neural network section explains the workings of NNs that are typically hidden from users of NNs in software packages. In some cases a click of a button is all that is needed to do what is explained in considerable depth in this tome. It can be very useful to fully understand what has happened when a program switch is altered. This prevents using an NN and receiving a naive result that makes unfounded predictions.
Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: advanced and important work Review: If you want a nice up-to-date treatment of neural networks and statistical pattern recognition with lots of nice pictures and an elementary treatment, I recommend the new edition of Duda and Hart. However, neural networks were basically started by the computer science / artificial intelligence community using analogies to the human nervous system and perceived connections to human thought processes. These connections and arguments are weak. However, a statistical theory of nonlinear classification algorithms shows that these methods have nice properties and have mathematical justification. Statistical pattern recognition research is well over 30 years old and is very well established. So these connections are important for putting neural networks on firm ground and providing greater acceptability from the statistical as well as the engineering community. Ripley provides a theoretical treatment of the state of the art in statistical pattern recognition. His treatment is thorough, covering all the important developments. He provides a large bibliography and a nice glossary of terms in the back of the book. Recent papers on neural networks and data mining are often quick to generate results but not very good at providing useful validation techniques that show that perceived performance is not just an artifact of overfitting a model. This is an area where statisticians play a very important role, as they are keenly aware, through their experience with regression modeling and prediction, of the crucial need for cross-validation. Ripley covers this quite clearly in Section 2.6, titled "How complex a model do we need?" It is nice to see the thoroughness of this work.
For example, in error rate estimation, many know of the advances of Lachenbruch and Mickey on error rate estimation in discriminant analysis and the further advances of Efron and others with the bootstrap. But in between there was also significant progress by Glick on smooth estimators. This work has been overlooked by many statisticians probably because some of it appears in the engineering literature (but one important paper was in the Journal of the American Statistical Association [JASA] in 1972). To some extent this oversight may be due to the fact that it was not mentioned in Efron's famous 1983 JASA paper and hence is usually missed in the bootstrap literature. Bootstrap methods and cross-validation play a prominent role in this text. This is an excellent reference book for anyone seriously interested in pattern recognition research. For applied and theoretical statisticians who want a good account of the theory behind neural networks it is a must.
Rating: ![4 stars](http://www.reviewfocus.com/images/stars-4-0.gif) Summary: not for the faint of heart, but such a pleasure to read Review: Let me start by saying that this book assumes a lot of background, especially in statistics. It dives into the math right away without even a hint or a gentle slope. But what I appreciate is that math is never used for its own sake; it is always justified. The book starts with an introduction to the problems neural nets are to be applied to -- the pattern recognition task. It proceeds to the elements of statistical decision theory, then goes up to linear discriminant analysis and perceptrons, then up you go to feed-forward neural nets. Non-parametric models and tree-based classifiers are covered next. Belief networks and unsupervised methods (MDS, clustering, etc.) follow. Coverage is extensive, although I would like to see more in the areas of unsupervised learning, which is quite foundational to the whole business. What sells me on this book, quite frankly, is that it always keeps an eye on a real-world example. No model or algorithm is introduced without a real-world problem it was intended to solve. You would be better served by the Bishop book (Neural Networks for Pattern Recognition, by C. Bishop, ISBN 0198538642) if you are looking for a quick introduction. I would say Ripley's book is the perfect second book on the subject. I must applaud the editors and designers of the book. The book itself, apart from the material it covers, is an aesthetically most pleasant creation for a somewhat dry subject. Its use of margins is a piece of art -- margins are wide and accessible, important points are highlighted there, and you can get to the needed point by flipping the pages quickly. The quality of paper is very good, the book opens well, and it holds its form very well. If you take it seriously and use it often, these qualities will gain in importance.
Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: A definite must have Review: Neural networks and the range of other techniques to come out of AI have never been given the statistical treatment that they need. Brian Ripley has created a book for a statistician such as myself that is informative and complete. It is interesting to see that many of the more traditional multivariate analysis techniques can be at least as good as, and often better than, some of the new AI versions. This book is a must have: if NNs and aligned technologies are to advance, having some statistical measure of efficiency is essential.
Rating: ![4 stars](http://www.reviewfocus.com/images/stars-4-0.gif) Summary: This book is a Rosetta stone into neural networks Review: This text has an extensive development of neural networks from a strong statistical basis. For anyone wanting a quick way to access the broad spectrum of literature covering neural networks and find the seminal papers, thoughts, and developments of the field, the literature references are worth the price. This is essentially a literature survey, not a "how-to" book. It is not excessively heavy on the mathematics, but it uses verbiage to enhance the math that is necessary for such a topic. It handles a number of significant but often overlooked issues, such as the need for an ordering scheme of the partial derivatives in backpropagation. Most authors don't address the obscure but important points that will make or break your work if you aren't aware of them. Ripley makes the reader cognizant of where the minefields lie. This book is a Rosetta stone into neural networks.
Rating: ![5 stars](http://www.reviewfocus.com/images/stars-5-0.gif) Summary: A synthesis, not an introduction Review: This text is wonderful. As some have pointed out, it might not be the best introduction to statistical pattern recognition and classification. Not every text should strive to be introductory, however, and this work shines for other reasons. The true strength of the book is its synthesis of material from diverse domains in a single text. My experience has been in the realm of statistics, and it was insightful to find that neural network approaches share much with traditional classification and discrimination techniques. I find the book enlightening not so much because it explains a given technique in great detail, but because it explains how a number of techniques are related to and differ from one another. In this sense, it has opened up a whole new world of approaches to problems I encounter that I had previously deemed inapplicable because they were "AI engineering techniques" or some such thing. If you want to learn about the details of a particular approach to pattern recognition -- e.g., ICA, Kohonen maps, SVMs, etc. -- find a different text. If you want an overview of the field of pattern recognition, however, buy this text. It provides a comprehensive, integrative perspective on classical and modern techniques from a number of disciplines. In fact, I would recommend this text as a complement to a more detailed text on a given pattern recognition technique -- the one will fill in the details Ripley necessarily skims, and Ripley will explain how the technique is related to everything else.