Rating: Summary: Rich & Valuable Review: This book aims at a rigorous and deep treatment of statistical learning and is divided into three parts: (I) THEORY OF LEARNING AND GENERALIZATION; (II) SUPPORT VECTOR ESTIMATION OF FUNCTIONS; (III) STATISTICAL FOUNDATION OF LEARNING THEORY. Readers intending to dive into this topic will find the task far from simple when exploring this mathematical exposition, because of the mature nature of the underlying theory. To gain the most benefit, interested and even involved researchers should bring a vast and solid mathematical background. I think the book constitutes a respectful and organized 'exhibition' that you will not find in any other place. Although there are excellent books discussing SVMs and machine learning/intelligence, they all eventually emanate from this theory. Regarding the rating, the book was not rated on how many concepts you retrieve, but on how well its propositions offer a precious appreciation of the underlying theory. In other words, this book is not the place for first-time learning, but it serves as a bridge between interrelated elements of an incredibly growing area. For the book "The Nature of Statistical Learning Theory", also by Vapnik, you can find a review by Vladimir Cherkassky in IEEE Transactions on Neural Networks, Vol. 8, No. 6, November 1997.
Rating: Summary: new approach to inference based on VC dimension Review: Vapnik and Chervonenkis extended the Glivenko-Cantelli theorem in their work on classification and statistical learning. In recent texts Vapnik has described a form of nonparametric statistical inference based on approximating functions and the Vapnik-Chervonenkis dimension. In an earlier book published by Springer-Verlag he develops the basics of the theory; however, to keep the mathematical level accessible to computer scientists and engineers, he avoided the proofs needed for mathematical rigor. This text is an advanced one that provides the rigorous development. Although the preface and chapter 0 give the reader an idea of what is to come, the rest of the text is difficult reading. The theory has been quite successful at attacking the pattern recognition/classification problem and provides a basis for understanding support vector machines. However, Vapnik sees a much broader application to statistical inference in general when the classical parametric approach fails. If you have a strong background in probability theory, you should be able to wade through the book and get something out of it. If not, I recommend reading section 7.9 of "The Elements of Statistical Learning" by Hastie, Tibshirani, and Friedman. That will give you an easily understandable view of the VC dimension. Also, sections 12.2 and 12.3 of their text will give you some appreciation for support vector machines and the error rate bounds obtainable for them based on the VC dimension.
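To give a flavor of the VC-dimension-based error rate bounds mentioned above, here is a commonly cited form of the generalization bound (stated loosely, and not quoted from the book): for a classifier drawn from a function class of VC dimension $h$ and trained on $l$ examples, with probability at least $1 - \eta$,
$$ R(\alpha) \;\le\; R_{\mathrm{emp}}(\alpha) + \sqrt{\frac{h\left(\ln\tfrac{2l}{h} + 1\right) - \ln\tfrac{\eta}{4}}{l}}, $$
where $R(\alpha)$ is the true (expected) error rate and $R_{\mathrm{emp}}(\alpha)$ is the empirical error rate on the training set. The key point is that the bound depends on the capacity measure $h$, not on the number of parameters, which is what makes it applicable to support vector machines.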