Perceptrons - Expanded Edition: An Introduction to Computational Geometry

List Price: $35.00
Your Price: $35.00
Reviews

Rating: 5 stars
Summary: Deja vu?
Review: In 1958, Cornell psychologist Frank Rosenblatt proposed the 'perceptron', one of the first neural networks to become widely known. A retina sensory layer projected to an association layer made up of threshold logic units, which in turn connected to a third layer, the response layer. If two groups of patterns are linearly separable, the perceptron network learns to classify them into separate classes. In this book, Minsky and Papert show that, assuming a diameter-limited sensory retina, a perceptron network cannot always compute connectedness, i.e., determine whether a line figure is one connected line or two separate lines. Extrapolating these conclusions to other sorts of neural networks was a big setback for the field at the time. However, it was subsequently shown that adding a 'hidden' layer to the network overcomes many of these limitations.

This book figures prominently in the field of neural networks and is often cited in modern works. Of even greater significance, the history of the perceptron demonstrates the complexity of analyzing neural networks. Before this book, artificial neural networks were considered terrific; after it, limited; and then in the 1980s, terrific again. At the time of this writing, however, it is recognized that despite their physiological plausibility, artificial neural networks do not scale well to the large or complex problems that brains handle easily, and artificial neural networks as we know them may actually be not so terrific.
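The linear-separability point above can be seen directly in code. The sketch below (my own illustration, not taken from the book) trains a single threshold unit with the classic perceptron learning rule: it converges on AND, which is linearly separable, but can never fit XOR, since no single line separates XOR's two classes.

```python
def train_perceptron(samples, epochs=50, lr=0.1):
    """Train one threshold logic unit with the perceptron learning rule.
    samples: list of ((x1, x2), target) pairs with 0/1 targets."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), t in samples:
            y = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = t - y                # 0 when correct; +/-1 otherwise
            w[0] += lr * err * x1      # nudge weights toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # linearly separable
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # not separable

w, b = train_perceptron(AND)
print([predict(w, b, x) for x, _ in AND])  # converges to [0, 0, 0, 1]

w, b = train_perceptron(XOR)
print([predict(w, b, x) for x, _ in XOR])  # never matches all four XOR targets
```

The perceptron convergence theorem guarantees the AND case finds a solution in finitely many updates; the XOR case is exactly the kind of limitation Minsky and Papert analyzed, and it disappears once a hidden layer (trained, e.g., by backpropagation) is added.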

Rating: 5 stars
Summary: Seminal AI book
Review: This is a seminal work in the field of Artificial Intelligence. Following an initial period of enthusiasm, the field encountered a period of frustration and disrepute. Minsky and Papert's 1969 book summed up this general feeling of frustration among researchers by demonstrating the representational limitations of perceptrons (as used in neural networks). Their arguments were very influential in the field and were accepted by most without further analysis.

I found this book to be generally easy to read. Despite being written in 1969, it is still very timely.


© 2004, ReviewFocus or its affiliates