Parallel Distributed Processing: Explorations in the Microstructure of Cognition: Foundations (Parallel Distributed Processing)
List Price: $100.00
Your Price: $100.00
Description:
This two-volume work is now considered a classic in the field. It presents the results of the Parallel Distributed Processing (PDP) group's work in the early 1980s and provides a good overview of earlier neural network research. The PDP approach (also known as connectionism, among other names) is based on the conviction that many aspects of cognitive activity are best understood in terms of massively parallel processing. The first volume lays out the general framework and continues with an analysis of learning mechanisms and of the mathematical and computational tools important in the analysis of neural networks. The chapter on backpropagation is written by Rumelhart, Hinton, and Williams, who codiscovered the algorithm in 1986. The second volume is written with a psychological and biological emphasis, exploring the relationship of PDP to various aspects of human cognition. The book is a comprehensive research survey of its time, and most of its results and methods remain at the foundation of the neural network field.