Rating: Summary: COMPUTATIONS OF THE MIND Review: Noting his background in computer science, some classify Eric Baum among those who believe that ``our souls are software'', but this is not quite fair. Although he states that ``the obvious inability of present-day computer science to account for [the brain's behavior] is no reason at all for doubting that they can be accounted for by computer science,'' the intellectual perspectives of What is Thought? are broader than this assertion seems to suggest. The book begins with several interesting chapters on the nature of computation (I particularly liked the presentation of the traveling-salesman problem), which include discussions of the importance of making decisions at the level of semantics, the Turing test, properties of neural nets, and hill climbing in a fitness landscape, among other relevant topics. These discussions lead into the author's central thesis that the mind, like all efficient computer programs, is necessarily modular. In other words, each aspect of the brain's dynamics comprises several subroutines, which presumably can be further broken down into hierarchical structures of nested activities, and he discusses several permutations of this important concept.

Curiously, Baum's otherwise comprehensive list of references does not include Donald Hebb's seminal and classic work, in which the notion of ``cell assemblies'' (dynamically self-sufficient modules of neurons) was first suggested over a half-century ago. As a psychologist, Hebb aimed to ``bridge the long gap between the facts of psychology and those of neurology,'' and, coming at about the same time as the development of the digital computer, his formulation has provided the basis for many numerical studies, beginning in the 1950s and continuing to the present day, that are in accord with a growing body of electrophysiological data. Setting this quibble aside, Baum offers compelling psychological evidence for the modular structure of mind and provides his readers with an interesting and informative account of how the structure of our thinking may have developed over the course of biological evolution, with particular attention paid to computational constraints on the development of learning mechanisms. Importantly, his perspectives are broader than those of many of his colleagues, as he asserts that the ``whole program'' of a brain's dynamics includes the ``complex society'' in which it is embedded. Indeed, the author's evident humility in the face of the awesome intricacy of mental activity is, to me, one of the more appealing aspects of What is Thought?

The often-suggested possibilities for quantum computation are discussed in some detail, along with an analysis of the widely noted example of ``Schrödinger's cat'', which was originally proposed to emphasize the difficulties of applying ideas developed for atomic dynamics to complex macroscopic systems. Considering that a quantum computer---if it is at all possible to construct one---must be carefully isolated from structural irregularities and operated near absolute zero, Baum joins the majority of physical scientists in concluding that it is ``highly unlikely that quantum computation is relevant to the mind.''

Eric Baum has a dog, and---like most of us dog owners---he is convinced that his pet is conscious, but he goes on to assert that ``we do not need to posit new qualitative modes of thinking to explain human advance over animals.
To my mind, the difference between human intelligence and animal intelligence is straightforwardly explainable by cumulative progress once there is the ability to communicate programs.'' Here, again, Baum could profit from reading Hebb's book, which contains but a single mathematical expression, namely A/S. This parameter represents the ratio of the associative area (A) of a mammalian neocortex to its sensory area (S), and it becomes greater as one progresses from rats through dogs to humans. Of course, these relative differences may be examples of the ``cumulative progress'' to which Baum refers. Alwyn Scott http://personal.riverusers.com/~rover/
Rating: Summary: Thought deconstructed... Review: Rapid progress is now being made in the field of neuroscience, and this progress is not merely in theory but also in laboratory measurements, thanks to advances in magnetic resonance imaging. Advanced and practical applications of artificial intelligence are also now a reality; indeed, applications of artificial intelligence in the business environment are skyrocketing, and there is every indication that this will continue. Still, the nature of human thought remains somewhat of a mystery, which is a kind of irony, given that intelligence is imputed to humans even without our fully understanding what is really going on in the human mind when it is engaged in problem solving, reasoning, planning, or myriad other activities. We do have non-human intelligent machines, but most do not consider them truly intelligent, for the sole reason that we can understand the nature of their problem-solving abilities. Will we then continue to view the human mind as exhibiting intelligence once we have deciphered its workings?

This book gives many different insights into the problem of human thinking and just what its origins are. Although written for the "popular" audience, much can be gained from reading it regardless of the reader's background. It does frequently indulge in speculation and reasoning by analogy, and it skirts the ill-defined boundaries of philosophy, but it is worth taking the time to read in detail. The author has a very specific view of intelligence, as is readily apparent when he remarks that human intelligence has the ability to "understand" in many domains. Machines, in his view, do not have this ability; they are "brittle" and cannot tackle different problems on the fly the way humans can. But does "understanding", as we frequently impute it to humans, have to accompany successful problem solving? Why do we insist on the requirement of "understanding" when we characterize an entity as intelligent? And is the ability to answer questions from many domains or contexts, however vaguely they are presented, really indicative of intelligence or understanding? Clarifying the notion of "understanding" is one of the stated goals of the book. The author asks whether there is some "quantity called understanding" that will serve to distinguish mechanical computation from thought. He shows great insight in bringing this question to light, as it has long been a prejudice that machines merely engage in syntactical manipulation and are unable to deal with the "semantics", or the "meaning", behind the symbols.

The thesis of the author is very straightforward, namely that Occam's razor, as he defines it, serves as the foundation for human reasoning and the mind. The criterion of simplicity is formulated using Kolmogorov complexity, currently the most popular such measure, at least in the computer science community. Most interesting, though, is the author's view on compression: the human mind functions by running what are essentially compressed programs. Compression, to him, is the key to understanding; in fact, it is equal to it. Compressed descriptions are the origin of understanding, and the human brain has, through evolution and reinforcement learning, acquired very adept programs for a myriad of tasks relevant to human survival. The author's arguments are interesting, but he frequently argues by analogy rather than backing his claims up with empirical research.
More use must be made of the research in neuroscience and psychology before claims can be made about the functioning of the human brain. Too much philosophical discussion has invaded the author's arguments, and this weakens his case in many places in the book. It is the opinion of this reviewer that those engaged in research in artificial intelligence, neuroscience, and closely related fields should declare a moratorium on philosophical speculation and argumentation. The conceptual spaces generated by philosophical speculation are too large to be practical: they contain too much information, which expands constantly over time because of the lack of side constraints, or "inductive bias."

Since the efficacy of the human brain is the result of evolutionary pressures, one would naturally ask what role the genetic code plays. The author answers this question in very simple terms: the mind arose from the training of a compact program, encoded in DNA, with evolution serving as the training process. Four billion years of this training were compacted into an expression residing in the DNA. This viewpoint is an interesting one, and it sounds very plausible, but again, it still needs to be supported with empirical evidence.

Much use is made in the book of results from computational learning theory because of the author's belief that inductive bias is crucial to learning in complex environments. "Inductive bias", as he and researchers in computational learning theory view it, is a preference for learning one concept rather than another. Certainly it is true, and it has been shown by research in computational learning theory, that inductive bias is useful in pruning the search space and can help discard information that is not pertinent to the problem at hand. However, the author's view of the role of inductive bias is much stronger: he claims that it is absolutely essential for learning in complex environments and, therefore, that other approaches to learning in such environments will not be as efficacious. His views are thus at odds with certain "no free lunch" results in computational learning theory concerning randomized algorithms (of which reinforcement learning is one). The author is claiming, perhaps without meaning to, that learning algorithms that incorporate inductive bias give essentially a free lunch. The only way out of this difficulty might be to acknowledge that the learning processes used by the brain are still subject to evolutionary pressure, and hence that the processes now in use are not yet optimized and are undergoing modification (however slowly).
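As a concrete, if toy, illustration of the inductive-bias point above, consider the following sketch (mine, not Baum's). Kolmogorov complexity itself is uncomputable, so the code uses a crude description-length proxy: the fit error of a polynomial model plus a penalty on its number of parameters. The data, the function names, and the penalty weight are all illustrative assumptions, not anything taken from the book.

```python
# Toy sketch of Occam's razor as a description-length trade-off (not from the book).
# Polynomials of increasing degree are fit to noisy data generated by a straight line;
# each model is scored by (residual error) + (penalty per parameter) * (number of parameters).
# The penalty term plays the role of an inductive bias toward compact hypotheses.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 30)
y = 1.5 * x - 0.5 + 0.1 * rng.standard_normal(x.size)  # true model is degree 1

def description_length(degree, penalty_per_param=2.0):
    """Crude two-part score: sum of squared residuals plus a parameter-count penalty."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    return float(np.sum(residuals ** 2)) + penalty_per_param * (degree + 1)

scores = {d: description_length(d) for d in range(6)}
best_degree = min(scores, key=scores.get)
print({d: round(s, 2) for d, s in scores.items()})
print("selected degree:", best_degree)  # the simplicity penalty favors the compact model
```

With the penalty removed, the highest-degree polynomial always wins on fit alone; that is the "free lunch" worry raised above, since without some bias nothing prefers a hypothesis that generalizes over one that merely memorizes the data.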
Rating: Summary: A thoughtful book about thought Review: The question of how we humans developed our ability to think has fascinated humankind for millennia. In this book, Baum presents a synthesis of recent developments in several branches of science (genetics, artificial intelligence, etc.) along with his own unique insights. This is a fascinating look into the workings of the brain and how the way we think may be "programmed" by our DNA. The book is very readable by the educated layperson.
Rating: Summary: The profound made simple Review: The review published in Nature does a better job than I could, so I'll excerpt it.
"'What is thought?' is not a new question. For Aristotle, thought was what the soul does, and for Descartes it was the unequivocal evidence of one's existence. For Eric Baum, a US expert in machine learning, it is a computer program. This is not a superficial assertion: Baum pursues the idea with elegance, clarity, and considerable pursuasion...
"It is important not to treat the idea that thought is a program in too superficial a fashion. Popular texts often include explanations such as `brain is like hardware and mind is like software.' Baum intends a level of sophistication far above this. Thought for him is the process that 'understands' the complexities of the world...
"Baum's central point is that it is possible for programs to evolve, adapt, and learn, making them more powerful than anything that a programmer can concoct...
"Baum gives a reasoned response to John Searle's claim that no program can 'understand' the world, and to Roger Penrose's contention that conscious insight lies outside the logic that can be achieved by computation...
"this is a splendid book for discovering what is new. It will enthrall some computer scientists and provoke some philosophers. And it should engage general readers who wish to enjoy a clear, understandable description of many advanced principles of computer science."
Rating: Summary: This book is very poor Review: This book is full of little more than hackneyed and vague slogans such as "the mind is highly evolved", "the simplest theories are the best", "the mind is modular", "the program in your mind maintains a compact description of the world", "the mind is programmed by the DNA", etc. It's one of the most annoying books I have come across in quite a while, especially given the incomprehensibly positive reviews printed on its cover and the self-promotion of its author.
Rating: Summary: What Is What Is Thought? Review: Those who are not yet convinced that the brain is a computing mechanism, or who believe that mysticism is required to explain thought, will find quite a bit of value in this book. The book surveys numerous areas of Computer Science, AI, and even a bit of biology, in an attempt to build a case for the brain as a computing mechanism. The book also wades into evolution to try to explain how it came to be so. The scope of the book is ambitious. Anyone with a background in AI or Cognitive Science will likely find "What is Thought?" disappointing, as it has little new to say. I fall into this category, and I find a number of aspects of this book unsatisfying. This is a long book in which there is a short book struggling to get out. The author's main thesis, that the brain is a modular computing mechanism that is the result of evolution, is repeated numerous times at considerable length, to the point of tedium. While the author shows his thesis to be consistent with numerous observations, it is never developed to any greater depth. In fact, one of the author's conclusions is that we may never understand the inner workings of the brain's "subroutines" because, as a result of evolution, they are now so "compressed". The author rarely defines his terms. Merely replacing the words "compressed" and "compact" by the word "concise" would enhance the clarity of this book considerably. The author also seems to be of the opinion that generalization, which is the result of "compressed" representations, is the essence of understanding. This view is inadequate for explaining our abilities to plan our own actions and predict the actions of other agents, for example. Because of the informal, breezy style, the book comes across as an introduction for novices or a position paper rather than a scholarly work. While some may enjoy this style, I find it lacks the clarity and crispness needed for a convincing presentation of such an abstract topic.
Rating: Summary: Is Evolution The Secret To Intelligence? Review: Why can humans rapidly carry out tasks, such as learning to talk or recognizing an object, that seem intractable for computers? According to Eric Baum, the human brain is much like a computer, but it runs programs that are different from the ones usually written by human computer programmers. The programs run by the brain are insightful or ``compressed''; they have a good deal of knowledge or ``understanding'' about the nature of the world built in. Human programmers have difficulty generating such efficient or compressed programs (except for limited special purposes), because to do so requires vast computing resources, far beyond what one can accomplish with pencil and paper or even with presently available computer assistance.

The key to understanding intelligence, according to Baum, is the theory of evolution; in the process that brought humans into being, evolution cycled through many billions of generations of organisms, in the course of which, in effect, vast computational resources were brought to bear on the problem of generating useful algorithms. The real secret to thought is thus stored in our DNA, which preprograms us with algorithms that are more efficient and powerful than the ones usually available to computer scientists. With this starting point, Baum proposes answers to many old riddles. Our sense of ``self'' reflects our origin in an evolutionary struggle for survival toward which all components of our biology are directed. ``Free will'' is a useful approximation because of the great complexity of our brains (and our limited knowledge about them) and the concomitant difficulty of predicting a person's behavior.

Baum illustrates his arguments with numerous examples drawn from biology, psychology, and computer science; the material is generally quite interesting, though at times perhaps too detailed for a casual reader. His arguments are surprisingly persuasive, and, while I am certainly no expert, I suspect that Baum is closer to the mark than most of the old and new classic writers on these problems.
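The evolutionary argument summarized in this review can be made tangible with a minimal sketch of evolution as a search over bit-string ``genomes'': random mutation plus selection climbs a fitness landscape, and the surviving genome is a compact record of a very large amount of trial-and-error computation. This is a toy (1+1) evolutionary algorithm of my own, not anything from the book; the target pattern, mutation rate, and genome length are arbitrary illustrative choices.

```python
# Toy (1+1) evolutionary algorithm over bit-string "genomes" (not from the book).
# Mutate the current genome; keep the mutant only if it is at least as fit.

import random

random.seed(1)
GENOME_LENGTH = 40
TARGET = [random.randint(0, 1) for _ in range(GENOME_LENGTH)]  # stands in for "useful structure"

def fitness(genome):
    """Toy fitness: number of bits that match the target pattern (unknown to the search)."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    """Flip each bit independently with a small probability."""
    return [1 - g if random.random() < rate else g for g in genome]

genome = [random.randint(0, 1) for _ in range(GENOME_LENGTH)]
for generation in range(2000):
    mutant = mutate(genome)
    if fitness(mutant) >= fitness(genome):  # selection keeps the fitter (or equally fit) variant
        genome = mutant
    if fitness(genome) == GENOME_LENGTH:
        print("perfect fitness reached at generation", generation)
        break

print("final fitness:", fitness(genome), "of", GENOME_LENGTH)
```

The point of the analogy, as described above, is scale: the genome that survives such a search encodes, very compactly, the outcome of an enormous amount of computational work, which is roughly the role Baum assigns to DNA in shaping the brain's learning machinery.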