Rating: Summary: A Fun Read. Review: An entertaining book that will be enjoyed by anyone interested in mathematical logic or computation theory. Davis weaves history, anecdote, and mathematics into an exciting sketch of the major developments in mathematical logic and their role in the development of the computer. He does a commendable job in explaining the mathematics in an accessible fashion, without distorting it by over-simplification. A good book for people new to the field as well as those already familiar with these stories.
Rating: Summary: Thoroughly enjoyable Review: Anyone who wants to understand computers should read this book. It describes the intellectual pre-history of the computer and blends biographical vignettes with the technical information. You'll learn a lot, and enjoy it too!
Rating: Summary: A history of the underlying mathematical concepts Review: As a recent college graduate with a B.S. in computer science, I thought this book provided some good background information on the people who worked to discover the underlying principles of automated mathematics implemented in a machine. The book was, for the most part, not terribly difficult to follow and gave more insight into the actual history of the individual people and their times than I thought it might. Moreover, the individual histories and time context put the points being made into a better framework. It is not a long book; I recommend it to the more intellectual type rather than the occasional reader.
Rating: Summary: An Excellent Overview Review: I thought that this book was an excellent overview of the development of logical thought and its relevance to the modern computer. Davis does a superior job of energizing a subject that is admittedly a little dull. I found myself rereading several of the sections to try to better understand some of the math involved, but overall, I think Davis found a nice balance between the complexity of the math and the history of logic. My one serious criticism of the book is that I found the chronology tough to follow, and I often found myself referring back to previous chapters to get a better sense of when events were happening. It is natural to assume that a book like this is presented in chronological fashion. The Universal Computer generally is presented that way, but some events happen more or less simultaneously, and this matters to an overview of the history of the field. I think the book could actually use a graphical timeline with the birth dates of the mathematicians and the significant events involved (e.g., 1902: Russell's letter to Frege). Other than that, the book is informative and enjoyable for those interested in the origins of the modern computer.
Rating: Summary: Thanks for this book, Professor Davis Review: In 1972, during the conflict over "who invented the computer," I published a letter in ComputerWorld asking why Turing's contributions were being ignored. However, a letter from a long-haired kid working as a programmer at a small university isn't "knowledge" properly understood, and therefore I was gratified to get belated confirmation from the eminent Martin Davis.

Davis shows that Turing anticipated computation in general with his 1936 paper "On Computable Numbers, with an Application to the Entscheidungsproblem." But in addition, Turing's work in codebreaking during the war and his postwar work on the Pilot ACE showed a connection between the abstract Turing machine and the computer. Turing's work on the Enigma machine was still under official secrecy in 1972, but I'd seen the Pilot ACE in a collection of papers on computer architecture edited by Bell and Newell. It may still be open whether Turing invented the computer. But more important is that a reading of Turing shows that inside computing there are Two Cultures: "engineering" versus "logic."

In 1972, when I wrote my letter, the predominant computing culture was that of von Neumann. This was the "engineering" culture. In this culture software is subordinated as the "inferior term" of literary theory (using deconstruction, without apology): hardware is "male" and software is "female." Von Neumann was unamused by early programmers who suggested that the computer itself could transliterate mathematical formulae into code; he felt that the computer was too important for such a trivial task. The early computers were presented as Serious Iron..."male." Most managers in 1972 felt that programming was at best a marginal annoyance. But in his work on the Pilot ACE, Turing foresaw the need to take programming seriously.

To make the marginalized Serious is the ultimate marginal gesture, which is why American "literary theorists" seem to prefer recreational pursuits; Jacques Derrida asked an American graduate student to examine technology and she was offended. It is safer and more respectable to be Serious about Jane Austen. Like an Austen heroine, who would think of Mr. Napoleon as at best most inconvenient, literary theory seems uninterested in software, yet software provides a rich vein, in my view, of the interaction of the marginal and the Serious.

The late, distinguished computer scientist Edsger Dijkstra, in the 1950s in the Netherlands, informed his mentors that he wanted to be not a mathematician or physicist but a programmer, which dismayed them. They probably said, "You can't be serious." Like Glenn Gould, Edward Said, and other intellectuals of the immediate postwar period, Dijkstra's negative dialectic seems to have damaged his career somewhat. Glenn Gould refused to perform after about 1960, with the result that many musical professionals regard Gould, today, as an irritant. Edward Said refused to reconcile himself to exile and in some circles is scorned. Dijkstra's choices resulted, I think, in some lack of later recognition: his Turing Award was awarded by insiders while his work was of Nobel quality. Turing's refusal to conform, lifestyle-wise, similarly obscured his accomplishments. The "software" part of the Two Cultures of computing is partly to blame for its consistent failure, even today, to be heard. This is because the software Eloi refuse to join world intellectual traditions and so disempower themselves.
One example occurs when the software guy must step outside the box into the realm of what Marx would call his relations with his own kind, where "all that is solid melts into air." Davis finds Hegel incomprehensible. But it's possible that there are forms of life external to software in which Hegel makes sense. Hegel identified complete "self-identity" with nothingness. A software guy's expression would be "a brain in a vat," coding using pure logic innocent of the dialectic and subsisting on pizza provided by an employer (who lays our boy off a year later, showing that there is no escape from Hegel's world of lordship and bondage after all).

The predominance of hardware culture has brutalized several generations of programmers. The hardware view is the source of the ugly phrase "a simple matter of programming," which encapsulates the irrational subordination of the disfavored term. In consequence, programmers are held to rigid deadlines in which they have no say. In consequence, many software efforts, including using the computer to build one's own needed tools, have become termination offenses. This is a sexual politics. The irrational "business" necessity has become the elimination of any need to be conscious of logic as marginal, and the implication is that we can, if need be, do without the marginal term. But examination of the hardware shows that it is represented either as explicit "firmware" (programming instructions implementing the hardware in read-only memory) or as the result of a design process in which the hardware guys had to think like programmers after all. It is a sexual politics because the irrational demand takes dignity away from the marginal people in the name of a hypostatized rationality in a vat.

In part, I identify with the frustration of the engineering manager. I have heard the phrase "a line in the sand" more than once since the 1990s, when it originated in the senior Bush's campaign to liberate Kuwait (or something). The engineering manager, unaware of what Dijkstra called the cruelty of programming considered as applied logic and math, feels that the intellectual content of programming is nil and that appeals for more time therefore constitute slacking off. He needs to set boundaries. The engineering manager feels like Yogi Berra: "I would like to go back to college and study, but I would not study music because I already like music." Precisely because of logic's universality (Wittgenstein's "allumfassende, weltspiegelnde Logik," the all-embracing logic that mirrors the world), it becomes for the engineering male a form of driving: we all think we know how to drive because of the universal necessity of driving, and it is the other bozo who doesn't know how to drive.

Martin Davis has given programming respectability by linking it to an academic tradition. This is a start.
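(For readers to whom "the abstract Turing machine" is just a phrase, here is a minimal sketch of one in Python. The table encoding and the example program, a unary incrementer, are my own illustrative choices and are not taken from Davis's book or from Turing's paper.)

    # Minimal Turing machine simulator: a finite control table acting on an
    # unbounded tape. The machine reads a symbol, writes a symbol, moves the
    # head left or right, and changes state, until it reaches "halt".
    def run_turing_machine(program, tape, state="start", blank="_", max_steps=1000):
        """program maps (state, symbol) -> (new_symbol, move, new_state)."""
        cells = dict(enumerate(tape))      # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            new_symbol, move, state = program[(state, symbol)]
            cells[head] = new_symbol
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Example (hypothetical): append one "1" to a block of 1s (unary increment).
    program = {
        ("start", "1"): ("1", "R", "start"),   # scan right over the existing 1s
        ("start", "_"): ("1", "R", "halt"),    # write a 1 at the first blank, then halt
    }
    print(run_turing_machine(program, "111"))  # prints "1111"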
Rating: Summary: Logical roots of computers Review: This book traces the contributions of mathematical logicians to the development of modern-day computers. Its cast of characters begins with Gottfried Leibniz in the 17th century, continues with George Boole in the 19th century and Gottlob Frege and David Hilbert straddling the 19th and 20th centuries, and ends with Kurt Goedel, Alan Turing and John von Neumann in the 20th century. The author brings these great scientists to life by describing their works in the context of their lives and times. He shows that despite their exceptional intellects, they often had difficult obstacles to overcome, both in their own frailties and in their adversaries. The book's main theme is that although modern computers were born out of the need to do heavy number crunching during WWII, their foundation is in logic, the very logic by which our own brains work. It tells a compelling story of how the quest to understand the very foundations of mathematics led to the development of the machines that we have come to depend on so heavily in our daily lives. There are a few places where the reading becomes a bit difficult as the author outlines the work of Goedel and Turing in the early part of the 20th century. Nevertheless, this book is quite readable overall and very enjoyable (as soon as I finished reading it the first time, I immediately started reading it again). I recommend it to the general reader who would like to know more about the theoretical underpinnings of computers. My only comment is that all of the mathematicians covered were from Germany, England and the United States; I was left wondering whether contributors from other countries were overlooked.
Rating: Summary: magnificent; will become a classic Review: This is one of the best popular books on computer science or mathematics in years. Most authors in this area (e.g., Berlinski) have no special expertise in the subject matter or its history; that doesn't guarantee a bad book, but it makes it hard to write a good one. Davis is a refreshing exception:
* He is a brilliant researcher, who made fundamental contributions to areas such as computability (the Davis-Putnam-Robinson theorem, related to Hilbert's 10th problem) and algorithms (the Davis-Putnam algorithm for solving satisfiability problems).
* He is a master expositor (his 1958 book "Computability and Unsolvability" was one of the very first textbooks in its area, yet it is still widely read today despite the many other books written on this subject over the past 42 years).
* He has spent the last twenty years studying the history of logic and computation.
Davis's book is all one would hope for given his qualifications. It is insightful and engaging, and full of fascinating information that is hard to find elsewhere. I cannot imagine a better book on this subject.
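(The Davis-Putnam algorithm mentioned above is a branching-and-simplification procedure for propositional satisfiability. Below is a toy Python sketch of its later DPLL refinement; the clause encoding and the naive branching choice are simplifications of mine, not Davis's own formulation.)

    # Toy DPLL-style SAT solver: clauses are lists of nonzero ints,
    # where -n means "not variable n". Returns True iff satisfiable.
    def dpll(clauses):
        if not clauses:
            return True                              # no clauses left: satisfied
        if any(len(c) == 0 for c in clauses):
            return False                             # empty clause: contradiction
        unit = next((c[0] for c in clauses if len(c) == 1), None)
        literal = unit if unit is not None else clauses[0][0]

        def assign(lit):
            # Drop clauses satisfied by lit; delete the falsified literal -lit elsewhere.
            return [[x for x in c if x != -lit] for c in clauses if lit not in c]

        if unit is not None:
            return dpll(assign(literal))             # unit propagation: forced assignment
        return dpll(assign(literal)) or dpll(assign(-literal))  # branch on both values

    # Example: (x1 or x2) and (not x1 or x2) and (not x2 or x3)
    print(dpll([[1, 2], [-1, 2], [-2, 3]]))          # prints True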
Rating: Summary: What a compelling book! Review: This popular treatment of the development of computing turned out to be a book that I simply couldn't put down. Martin Davis interlaces the lives of the people who laid the groundwork for computing (and what interesting lives they led!) with a very understandable treatment of the technical side of the underpinnings of computing. I've heartily recommended this to my friends--technically minded and not--as a book I think they really want to read.