
A History of Modern Computing: Second Edition

List Price: $25.00
Your Price: $16.50
Reviews


Rating: 1 star
Summary: This book has an essential flaw
Review: Any history of modern computing that starts with ENIAC is flawed. The first working, fully programmable, general-purpose computer was Konrad Zuse's Z3 (Germany, 1941). ENIAC (inspired by Atanasoff's earlier, less general designs) was fully programmable too, but came much later (in 1946). Z3's switches were relays rather than the vacuum tubes used in ENIAC, but that is no fundamental difference: there are many ways of implementing a switch, and today we use transistors.

Rating: 0 stars
Summary: My attempt to make sense of the past 50 years of computing.
Review: Hello. I'm Paul Ceruzzi, the author of this book, and I am glad that you are interested in it. Let me tell you how I came to write this book. I am a curator at the Smithsonian in Washington, DC, and when I told people that I studied the history of computing, again and again I was asked if there was a single volume that could describe its history for the non-specialist. After repeatedly answering them, "Sorry, no," I decided to remedy that deficiency, and here is the result. I don't cover the years before 1945--other books do that pretty well--but I do cover the main avenues of computing up to about 1995. Of course, computing has come a long way since 1995--you're reading this on Amazon, aren't you?--but that will have to wait for a second edition. I hope you enjoy the story.

Rating: 4 stars
Summary: 4 STARS for Ceruzzi
Review: Here in America we have a word, "Eurocentric," for books centered on Europe, but no name for the corresponding habit in our own cultural life; "American-centric" is therefore my term of art for books that narrate culture and technology as if no interesting developments happened beyond our shores. The consequences of this ignorance, as we have seen, can be deadly, for one of the reasons for non-Western extremism is our instinct to treat non-Western participation in our culture and technology with disdain.

Thus, as Ceruzzi fails to narrate, Algol is really the only common ancestor of usable programming languages, yet Ceruzzi dismisses it because it was not a commercial success. Algol was not a commercial success because IBM failed to support it in the decade from 1954 to 1964 and then attempted to usurp it with the vaporware PL/I, for which IBM's programmers failed to produce an adequate compiler until the mid-1970s. Nonetheless, Algol's block structure, as opposed to Fortran's, was found to be the only rational way of thinking about program structure.
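[Editor's note: to make the point about block structure concrete, here is a minimal sketch in C, an illustration of my own rather than an example from the book or from Algol itself. C inherited Algol 60's nested blocks with locally scoped declarations, a way of organizing a program that early Fortran's flat routines did not offer.]

```c
/* A minimal sketch (not from the book or the review) of Algol-style
 * block structure as inherited by C: nested blocks carry their own
 * local declarations with lexical scope. */
#include <stdio.h>

int main(void)
{
    int total = 0;                 /* visible throughout main */

    for (int i = 1; i <= 3; i++) { /* i is local to the loop block */
        int square = i * i;        /* square exists only inside this block */
        total += square;
    }

    {   /* an anonymous inner block with its own temporary */
        int report = total * 10;
        printf("scaled total: %d\n", report);
    }
    /* 'square' and 'report' are out of scope here; only 'total' remains. */
    printf("total: %d\n", total);
    return 0;
}
```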

But Ceruzzi not only naturalizes American technical praxis along the dimensions of geography, he also naturalizes it along a temporal axis in which the mainframe era was a failed try at modern praxis.

Thus the "colorful" Herb Grosch does get his picture in Ceruzzi's book...and with his goatee poor Herb looks slightly fraudulent.

Grosch's law, which held that a computer's delivered power grows roughly as the square of its cost, so that the bigger the machine, the better the bargain, was obviously self-serving from the standpoint of Herb's employer, IBM. Herb left IBM in the late 1960s, and the history of how men like Herb were compromised (by the occlusion of their feelings and thoughts with corporate goals) remains unwritten.

Herb's law was falsified by the discovery in the late 1960s that large computers (such as MIT's Multics) required software so complex that their promise could not be delivered, and today's law is Moore's law, which declares that microchip power will instead increase exponentially as the transistors get smaller.
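[Editor's note: for readers who want the two "laws" stated concretely, here is a small illustrative sketch in C, not taken from the book. Grosch's law is usually given as delivered power growing roughly as the square of a machine's cost, and Moore's law as transistor counts per chip doubling about every two years; the scale constant and the sample years below are chosen only for illustration.]

```c
/* An illustrative sketch of the two "laws" as commonly stated.
 *   Grosch's law: perf = k * cost^2   (k is an arbitrary constant here)
 *   Moore's law:  n(t) = n0 * 2^((t - t0) / 2)
 * The starting point (the Intel 4004's roughly 2,300 transistors in 1971)
 * is well known; everything else is made up for the demonstration. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Grosch: doubling the budget quadruples the delivered power. */
    const double k = 1.0;                      /* arbitrary scale factor */
    for (double cost = 1.0; cost <= 4.0; cost *= 2.0)
        printf("Grosch: cost %.0f -> power %.0f\n", cost, k * cost * cost);

    /* Moore: exponential growth in transistors per chip since 1971. */
    const double n0 = 2300.0;                  /* Intel 4004, 1971 */
    for (int year = 1971; year <= 1991; year += 10) {
        double n = n0 * pow(2.0, (year - 1971) / 2.0);
        printf("Moore: %d -> ~%.0f transistors\n", year, n);
    }
    return 0;
}
```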

Common to both "laws" is the naturalizing error of neoclassical economics, which acts as if history does not exist. It does appear today that Moore's law still holds, as chip designs deliver, at an exponentially increasing rate, what is miscalled computer "power" (the "power" to deliver wrong answers at high speed should be deconstructed; it is really mere clock speed). But an historical perspective should remind us that this, too, shall pass.

Making smaller chips is a labor process, one that has damaged the water table of places like Silicon Valley and that represents the personal choices of venture capitalists to fund, entrepreneurs to entrep, and employees to work in moon suits that are damned itchy at the end of the day.

Moore's law, like so many "laws" of neoclassical economics, declares that in 1971 we stumbled upon a fact of nature, like Parson Malthus observing the lads cavorting with milkmaids. It is secretly normative (like so many laws of the dismal science) in that it commands us to conform to this fact of nature as a ticket to adulthood.

Perhaps "computers are takin' over." But a critical history of technology, which to me is the only study worthy of the name of history, would read against the grain. It would narrate world praxis in hardware and in software, as did a 1999 article in the IEEE Annals of the History of Computing which showed how the Swedes got by in the 1960s without IBM mainframes. It would narrate victim history, including the very interesting history of computer programmers who, it seems, have been an invisible class because they represent, all the way down, a counter-narrative to the dominant narrative of an autonomous technology to which we must conform (for example, the biography of computer pioneer Ted Nelson is more interesting than that of John von Neumann).

A very useful result of such a history would be applied retrocomputing, for while mainstream historians like Ceruzzi are laying the past to rest, libraries, universities and other institutions are losing data by losing the software that formats and reads older data files. The XML (Extensible Markup Language) notation tries to address this problem, as did Ted Nelson's Xanadu system, but technical innovations, useful as they are, by definition do not address existing Lotus 1-2-3 spreadsheets (or the moldering Algol compiler I discovered at Princeton).

I look to a book and software system on CD-ROM that would preserve, not the physical realization of outdated systems like the IBM 7090 or TRS-80, but their important feature: the "architecture" they presented to their actual programmers. While building a retro computer encyclopaedia would be a formidable task, it would be made easier by describing each machine's architectural interface in a form that a modern system can "compile" into a program that simulates the old computer, thereby presenting the user of the encyclopaedia with actual running examples of old software.
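[Editor's note: as a sketch of how such a "compile to a simulator" entry might work, here is a toy accumulator machine in C. The three-instruction architecture is invented purely for illustration; it is not the instruction set of the IBM 7090, the TRS-80, or anything described in Ceruzzi's book.]

```c
/* A toy sketch of the reviewer's idea: describe an old machine only by
 * the "architecture" its programmers saw, then run programs for it on a
 * modern host.  The hypothetical machine has one accumulator and four
 * opcodes. */
#include <stdio.h>

enum { LOAD, ADD, PRINT, HALT };   /* invented opcodes */

struct insn { int op; int operand; };

static void run(const struct insn *program)
{
    int acc = 0;                              /* single accumulator */
    for (const struct insn *p = program; ; p++) {
        switch (p->op) {
        case LOAD:  acc = p->operand;  break; /* acc <- operand      */
        case ADD:   acc += p->operand; break; /* acc <- acc + operand */
        case PRINT: printf("%d\n", acc); break;
        case HALT:  return;
        }
    }
}

int main(void)
{
    /* "Old software" expressed against the simulated architecture. */
    struct insn demo[] = {
        { LOAD, 40 }, { ADD, 2 }, { PRINT, 0 }, { HALT, 0 }
    };
    run(demo);    /* prints 42 */
    return 0;
}
```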

To modern-day crowds trooping through the Smithsonian, computers are physical objects. But actual programmers know that computers are ideas in the mind, and a retro encyclopaedia would be a fascinating narrative of how Turing's idea created the postmodern era. It would also make clear that the old fraud, Marx, was right, for the value computers have created for society consists in the deep labor of understanding architectures well enough to craft problem instructions, including the most despised yet most valuable instruction: "computer, here is a language in which I shall speak, and here is how you shall translate that language."

This is a grand yet critical narrative, for it shows that Leibniz was wrong: let us not calculate, sir; let us communicate. I probably expect too much of poor Mr Ceruzzi, who appears to be of the tribe of people with whom I became acquainted at Princeton: the humanists who honestly apply their narrative skills to technology. But it appears that in America, no one has answered Derrida's 1978 call for a critical reading of technology.

Rating: 4 stars
Summary: Insightful!
Review: Paul E. Ceruzzi, a curator at the Smithsonian's National Air and Space Museum, describes the development of computing, starting with its earliest history. He examines the beginnings of commercial computing from 1945 to 1956 and traces the history of computer hardware and software, dividing these developments into five- to ten-year periods. His book emphasizes technical development rather than personalities or business dynamics, a focus that contributes to its fairly dry, academic style. With this caveat, we [...] recommend the book primarily to those with a technological bent, such as professionals in operations and computer science, and academics in the field. However, if you are interested in the subject, you'll love this. Ceruzzi provides an informative and comprehensive saga, including extensive footnotes and a bibliography that runs about 80 pages.


Rating: 5 stars
Summary: Technical details
Review: This book is a history of computing technology since 1945. Ceruzzi focuses mostly on hardware, giving very detailed descriptions of how the great mainframes were built and by whom. He also discusses how transistors and microchips were developed and came into use in computers, although his descriptions of the development of computer languages, operating systems, or other software are much briefer than those he provides for hardware. The book has a number of illustrations of people and the machines they made famous. The author seems very careful to give an accurate account of events and the book is very well footnoted. It also includes an excellent index and bibliography. I would recommend this book to anyone interested in the technical aspects of the history of computers.

Rating: 4 stars
Summary: Good history, but buy Rheingold's book
Review: This book makes a nice, thorough reference on the history of computing after 1945 and is great for use in a course. However, it is a little dry and unanalytical. Buy Howard Rheingold's "Tools for Thought" if you want a generally more human, enjoyable read that provides almost as good a technical account.

Rating: 4 stars
Summary: an excellent book on the history of main-line hardware
Review: Though (as in everything) there are lacunae, Ceruzzi has put together a singularly good book that should be a must for anyone interested in the history of computing. While the author does not seem to be interested in software or operating systems, his recounting of the machine developments is excellent.


