Rating: Summary: Excellent Review: Advances in compiler design do not get much press these days. The reasons for this are unclear, but the perception that compilers need no further improvement no doubt has something to do with it. This book, written by one of the leading experts on compilers, certainly dispels that belief. Once readers get used to the idiosyncratic ICAN (Informal Compiler Algorithm Notation) invented by the author and used throughout the book, they get a comprehensive overview of compilers, especially of optimization. Compilers for the SPARC, PowerPC, DEC, and Pentium architectures are treated in the book. The predominant emphasis is on optimization, so a few more recent and important topics in compiler construction, such as partial evaluation, are not discussed. Readers are expected to have a prior background in elementary compiler theory. My primary interest in reading the book was to gain insight into the compilation issues that arise in symbolic programming languages such as LISP and Prolog. Space does not permit a detailed review, but some of the helpful aspects and interesting discussions in the book include:
1. The "wrap-up" section at the end of each chapter, which gives a compact summary of what was done in the chapter.
2. Generating loads and stores: the author shows how to move values to and from registers using routines more sophisticated than simply loading values into registers just before they are used or storing them as soon as they have been computed.
3. The main issues in the use of registers, such as variable allocation, the efficiency of procedure calls, and scoping. The author lists the categories of values that contend for registers, such as the stack, frame, and global offset table pointers, and dynamic and static links.
4. The local stack frame and its uses, such as holding indexed variables (arrays, etc.) and supporting debugging.
5. The five parameter-passing mechanisms: call by value, call by result, call by value-result, call by reference, and call by name. A thorough discussion is given of their properties and of which languages use them. In particular, the author notes that in C and C++ call by value is the only parameter-passing mechanism, but that the address of an object may be passed, essentially emulating call by reference (a short C sketch of this appears after the list). This can be a source of confusion for those who program in C and C++. The most exotic of these mechanisms is call by name, which resembles the "lazy evaluation" of functional programming languages. The author gives a code example of call-by-name parameter passing in ALGOL 60. I don't know of any modern practical programming language that uses call by name.
6. Shared libraries and the role of semantic linking and position-independent code.
7. The compilation issues that arise in symbolic languages such as LISP and Prolog. These languages typically have run-time type checking and function polymorphism, which give them their power and ease of use. The author discusses how to produce efficient code for them. Since heap storage is used heavily by these languages, its allocation and reclamation are very important. "Generation scavenging" is mentioned as the most efficient method of garbage collection for these languages; it has been advertised in the literature as minimizing the time needed for storage reclamation in comparison with other approaches. In addition, the use of "on-the-fly" recompilation for polymorphic-language implementations is discussed.
8. Dynamic programming and its role in the automatic production of code generators, as contrasted with the "greedy approach". The author explains the need for "uniform register machines" in the dynamic programming algorithm.
9. Interval analysis and its use in the analysis of control flow. This technique has been used in recent years in the field called "abstract interpretation", the aim of which is to test program code automatically and intelligently.
10. Dependencies between dynamically allocated objects, such as links between graph structures in LISP and Prolog. The author describes the Hummel-Hendren-Nicolau technique for analyzing these, which involves naming schemes for locations in heap memory, a collection of axioms for characterizing aliasing among locations, and, most interestingly, a theorem prover to establish the properties of the data structures. The author emphasizes, though, that this technique, like others developed for dependence analysis of dynamically allocated objects, is very computationally intensive.
11. Individual optimizations, which the author divides into four groups in order of importance.
12. Induction-variable optimizations and their role in loop optimization. The author shows how to identify induction variables and how to transform them using techniques that go by the names strength reduction, induction-variable removal, and linear-function test replacement (the second sketch after this list illustrates the effect).
13. Procedure integration and its role in "inlining" procedures in languages such as C++. The author emphasizes the drawbacks of inlining, such as its impact on cache misses.
14. The trade-off between object abstraction and optimization that occurs in object-oriented languages such as C++. The author discusses in detail the role of interprocedural optimizations in dealing with abstraction in the object-oriented, modular approach to programming, particularly the identification of "side effects" of procedure calls.
15. Code optimization that takes advantage of the memory hierarchy, such as data and instruction caches, and how to improve register allocation for arrays. The author gives a detailed and highly interesting discussion of scalar replacement of array elements.
16. Future trends and research in compiler design. The author mentions a few that he believes will dominate in the coming decade, such as scalar-oriented and data-cache optimizations; scalar compilation will be the most active research area in his opinion. There has also been discussion of "intelligent compilers" that will interact with the user to develop optimal code, or even produce correct programs. Such compilers would understand the intentions behind the program and warn the user if they are violated, as well as reduce the time and cost needed to test programs.
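To make item 5 concrete, here is a minimal C sketch (my own example, not code from the book; the function names are purely illustrative). C only ever copies argument values, and passing an object's address simply copies a pointer, which is how call by reference is emulated.

    #include <stdio.h>

    /* Call by value: the callee works on a copy, so the caller's
       variable is left unchanged. */
    void increment_copy(int x) {
        x += 1;
    }

    /* Emulated call by reference: the pointer itself is still passed
       by value, but it lets the callee modify the caller's object. */
    void increment_in_place(int *x) {
        *x += 1;
    }

    int main(void) {
        int n = 41;
        increment_copy(n);
        printf("%d\n", n);       /* prints 41: only the copy changed */
        increment_in_place(&n);
        printf("%d\n", n);       /* prints 42: the object itself changed */
        return 0;
    }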
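Item 12 can likewise be illustrated with a small before-and-after C sketch (again my own example, not the book's ICAN code). In the first loop, the address of a[i] is recomputed every iteration from the induction variable i, which hides a multiply by the element size; strength reduction replaces that computation with a pointer bumped by a constant stride, and linear-function test replacement recasts the loop test as a pointer comparison so the original index can be removed.

    /* Before: the address of a[i] is derived from i each iteration
       (base + i * sizeof(int)), driven by the induction variable i. */
    void scale_naive(int *a, int n, int k) {
        for (int i = 0; i < n; i++) {
            a[i] = a[i] * k;
        }
    }

    /* After: roughly what strength reduction, induction-variable removal,
       and linear-function test replacement would produce. The multiply is
       gone, a pointer advances by a fixed stride, and the loop test
       compares pointers instead of the original index. */
    void scale_reduced(int *a, int n, int k) {
        int *p = a;
        int *end = a + n;          /* loop bound recast as a pointer */
        while (p < end) {
            *p = *p * k;
            p++;                   /* strength-reduced address update */
        }
    }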
Rating: Summary: A must-have book for professional compiler design Review: Having read the dragon book, I was looking for a book that could give me more information on subjects like code generation and code optimization. This book is exactly that. Ninety percent of the book deals with the code generation and optimization techniques used by the commercial compilers available today. The algorithms are complete and can be implemented without difficulty. You won't find another book that covers code generation and optimization techniques in as much detail as this one. It is certainly not intended for a beginner; it is for professionals.
Rating: Summary: A great book on advanced compiler design Review: I have been working on language processors, interpreters and compilers for almost twenty years. I try to order all the books that have something unique to say about compiler design and implementation. This is one of the best books I have seen on advanced compiler design. I have owned it since it was first published. Going back and rereading it, I am reminded of what an excellent book it is, which is what motivated this review. Advanced compiler design deals with various forms of optimization, including local, global and loop optimization. This is a complex topic with thirty years of research behind it (it is interesting to note that the late Gary Kildall, of CP/M fame, did some early work on optimization in the early 1970s). No single book can provide complete coverage of all optimization issues. However, this book, along with Allen and Kennedy's equally excellent "Optimizing Compilers for Modern Architectures", covers almost everything you need to know. One of the problems with the academic literature on compiler optimization is that it can be unnecessarily obscure. Muchnick writes clearly, with the implementer in mind. He provides a wide range of techniques, allowing the implementer to choose the correct one for a given compiler. This approach is both useful and necessary: there is no single method for building a compiler, given the range of languages and design objectives. Muchnick covers everything you need to know about local and global scalar optimization, including scalar optimization in loops and optimization for modern processor architectures. The only thing missing is in-depth coverage of loop dependence and optimization techniques, which is provided by Allen and Kennedy. If you are working on the design, implementation or extension of a modern compiler, this book should be part of your library.
Rating: Summary: This is no dragon book Review: I was excited when I first scanned through a friend's copy of this book because of its treatment of modern compiler design issues. I ran out, ordered the book, read through it more carefully, and realized that it was nothing more than a collection of papers. There would be nothing wrong with this if it were advertised as such. The book lacks original analysis. I wouldn't steadfastly say DON'T buy this, but I would say you might be better off just going to your local college CS library and grabbing all the relevant papers referenced in the book.
Rating: Summary: Excellent coverage Review: I would suggest that the average reader first get "Programming Language Pragmatics" by Michael L. Scott before coming to this book. The reason is that "Programming Language Pragmatics" provides the ground coverage needed before moving on to this one. Happy reading!
Rating: Summary: Must-Have CS book Review: If you are a good CSE engineer and want to become even better, this book is the one that should be on your shelf. If you are a compiler engineer, it is a must-have. I agree that it is a collection of research papers, but it is the only comprehensive collection covering all aspects of compilation. I personally find it very helpful. The Dragon Book is cool, but it is only for beginners. If you are a beginner, always start with the Dragon Book, but don't dismiss this classic book.
Rating: Summary: A Very Good Starting Point Into Compiler Theory from 1997 Review: My employer owns the book. It is a constant on my shelf. The dragon book is great, but there are concepts I did not understand in the classic dragon book until the Muchnick book spelled them out for me. Also, at the end of the book, it covers real products and how they implement optimization techniques.
Rating: Summary: Good for seasoned compiler writers, bad for CS students Review: OK, let's be fair. This book provides broad coverage of useful optimizations, and it will be useful if you write compilers AND have some experience.
However, for learning the concepts, it is very bad material. In the end you wind up confused under a pile of thousands of lines of pseudocode in a weird notation (invented by the author) called "ICAN" (yes, you can write a very bad book, Mr. Muchnick) instead of reading useful explanations of the topics. The author also assumes that you already know some concepts, which is why he does not explain them as he should. If you really want to understand this book, first review Chapter 10 of the Dragon Book. I thought the Dragon Book was not so good because you have to reread some things in order to fully understand them, but with Muchnick's book that is not always possible.
You can also take a look at Morgan's book (unfortunately out of print) or just read the papers (as the first reviewer suggested). This book is not enough, and sadly, a lot of "teachers" treat it as a kind of "bible" and as a very bad excuse to teach very poorly. Some of them don't even master all the concepts presented there and have to use other books (their "dirty little secret"), but they don't tell you which ones and continue praising this bad piece of work. If you are a CS student who really wants to learn, be warned that this book is not for you (it has gone through at least three errata lists and still has errors!)
Rating: Summary: Great Back-end Book Review: The book does its job and does it well. But it's only fair to warn that it concentrates on code generation, optimization, instruction scheduling, etc. If you're looking for attribute grammars and the like, look somewhere else. When you're done looking for that stuff, get this book.