
Alfred Vaino Aho

United States – 2020
Short Annotated Bibliography
  • A.V. Aho & J.D. Ullman. The Theory of Parsing, Translation, and Compiling. Englewood Cliffs, NJ: Prentice Hall, 1973.

A two-volume summary of Aho and Ullman’s research on computer language theory and algorithms. The book helped establish a solid body of theory, grounded in the analysis of automata, for the creation of compilers and other language-processing tools.

  • A.V. Aho, J.E. Hopcroft & J.D. Ullman. The Design and Analysis of Computer Algorithms, Reading, MA: Addison-Wesley, 1974.

This book is considered a classic in the field and was one of the most cited books in computer science research for more than a decade. It became the standard textbook for algorithms courses throughout the world when computer science was still an emerging field. In addition to incorporating the authors’ own research contributions, The Design and Analysis of Computer Algorithms introduced the random access machine (RAM) as the basic model for analyzing the time and space complexity of computer algorithms using recurrence relations. The book also codified disparate individual algorithms into general design methods. The RAM model and the general algorithm design techniques introduced in this book now form an integral part of the standard computer science curriculum.
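The style of analysis described above can be sketched briefly. The following minimal Python example (illustrative, not taken from the book) solves the classic divide-and-conquer recurrence T(n) = 2T(n/2) + n, T(1) = 0, which models the cost of mergesort under a RAM-style cost measure; for n a power of two it equals n·log₂(n) exactly:

```python
from functools import lru_cache
from math import log2

# Illustrative sketch: evaluate the mergesort-style recurrence
# T(n) = 2*T(n/2) + n with T(1) = 0 on powers of two.
@lru_cache(maxsize=None)
def T(n):
    return 0 if n == 1 else 2 * T(n // 2) + n

# For powers of two, the closed form is T(n) = n * log2(n).
for n in (2, 8, 1024):
    assert T(n) == n * log2(n)

print(T(1024))  # 10240
```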

  • A.V. Aho & J.D. Ullman. Principles of Compiler Design, Reading, MA: Addison-Wesley, 1977.

Integrated formal language theory and syntax-directed translation techniques into the compiler design process. Often called the “Dragon Book” because of its cover design, it lucidly lays out the phases in translating a high-level programming language to machine code, modularizing the entire enterprise of compiler construction. It includes the authors’ own algorithmic contributions to efficient techniques for lexical analysis, syntax analysis, and code generation. The current edition of this book, Compilers: Principles, Techniques, and Tools (co-authored with Ravi Sethi and Monica Lam), was published in 2007 and remains the standard textbook on compiler design.
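As a flavor of the first phase the book describes, here is a minimal Python sketch of lexical analysis: a lexer assembled from regular expressions that turns source text into a stream of tokens. The token names and the toy input are illustrative assumptions, not material from the book.

```python
import re

# Illustrative token classes for a toy language; real compilers
# use a fuller specification (keywords, literals, comments, ...).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),          # whitespace: matched but discarded
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(src):
    """Yield (kind, lexeme) pairs for each token in src."""
    for m in MASTER.finditer(src):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(tokenize("x = y + 42")))
# [('IDENT', 'x'), ('OP', '='), ('IDENT', 'y'), ('OP', '+'), ('NUMBER', '42')]
```

The token stream produced here is what the next phase, syntax analysis, would consume to build a parse tree.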

  • A.V. Aho, J.E. Hopcroft & J.D. Ullman. Data Structures and Algorithms, Reading, MA: Addison-Wesley, 1983.

A very widely used and cited textbook intended for a first course on algorithms. It expanded and updated the introductory material from their earlier book on algorithms, with examples using the Pascal language and abstract data types.

  • A.V. Aho, B.W. Kernighan & P.J. Weinberger. The AWK Programming Language, Addison-Wesley, 1987.

Written by the three creators of AWK, this book described the programming language used in the Bell Labs tool. It came to serve as a de facto standard for the language, which has been reimplemented in many other programs. This was one of a series of Unix books written by Bell Labs researchers, which supported the widespread adoption of Unix, C, and the Unix tools during the 1980s.
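A taste of AWK’s pattern-action style, which the book teaches: each rule pairs a pattern with an action run on every matching input line, with fields split automatically into $1, $2, and so on. The sample data below is made up for illustration.

```shell
# Print the second field of every line whose first field exceeds 100.
printf '150 apples\n50 pears\n200 plums\n' | awk '$1 > 100 { print $2 }'
# apples
# plums
```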