Ullman’s first book, coauthored with John Hopcroft and based on a course developed by Hopcroft at Princeton in the mid-1960s. As a discipline, computer science was built upon a body of theory and techniques drawn from existing subfields within mathematics, engineering, and logic. Books like this were vital in integrating these formerly scattered areas of knowledge, extending them in ways relevant to the study of computation, and presenting them in a coherent way to the first generation of students educated as computer scientists.
This book is considered a classic in the field and was one of the most cited books in computer science research for more than a decade. It became the standard textbook for algorithms courses throughout the world when computer science was still an emerging field. In addition to incorporating the authors' own research contributions, The Design and Analysis of Computer Algorithms introduced the random access machine (RAM) as the basic model for analyzing the time and space complexity of computer algorithms using recurrence relations. The book also codified disparate individual algorithms into general design methods, which, together with the RAM model, are now integral parts of the standard computer science curriculum.
Integrated formal language theory and syntax-directed translation techniques into the compiler design process. Often called the “Dragon Book” because of its cover design, it lucidly lays out the phases in translating a high-level programming language to machine code, modularizing the entire enterprise of compiler construction. It includes the authors' algorithmic contributions to efficient techniques for lexical analysis, syntax analysis, and code generation. The current edition of this book, Compilers: Principles, Techniques, and Tools (co-authored with Ravi Sethi and Monica Lam), was published in 2007 and remains the standard textbook on compiler design.
This book, which has been revised several times, trained generations of computer scientists in data models, database design, and the use of database management systems. It revolutionized the content of database courses at all levels, moving database design from a purely engineering discipline to one with a firm theoretical foundation.
Compiles material based on the courses taught by Ullman and his colleagues at the Stanford InfoLab.
Explores the manipulation of web data and includes discussion of cloud-based techniques for parallel programming. Based on courses taught at Stanford. The second (2014) and third (2020) editions include Jure Leskovec as an author. Available for free download from a link at http://www.mmds.org/.