Birth: August 9, 1941 in Timmins, Ontario.

Education: B.A.Sc. in Engineering Physics (Toronto, 1963); Ph.D. in Electrical Engineering (Princeton, 1967). 

Experience: Bell Labs/Bellcore: Technical Staff (1967-1980); Head of the Computing Principles Research Department (1980-1987); Director of the Computing Science Research Center (1987-1991); General Manager of the Information Sciences and Technologies Research Laboratory; Associate Research Vice President, Communications Sciences Research (1997-2002). Columbia University: Lawrence Gussman Professor of Computer Science (1995-2018), Emeritus (2018-Present).

Honors and Awards (selected): Bell Labs Fellow (1984), Fellow of the American Association for the Advancement of Science (1986), IEEE Fellow (1988), ACM Fellow (1996), Member of the National Academy of Engineering (1999), IEEE John von Neumann Medal (2003), Member of the American Academy of Arts and Sciences (2013), Fellow of the Royal Society of Canada (2013), NEC C&C Prize (2017), ACM A.M. Turing Award (2020), Member of the National Academy of Sciences (2022). Recipient of honorary doctorates from the University of Helsinki (1986), the University of Waterloo (1992), and the University of Toronto (2015).

Alfred Vaino Aho

United States – 2020
CITATION

For fundamental algorithms and theory underlying programming language implementation and for synthesizing these results and those of others in their highly influential books, which educated generations of computer scientists.

Alfred V. Aho was born in the northern Ontario town of Timmins but grew up in Toronto. His father was an immigrant carpenter from Finland and his mother, a secretary, was from Arizona. Neither parent attended college but they valued reading and music. Aho began playing violin as a small child, performed with the school orchestra of the North Toronto Collegiate Institute, and has continued to play throughout his life.[1]

Beginning in 1959 Aho studied engineering physics at the University of Toronto, Canada’s leading university. This was a particularly demanding program, which combined engineering with rigorous mathematics and physics. His first programming experience was writing assembly language on the university’s IBM 7094 mainframe.

In 1963, Aho graduated from Toronto and enrolled for graduate study at Princeton University. It would be decades before Princeton had a computer science department, despite the presence of faculty such as logician Alonzo Church and game theorist Oskar Morgenstern, both of whom Aho took courses from. But several younger faculty members in the Electrical Engineering department, where Aho studied, were building up teaching and research agendas in computing. Among them was John Hopcroft, himself a future Turing Award winner, who had been asked to teach a course in the area.

Aho remembers that both he and Hopcroft were first exposed to formal language theory, which became a major research area for them, after fellow Ph.D. student Jeffrey Ullman spent a summer with Seymour Ginsburg’s research group at the System Development Corporation. Aho’s doctoral dissertation was a contribution to the emerging body of work that classified the grammars of formal language theory into a hierarchy defined by the classes of automata able to recognize them. His thesis introduced the new category of indexed grammars, a generalization of the existing notion of context-free grammars, and described a new class of automata he called nested-stack automata that are able to recognize them.[2]
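
To give the flavor of the idea, here is a standard textbook illustration (not an example drawn from the thesis itself). The language of strings a^n b^n c^n, which no context-free grammar can generate, has a short indexed grammar. Each nonterminal carries a stack of index symbols, and that stack is copied to every nonterminal on the right side of a production applied to it:

    S  → T        T  → Tf        T  → ABC
    Af → aA       Bf → bB        Cf → cC
    A  → ε        B  → ε         C  → ε

Applying T → Tf n times pushes n copies of the index f onto the stack; each copy is later popped to emit exactly one a, one b, and one c, keeping the three counts equal.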

Aho explains indexed grammars, which he invented for his Ph.D. thesis.

On graduation in 1967 Aho joined Bell Labs as a member of its computing research group. That was in some ways a natural continuation of his work at Princeton: the suburban Bell Labs campus was not far away, Ullman had recently been hired there, and Hopcroft sometimes spent his summers consulting with Bell Labs. The group he joined was headed by Doug McIlroy and is remembered primarily for its development during the 1970s of what became the Unix operating system and the C programming language.

Collaboration with Ullman

Aho is best known for the textbooks he wrote with Ullman, his co-awardee. The two were full-time colleagues for three years at Bell Labs, and after Ullman returned to Princeton as a faculty member he continued to work one day a week for Bell.

They retained an interest in the intersection of automata theory with formal languages. In an early paper, Aho and Ullman showed how Knuth's LR(k) parsing algorithm could be made to work with simple grammars that did not technically meet the requirements of an LR(k) grammar. This technique proved vital to the Unix software tools developed by Aho and his colleagues at Bell Labs. It was just one of many contributions Aho and Ullman made to formal language theory and to the design of efficient algorithms for lexical analysis, syntax analysis, code generation, and code optimization. They also developed efficient data-flow analysis algorithms that exploited the structure of “gotoless” programs, which were at the time just becoming the norm.

Their early joint work in algorithm design and analysis techniques contributed crucial approaches to the theoretical core of computer science that emerged in this period. In its focus on basic methods for processing graphs, strings, and sequences, this algorithmic work was tightly integrated with their research on programming languages. Their foundational book with John Hopcroft, The Design and Analysis of Computer Algorithms, published in 1974, created the conceptual framework not just for teaching algorithms in the standard computer science curriculum for decades, but also for presenting and analyzing new algorithms developed by the research community. In addition to incorporating their own results, the book codified a range of disparate algorithms into a set of general design methods, including divide-and-conquer, recursion, dynamic programming, and others that have long since entered the standard toolbox of computer scientists.
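
Two of those methods are easy to suggest in miniature. The sketch below (a modern Python illustration of techniques the book names, not code from the book) shows divide-and-conquer in its classic form, merge sort, together with the memoization of subproblems at the heart of dynamic programming:

    def merge_sort(xs):
        # Divide-and-conquer: split the list, sort each half recursively,
        # then merge the two sorted halves.
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    def fib(n, memo={0: 0, 1: 1}):
        # Dynamic programming: record each subproblem's answer so it is
        # computed only once, turning exponential recursion into linear time.
        if n not in memo:
            memo[n] = fib(n - 1) + fib(n - 2)
        return memo[n]

    print(merge_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
    print(fib(40))                      # 102334155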

Aho explains the difference between an algorithm and a procedure.

Aho and Ullman’s other crucial contribution of the 1970s came with their books on the theory of programming languages and compilers. They collected their research on computer languages into a two-volume series, The Theory of Parsing, Translation, and Compiling, published in 1972 and 1973. Aho says that in this book they “attempted to distill the essence of automata and language theory with applications to the process of compilation. And, I think this was one of the interesting demonstrations of putting a theoretical foundation onto an important practical area of computer science.” This provided “a theoretical foundation that could be used to design algorithms for the translation process. And, more importantly, people could then construct tools to help build components of compilers.”[3]

This laid the groundwork for their definitive textbook on compiler technology, Principles of Compiler Design, published in 1977. This "Dragon Book" (nicknamed after its colorful cover) and its subsequent editions with Ravi Sethi, and later with Sethi and Monica Lam, became the bibles of compiler design. These books lucidly lay out the phases in translating a high-level programming language to machine code, modularizing the entire enterprise of compiler construction.
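
Those phases can be suggested in miniature. The toy translator below (an invented Python illustration in the spirit of the books, not an example from them) runs a lexical analyzer, a parser, and a code generator in sequence, turning an arithmetic expression into instructions for a hypothetical stack machine:

    import re

    def lex(src):
        # Lexical analysis: characters -> tokens.
        return re.findall(r'\d+|[+*()]', src)

    def parse(tokens):
        # Syntax analysis: tokens -> abstract syntax tree.
        def expr(i):
            node, i = term(i)
            while i < len(tokens) and tokens[i] == '+':
                rhs, i = term(i + 1)
                node = ('+', node, rhs)
            return node, i
        def term(i):
            node, i = factor(i)
            while i < len(tokens) and tokens[i] == '*':
                rhs, i = factor(i + 1)
                node = ('*', node, rhs)
            return node, i
        def factor(i):
            if tokens[i] == '(':
                node, i = expr(i + 1)
                return node, i + 1      # skip the closing ')'
            return ('num', int(tokens[i])), i + 1
        return expr(0)[0]

    def codegen(ast):
        # Code generation: AST -> instructions for a toy stack machine.
        if ast[0] == 'num':
            return ['PUSH %d' % ast[1]]
        op = 'ADD' if ast[0] == '+' else 'MUL'
        return codegen(ast[1]) + codegen(ast[2]) + [op]

    print(codegen(parse(lex('2*(3+4)'))))
    # ['PUSH 2', 'PUSH 3', 'PUSH 4', 'ADD', 'MUL']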

Aho on how writing the Dragon books eventually impressed his kids.

The pair also helped to formalize the structuring of relational databases. Relational database management systems, based on a model for which Edgar Codd won a Turing Award, are structured around the idea that each piece of information should be stored only once. Database queries specify how records stored in different tables (or “relations” in Codd’s formal model) should be recombined. Aho’s paper with Ullman and Catriel Beeri on “lossless joins” brought rigor to the study of “normal forms” for the storage of data without redundancy, and gave a methodology for determining when it was possible to break a relation into smaller components without losing information.[4]
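
The property is easy to demonstrate in miniature. In the Python sketch below (with invented data; the attribute names are only illustrative), a relation is split into two projections that share a department attribute. Because each department here has a single location, joining the projections back together recovers exactly the original relation, with no spurious tuples:

    # Original relation: (employee, department, location).
    emp = {('Aho', 'Research', 'Murray Hill'),
           ('Ullman', 'Research', 'Murray Hill')}

    # Decompose into two projections sharing the department attribute.
    emp_dept = {(name, dept) for (name, dept, _) in emp}
    dept_loc = {(dept, loc) for (_, dept, loc) in emp}

    # Natural join on department recombines the projections.
    rejoined = {(name, dept, loc)
                for (name, dept) in emp_dept
                for (dept2, loc) in dept_loc if dept == dept2}

    print(rejoined == emp)  # True: the decomposition was lossless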

Contributions to Unix

Thanks to his position at Bell Labs, Aho was able to put into practice many of the ideas about compiler construction that he and Ullman had been writing about. The Unix operating system began as a grassroots project of two Bell Labs programmers, Dennis Ritchie and Ken Thompson, to replicate the interactive editing and software development capabilities they lost when the lab withdrew from the troubled project to create the Multics operating system. The modest official support allocated to their new initiative was justified by the promise that it would help the lab’s publication and patent staff produce documents more effectively, rather than by any grand commitment to deliver a commercial product. This left researchers free to shape the system’s development to serve their own needs and to test new approaches to software development. Thompson and Ritchie later shared their own Turing Award for their work on Unix, which is the basis for most of today’s widely used operating systems.

One of the key characteristics of Unix was what is often called the “software tools” philosophy. It stemmed naturally from the circumstances of the Unix project: the operating system itself was a minimal, efficient, and portable core developed by a small team. Rather than charter huge groups to build complex programs to supplement this core, the Unix group let individuals and small groups develop reusable tools that each did one thing well and efficiently. To get a particular task done, users would chain the inputs and outputs of these programs together using a unique mechanism known as the “pipe.” The appeal of Unix, which rapidly became the dominant platform for academic and scientific computing during the late 1970s and 1980s, came as much from the collective power of these tools as from the robustness of the operating system core.
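
The mechanism is easy to imitate from a modern language. The Python sketch below (a minimal illustration, assuming a Unix system with ls and wc on the path) wires the output of one process into the input of another, exactly as the shell does for the pipeline ls | wc -l:

    import subprocess

    # Start ls with its standard output connected to a pipe.
    ls = subprocess.Popen(['ls'], stdout=subprocess.PIPE)
    # Start wc -l reading its standard input from that pipe.
    wc = subprocess.Popen(['wc', '-l'], stdin=ls.stdout, stdout=subprocess.PIPE)
    ls.stdout.close()  # so ls sees a closed pipe if wc exits early
    print(wc.communicate()[0].decode().strip())  # number of directory entries

Neither program knows or cares what is on the other end of the pipe, which is what lets small single-purpose tools be composed into larger ones.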

Aho’s deep knowledge of formal language processing positioned him to contribute to several of the most important of these tools. He developed algorithms to efficiently locate text strings matching simple rules expressed as regular expressions. These algorithms were incorporated into the Unix tool egrep, and later into the lexical analyzer generator lex, part of a set of tools that made the development of compilers much faster and easier. Another tool, fgrep, searched its input for keywords at great speed by generating special-purpose automata able to hunt for multiple keywords in a single pass through the input. It relied on what became known as the Aho-Corasick algorithm, which Aho developed with Margaret J. Corasick for a bibliographic search project.[5]
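
The heart of that algorithm is compact enough to reconstruct in a few dozen lines. The Python sketch below (a teaching reconstruction, not the Bell Labs fgrep source) builds a trie of the keywords, augments it with failure links, and then reports every occurrence of every keyword in one left-to-right pass over the text:

    from collections import deque

    def build(keywords):
        # Trie of keywords: goto[s] maps a character to the next state.
        goto, fail, out = [{}], [0], [set()]
        for word in keywords:
            s = 0
            for ch in word:
                if ch not in goto[s]:
                    goto.append({}); fail.append(0); out.append(set())
                    goto[s][ch] = len(goto) - 1
                s = goto[s][ch]
            out[s].add(word)
        # Breadth-first pass: fail[s] points to the state for the longest
        # proper suffix of s's string that is also a prefix of a keyword.
        queue = deque(goto[0].values())
        while queue:
            s = queue.popleft()
            for ch, nxt in goto[s].items():
                queue.append(nxt)
                f = fail[s]
                while f and ch not in goto[f]:
                    f = fail[f]
                fail[nxt] = goto[f].get(ch, 0)
                out[nxt] |= out[fail[nxt]]   # inherit matches ending here
        return goto, fail, out

    def search(text, keywords):
        goto, fail, out = build(keywords)
        s = 0
        for i, ch in enumerate(text):
            while s and ch not in goto[s]:
                s = fail[s]
            s = goto[s].get(ch, 0)
            for word in out[s]:
                yield (i - len(word) + 1, word)

    print(sorted(search('ushers', ['he', 'she', 'his', 'hers'])))
    # [(1, 'she'), (2, 'he'), (2, 'hers')]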

Yacc, for Yet Another Compiler-Compiler, was programmed by Steve Johnson to implement the LR(k) parsing techniques studied by Aho and Ullman. It generates parser code automatically from a description of the grammar of the language to be recognized.
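
The same division of labor survives in later tools. The sketch below uses the third-party Python package ply (“Python Lex-Yacc”), whose interface deliberately mirrors lex and yacc, to build an LALR parser for a toy expression grammar from a declarative description (an illustration of the yacc idea, not Johnson’s program; it assumes ply is installed):

    import ply.lex as lex
    import ply.yacc as yacc

    tokens = ('NUMBER', 'PLUS', 'TIMES')

    # Lexer specification: regular expressions, as in lex.
    t_PLUS = r'\+'
    t_TIMES = r'\*'
    t_ignore = ' \t'

    def t_NUMBER(t):
        r'\d+'
        t.value = int(t.value)
        return t

    def t_error(t):
        t.lexer.skip(1)

    # Parser specification: grammar rules in docstrings, as in yacc.
    def p_expr_plus(p):
        'expr : expr PLUS term'
        p[0] = p[1] + p[3]

    def p_expr_term(p):
        'expr : term'
        p[0] = p[1]

    def p_term_times(p):
        'term : term TIMES factor'
        p[0] = p[1] * p[3]

    def p_term_factor(p):
        'term : factor'
        p[0] = p[1]

    def p_factor_number(p):
        'factor : NUMBER'
        p[0] = p[1]

    def p_error(p):
        pass

    lexer = lex.lex()             # generate the scanner
    parser = yacc.yacc()          # generate the LALR parsing tables
    print(parser.parse('2+3*4'))  # 14: precedence falls out of the grammar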

Aho tells how lex and yacc were created and why they matter.

Aho recalled its creation as a response to the frustrating experience of using manual techniques to develop a parser for the C language, which was another product of the Unix project: “I had a… big piece of cardboard on which I carried out the sets of items construction, usually while I was watching television, because it is such a trivial and mind-numbing task…. I gave this piece of cardboard to Steve [Johnson], and Steve would then encode it into his computer program. After a while he became so frustrated with me, that I couldn’t get it a hundred percent right, he wrote a program that automated this parser construction technique. And that’s how the tool Yacc was born.”[6]

The Unix AWK tool, named for its creators Aho, (Peter) Weinberger, and (Brian) Kernighan, is a specialized programming language built around Aho’s algorithm for processing regular expressions. An AWK program is defined as a set of pattern-action rules. The program reads sequentially through its input, carrying out the action defined in a rule whenever it encounters a block of text that matches the corresponding pattern.
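
The execution model takes only a few lines to convey. The Python sketch below (a toy imitation of the pattern-action idea, not the real awk implementation) scans its input line by line and runs the action of every rule whose pattern matches, just as an AWK program does:

    import re

    # Pattern-action pairs, like the body of an AWK program.
    rules = [
        (re.compile(r'error'), lambda line: print('ALERT:', line)),
        (re.compile(r'\d+'),   lambda line: print('has a number:', line)),
    ]

    def run(lines):
        for line in lines:                 # read the input sequentially...
            for pattern, action in rules:  # ...testing every rule, as awk does
                if pattern.search(line):
                    action(line)

    run(['error 42 on line 7', 'all quiet'])
    # ALERT: error 42 on line 7
    # has a number: error 42 on line 7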

Aho tells how he became the A in AWK.

Aho’s motivation was to produce a language that, together with the other Unix tools, would allow the rapid creation of very short programs to process data held in text files. Thanks to the flexibility of Unix and its pipe mechanism, AWK found many more uses in processing and modifying text streams. AWK and its derivatives are still used today, though in recent decades Perl has been an increasingly popular alternative.

Columbia University

In 1995 Aho became a professor of computer science at Columbia, beginning with a two-year stint as department chair. Recalling that decision, Aho said, “I was managing an organization of about 200 people at Bellcore. How hard could it be to manage 25 people in academia? And the answer is much harder.”[7] He stayed at Columbia for the rest of his career, retiring in 2018, though he retained ties with Bell Labs, including five years as a senior manager of research for which he took a leave of absence from Columbia.

At Columbia, Aho continued to work on languages and compilers, establishing a research group in the area and teaching a popular course in which students were required to design, implement, and document their own programming language. He also worked on quantum computing. As well as updating their classic compiler textbook with new coauthors, he and Ullman wrote a new textbook, Foundations of Computer Science (1992), which gave an introduction to computer science grounded in theory and mathematics.

His advice to a young person is “you can never learn too much mathematics… Learn the fundamentals of whatever field you're in because the fundamentals don’t go out of style nearly as fast... And when you graduate, try to get a job working with the best people in the field because you learn so much from these people, and then you'll be well-positioned for your next job.”[8]

Author – Thomas Haigh with ACM Staff



[1] The biographical details in this paragraph, and in several other parts of the profile, are drawn from Hansen Hsu’s oral history interview with Aho, conducted on June 13, 2022 for the Computer History Museum in collaboration with ACM.

[2] Alfred V. Aho, "Indexed grammars—an extension of context-free grammars," Journal of the ACM 15(4): 647-671 (1968).

[3] Oral history interview by Michael S. Mahoney, circa 1989. https://www.princeton.edu/~hos/mike/transcripts/aho.htm

[4] Alfred V. Aho, Catriel Beeri, and Jeffrey D. Ullman, "The Theory of Joins in Relational Databases," ACM Transactions on Database Systems 4(3): 297-314 (1979).

[5] A.V. Aho and M.J. Corasick, "Efficient string matching: An aid to bibliographic search," Communications of the ACM 18(6): 333-340 (June 1975).

[6] Oral history interview by Michael S. Mahoney, circa 1989. https://www.princeton.edu/~hos/mike/transcripts/aho.htm

[7] Oral history interview by Hansen Hsu, June 13, 2022.

[8] Oral history interview by Hansen Hsu, June 13, 2022.