BIRTH:

August 4, 1932, Peru, New York

DEATH:

August 4, 2020

EDUCATION:

BA (mathematics, New York State College for Teachers, 1954); summer courses at Columbia University; MA (mathematics, University of Michigan, 1957); six honorary Doctor of Science degrees (University of Alberta, 1991; Pace University, 1999; University of Illinois Champaign-Urbana, 2004; University of Michigan, Ann Arbor, 2008; State University of New York at Albany, 2008; McGill University, 2009); honorary Doctor of Engineering degree (University of Notre Dame, 2008)

EXPERIENCE:

High school math teacher for two years; IBM (International Business Machines) Corporation (teaching FORTRAN to company scientists and engineers, 1957-1959; design and management of compiler for Stretch/Harvest computer, 1959-1963; design and development of experimental optimizing compiler for IBM ACS project, 1963-1968; compilers for Future Systems (FS) project, 1970-1973; visiting professor (partly on sabbatical from IBM) at New York University, 1970-1973; initiator and manager of Experimental Compiling Systems project, 1973-1980; development of product compilers, 1980-1985; initiator and manager of Parallel Programming Technology project, 1984-1994; IBM Fellow and President of the IBM Academy of Technology, 1995-1996; IBM Fellow and Senior Technical Advisor, SAS and BlueGene project, 1996-2002; IBM Fellow Emerita, from 2002)

HONORS AND AWARDS:

IBM Corporate Award for Algorithms for Optimizing Compilers (1968); Member of the US National Academy of Engineering (1987); IBM Fellow (1989); IEEE Fellow (1991); ACM Fellow (1994); Fellow, American Academy of Arts and Sciences (1994); IEEE Computer Society Charles Babbage Award (1997); Women in Technology International (WITI) Hall of Fame (1997); Fellow, Computer History Museum (2000); Member, American Philosophical Society (2001); Association for Women in Computing Ada Lovelace Award (2002); IEEE Computer Society Computer Pioneer Award (2004); Anita Borg Technical Leadership Award (2004); ACM Turing Award (2006); Member of the US National Academy of Sciences (2010)
 

Frances ("Fran") Elizabeth Allen

United States – 2006
CITATION

For pioneering contributions to the theory and practice of optimizing compiler techniques that laid the foundation for modern optimizing compilers and automatic parallel execution.

Frances Elizabeth ("Fran") Allen was born August 4, 1932. She was the oldest of six children and grew up on a farm in Peru, New York, near Lake Champlain. Her father was a farmer, and her mother an elementary-school teacher.

In high school she was greatly inspired by her math teacher, and set out to become a math teacher herself. She attended the New York State College for Teachers (now the State University of New York at Albany) for four years and earned a BA in mathematics with a minor in physics as well as education credits. She then taught math -- everything from elementary algebra to advanced trigonometry -- for two years at the same high school she had attended.

She realized that she would need a master's degree to be fully certified as a teacher. After taking some summer courses at Columbia University, she enrolled at the University of Michigan at Ann Arbor and earned an MA in mathematics. She also took courses in computing there—some of the first ever offered—and learned how to program an IBM 650 from Bernard Galler, who was a co-developer of the MAD programming language and later President of the ACM and an ACM Fellow. IBM held job interviews on the Michigan campus and offered her a job in research. She accepted with the idea that she would earn enough money to pay off her student debts and then return to teaching. Instead, she stayed at IBM for the next 45 years.

She joined IBM on July 15, 1957, exactly two months after the FORTRAN programming language had been released. Her first assignment was to teach research scientists within IBM how to use this language and indirectly encourage IBM customers to use it. She did what teachers often must: she learned the subject matter just a few days ahead of her students. As part of this process, she read the source code for the FORTRAN compiler that had been developed by John Backus (later a Turing Award winner) and his team. In her words, "It set my interest in compiling, and it also set the way I thought about compilers, because it was organized in a way that has a direct heritage to modern compilers."

Allen discusses her recruitment by IBM in 1957 and her first assignment there teaching FORTRAN.

She spent most of the rest of her career developing cutting-edge programming language compilers for IBM Research. At first she worked on the IBM 704’s Monitored Automatic Debugging operating system (developed by Roy Nutt, who had also implemented the FORMAT statement for the FORTRAN compiler), but her first major project was for the Stretch/Harvest computer.

Allen describes IBM’s STRETCH computer, by far the fastest in the world when the first one was delivered in 1961.

Stretch was one of the first supercomputers, and Harvest was a coprocessor for Stretch that had been designed for the US National Security Agency (NSA) to do codebreaking of secret messages. Allen and her team designed a single compiler framework to handle three very different programming languages: FORTRAN, Autocoder (a business language similar to COBOL), and the new language Alpha (designed for rapidly detecting patterns in arbitrary text represented in any alphabet). The three language compilers shared a common optimizing back end that could produce code for both the Stretch supercomputer and its Harvest coprocessor. This was an extraordinarily ambitious effort for the time, and they pulled it off. Allen served as the liaison between IBM and NSA, coordinating the design of the Alpha language and its acceptance tests. She spent a year in 1962 at NSA overseeing the installation and testing of the system, which was used for 14 years before being retired in 1976.

Allen discusses her work developing compilers for IBM’s STRETCH computer, its HARVEST application, and its performance challenges.
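
The structure described above -- several language front ends sharing one optimizing back end that can target more than one machine -- can be sketched in a few lines. The sketch below is purely illustrative (the function names and data shapes are invented here, not taken from the historical system), but it shows why a shared back end meant that every language, and both target machines, benefited from the same optimizer.

```python
# Purely illustrative sketch (names invented here, not from the historical
# system): several front ends produce one common intermediate form, a single
# shared optimizer improves it, and separate code generators emit code per target.

def fortran_front_end(source):   return ("ir", "FORTRAN", source)
def autocoder_front_end(source): return ("ir", "Autocoder", source)
def alpha_front_end(source):     return ("ir", "Alpha", source)

def optimize(ir):
    """The shared back-end optimizer: written once, used by every language."""
    return ("optimized",) + ir

def generate(ir, target):
    """One code generator per target machine ("stretch" or "harvest" here)."""
    return f"{target} code for {ir}"

# One framework, three languages, two targets:
for front_end in (fortran_front_end, autocoder_front_end, alpha_front_end):
    ir = optimize(front_end("...program text..."))
    for target in ("stretch", "harvest"):
        print(generate(ir, target))
```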

At that point she was offered the opportunity to coordinate programming language work for the IBM System/360, but that would have required a lot of traveling. After all the traveling she had done while working on Harvest, she decided to decline that offer and instead return to IBM Research at the new T. J. Watson Research building in Yorktown, New York. She joined Project Y, which allowed her to collaborate once again with John Cocke, another future Turing Award winner who had worked on the hardware for Stretch. Project Y, later renamed the Advanced Computing Systems (ACS) project, pursued another set of cutting-edge advances in computer system design. The hardware might be described in today's terms as the first "superscalar" processor, in which the central processing unit does not execute instructions one at a time as in previous designs but works on several instructions at once, perhaps even performing instructions "in the wrong order" to get the work done more quickly. The compiler techniques for this project were equally novel, including new "flow analysis" techniques that allowed the compiler to automatically optimize programs for greatly improved performance.

A key advance was to represent programs within the compiler not as a sequence of statements as in the original source code, but as a mathematical graph that could be analyzed to discover hidden properties of the code -- such as that a computed value could be re-used in another region of the code, or that it would definitely not be needed in yet another region of code. The techniques involved two clever tricks: labeling edges of the graph with mathematical sets, and then representing those sets with a very compact data structure that required only a single bit (0 or 1) of storage for each member of the set. That allowed the sets to be processed rapidly. In addition, a mathematical technique for decomposing the graphs into "intervals" allowed the sets attached to the edges to be processed in an order that usually resulted in the greatest efficiency. These techniques developed for the ACS compilers allowed both the compilers themselves and the programs they processed to execute much faster than in previous systems.
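
The bit-vector idea can be made concrete with a small example. The sketch below is a generic iterative "liveness" analysis over a three-block control-flow graph, with each set of variables stored as the bits of a single integer; it is meant only to illustrate how set operations become fast bitwise operations, and it uses simple iteration to a fixed point rather than the interval decomposition described above.

```python
# A minimal sketch of the bit-vector idea: each set of variables is a single
# Python integer used as a bit vector (bit i set = variable i is in the set),
# so set union and difference become fast bitwise operations. This uses a
# simple iterative algorithm, not the interval analysis described above.

# Variables a, b, c -> bits 0, 1, 2
A, B, C = 1 << 0, 1 << 1, 1 << 2

# Control-flow graph: block -> successor blocks
succ = {"entry": ["loop"], "loop": ["loop", "exit"], "exit": []}

# Per-block summaries: USE = variables read before any write in the block,
# DEF = variables written in the block.
use  = {"entry": 0,     "loop": A | B, "exit": C}
defs = {"entry": A | B, "loop": C,     "exit": 0}

# live_out[b] = union of live_in over successors of b
# live_in[b]  = use[b] | (live_out[b] & ~defs[b])
live_in  = {b: 0 for b in succ}
live_out = {b: 0 for b in succ}

changed = True
while changed:                      # iterate until a fixed point is reached
    changed = False
    for b in succ:
        out = 0
        for s in succ[b]:
            out |= live_in[s]       # set union is a bitwise OR
        new_in = use[b] | (out & ~defs[b])
        if out != live_out[b] or new_in != live_in[b]:
            live_out[b], live_in[b] = out, new_in
            changed = True

print({b: bin(live_in[b]) for b in succ})
```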

Allen was then assigned to work on IBM’s doomed "Future Systems" (FS) project, which aimed to revolutionize the way IBM built computer systems. She thought the machine architecture was technically flawed in ways that would limit performance. She wrote a letter to IBM management saying so, and recalls that it "was kind of put on the shelf for a while." After four years, the project was killed.

She took a sabbatical from IBM to teach graduate courses on compilers at the Courant Institute of Mathematical Sciences at New York University, at the invitation of Jacob "Jack" Schwartz, creator of the SETL programming language and the NYU Ultracomputer. Schwartz had previously visited IBM to collaborate with her and John Cocke on ACS. Later, she and Schwartz were married.

Her next major project for IBM was the Experimental Compiling Systems (ECS) project. This system, like the earlier compiler framework for Stretch/Harvest, was designed to support multiple programming languages, but its primary focus was PL/I, which presented much more difficult problems for an optimizing compiler. ECS featured aggressive interprocedural analysis, procedure inlining, an extensive collection of optimizing transformations, and a runtime environment that allowed the free mixing of interpreted code and optimized compiled code.
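
As one concrete illustration of the transformations mentioned above, the sketch below performs procedure inlining on a toy expression language. It is not drawn from the ECS design (the representation and names are invented here); it simply shows the basic move: replace a call with the callee's body, substituting the argument for the parameter, so that the body becomes visible to further optimization at the call site.

```python
# Illustrative inlining on a toy expression language (not the ECS design).
# Expressions: ("const", n), ("var", name), ("add", e1, e2), ("call", fname, arg)

functions = {
    # double(x) = x + x  (toy body; the parameter is always named "x")
    "double": ("add", ("var", "x"), ("var", "x")),
}

def substitute(expr, name, value):
    """Replace every occurrence of variable `name` in expr with `value`."""
    kind = expr[0]
    if kind == "var":
        return value if expr[1] == name else expr
    if kind == "add":
        return ("add", substitute(expr[1], name, value), substitute(expr[2], name, value))
    if kind == "call":
        return ("call", expr[1], substitute(expr[2], name, value))
    return expr  # const

def inline_calls(expr):
    """Recursively replace calls to known functions with their bodies."""
    kind = expr[0]
    if kind == "add":
        return ("add", inline_calls(expr[1]), inline_calls(expr[2]))
    if kind == "call":
        body = functions[expr[1]]
        return inline_calls(substitute(body, "x", inline_calls(expr[2])))
    return expr  # const or var

# double(3) + y  ==>  (3 + 3) + y, now exposed to constant folding
print(inline_calls(("add", ("call", "double", ("const", 3)), ("var", "y"))))
```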

Allen's last big project for IBM was the Parallel Translator (PTRAN), a system for compiling Fortran programs that were not specially written with parallelism in mind so that they could run on parallel computer architectures. For this effort she consulted with David Kuck (later an ACM Fellow) at the University of Illinois, who worked for many years on parallelizing compilers. She eventually hired some of Kuck's students, including Ron Cytron (later an ACM Fellow). She applied her extensive experience with interprocedural flow analysis to produce new algorithms for extracting parallelism from sequential code. PTRAN introduced the concept of the program dependence graph, a representation now used by many parallelizing compilers.

Allen discusses her work on parallel compilers, in collaboration with academic research teams.
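
The kind of dependence reasoning involved can be illustrated with a tiny example. The sketch below is not PTRAN's algorithm; it simply tests, for a loop whose iteration i writes a[i + w] and reads a[i + r], whether any iteration reads a value produced by a different iteration (a loop-carried dependence). If no such dependence exists, the iterations are independent and may safely run in parallel.

```python
# A minimal sketch of dependence testing for loop parallelization
# (illustrative only, not PTRAN's actual algorithms).

def loop_carried_dependence(write_offset: int, read_offset: int, trip_count: int) -> bool:
    """True if some iteration i reads the element written by a different iteration j.

    Models the loop:
        for i in range(trip_count):
            a[i + write_offset] = f(a[i + read_offset])
    """
    writes = {i + write_offset: i for i in range(trip_count)}  # index -> writing iteration
    for i in range(trip_count):
        j = writes.get(i + read_offset)
        if j is not None and j != i:     # read of a value produced by another iteration
            return True
    return False

# a[i] = f(a[i])    -> iterations are independent, safe to run in parallel
print(loop_carried_dependence(0, 0, 100))   # False
# a[i] = f(a[i-1])  -> each iteration needs the previous iteration's result
print(loop_carried_dependence(0, -1, 100))  # True
```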

Allen was named an IBM Fellow in 1989, an IEEE Fellow in 1991, and an ACM Fellow in 1994. She retired from IBM in 2002. As an IBM Fellow Emerita, she has continued to advise IBM on a number of projects including the Blue Gene supercomputer, and has worked to encourage the involvement of other women in computer-related fields.

Allen has traveled internationally not only to give lectures about compilers, but also for athletic pursuits: she is a veteran of many mountain-climbing expeditions in Austria, China, Tibet, and elsewhere.

Fran Allen's focus was not on inventing new programming languages or language features and then persuading people to program with them. Rather, she took programs as programmers like to write them and made them run efficiently through sophisticated analysis and optimization of the code. She produced not paper designs but a series of working systems that made real programs, not just artificial benchmarks, run faster. Today's programming language compilers still rely on techniques that she pioneered.

Author: Guy Steele