BIRTH:

July 5, 1936 in Caldwell, New Jersey

EDUCATION:

B.A. in Mathematics (Carleton College, 1958); Ph.D. in Mathematics (Princeton University, 1961).

EXPERIENCE:

General Electric Research Laboratory (1961-1978); State University of New York at Albany (1978-2000; Chair, Computer Science Department, 1982-1989); Editor, SIAM Journal on Computing (1972-1988); Member-at-Large, SIGACT Executive Committee (1973-1975); Visiting Professor, Hebrew University (1975); Adjunct Professor, Rensselaer Polytechnic Institute (1977-1978); Visitor, Mathematical Sciences Research Institute (1985).

HONORS AND AWARDS:

Lanchester Prize in Operations Research, co-winner (1977); ACM Alan M. Turing Award (1993); ACM Fellow (1994); Distinguished Professor, State University of New York (1994). 

 

Richard ("Dick") Edwin Stearns DL Author Profile link

United States – 1993
CITATION

With Juris Hartmanis, in recognition of their seminal paper which established the foundations for the field of computational complexity theory.

 

Richard (“Dick”) Edwin Stearns, born on July 5, 1936, in Caldwell, New Jersey, was the son of Dr. Edwin I. Stearns and Winifred T. Scales. He married Charlotte A. Reed in 1963. They have two grown children, Chris R. Stearns and Dr. Elizabeth R. Gumustop.

Stearns received a B.A. in Mathematics from Carleton College in 1958. In 1961, he received a Ph.D. in Mathematics from Princeton University. He wrote his thesis, Three Person Cooperative Games without Side Payments, under the supervision of Harold W. Kuhn, with mentoring from Robert J. Aumann.

In the summer of 1960 Stearns worked at the General Electric Research Laboratory in Schenectady, N.Y., where he and Juris Hartmanis started joint work on the state assignment problem. After receiving his Ph.D. in 1961, Stearns joined Hartmanis as a permanent employee of the GE Research Laboratory’s Information Studies Branch, led by Richard L. Shuey. The atmosphere at the Research Lab encouraged the free and unbounded style of research they both enjoyed.

Stearns describes his first exposure to computers and to computer science.

Stearns and Hartmanis initially worked on the decomposition of sequential machines: how models of simple computers can be decomposed into a set of smaller sequential machines that accomplish the same tasks. They published several papers on the subject, and in 1966 summarized their work in a book [1]. Later they worked on computational complexity, and in 1965 published the paper for which they received the Turing Award [4].

The seminal paper “On the Computational Complexity of Algorithms” [4] mentioned in the citation for the Turing Award was written with Juris Hartmanis in 1965. It provides a precise definition of the complexity of an algorithm, defines the concept of a complexity class, and shows that there is an infinite sequence of distinct complexity classes and therefore an infinite sequence of increasingly hard problems.

Stearns describes working with Hartmanis at GE to write “On the Computational Complexity of Algorithms”.

An algorithm is a step-by-step procedure that takes inputs and produces corresponding outputs. For example, a sorting algorithm will take a list of numbers as input and output the list in numerical order. Such an algorithm is said to solve the sorting problem. Another algorithm might take a number as an input and output YES or NO depending on whether or not the input number is a prime number. Such an algorithm is said to solve the problem of primality testing. The complexity of an algorithm is based on the number of steps required, as a function of the input size, for the algorithm to solve a problem. Such a function is called a time bound. For mathematical purposes, algorithms are imagined to be executed on a Turing machine, a simple model of a computer. The complexity of a problem is the complexity of the fastest algorithm for solving that problem.
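
To make the idea of a time bound concrete, here is a minimal illustrative sketch in Python (not taken from the paper): a simple sorting algorithm instrumented to count its comparison steps, so the count can be read off as a function of the input size n. The function name is invented for this example.

    # Illustrative sketch: counting the comparison "steps" of a simple
    # sorting algorithm as a function of the input size n. The count
    # plays the role of the time bound discussed above.
    def selection_sort_with_step_count(items):
        """Sort a list and return (sorted_list, number_of_comparisons)."""
        data = list(items)
        comparisons = 0
        for i in range(len(data)):
            smallest = i
            for j in range(i + 1, len(data)):
                comparisons += 1              # one elementary step
                if data[j] < data[smallest]:
                    smallest = j
            data[i], data[smallest] = data[smallest], data[i]
        return data, comparisons

    for n in (10, 100, 1000):
        _, steps = selection_sort_with_step_count(range(n, 0, -1))
        print(f"n = {n:5d}: comparison steps = {steps}")   # about n*(n-1)/2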

Hartmanis and Stearns defined a complexity class as the set of all problems solvable within a specified time bound. Their paper [4] shows that there is an infinite hierarchy of complexity classes (for example, problems for which the fastest algorithm takes a time proportional to n, n log n, n², n³, 2ⁿ, and so on) where a small increase in the time bound enables more problems to be solved.
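
A rough numerical illustration (not from the paper) of why each step up this hierarchy matters: assuming, hypothetically, a machine that executes 10⁹ elementary steps per second, the sketch below prints how long each of the time bounds above takes as n grows.

    # Rough numerical illustration of how the time bounds above separate,
    # assuming (hypothetically) 10**9 elementary steps per second.
    import math

    STEPS_PER_SECOND = 10**9

    for n in (10, 20, 40, 60):
        bounds = {
            "n": n,
            "n log n": n * math.log2(n),
            "n^2": n**2,
            "n^3": n**3,
            "2^n": 2**n,
        }
        row = ", ".join(f"{name}: {steps / STEPS_PER_SECOND:.2g}s"
                        for name, steps in bounds.items())
        print(f"n = {n:2d} -> {row}")   # at n = 60, 2^n already takes decades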

After Hartmanis left GE to become Chair of the Computer Science Department at Cornell University, Stearns worked with Daniel J. Rosenkrantz and Philip M. Lewis. One important paper [5] (with Philip M. Lewis) showed that a similar hierarchy exists when the complexity is defined in terms of the amount of memory space required (as a function of input size) to solve the problem on a Turing machine. Changing the simple model of a Turing Machine by separating the “input tape” from the “work tape” allowed sub-linear space-bounded complexity classes to be defined.
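
A small sketch (invented for this article) of the idea behind sub-linear space bounds: the input is treated as read-only, like the separate input tape, and only the working storage is charged against the space bound. Deciding whether a bit string contains as many 0s as 1s needs only two counters, roughly log₂ n bits of work space for an input of length n.

    # Illustrative sketch of a sub-linear space bound. The input string is
    # only read (the "input tape"); the work space charged to the algorithm
    # is just two counters, about log2(n) bits for an input of length n.
    def equal_zeros_and_ones(bits):
        zeros = ones = 0                # the entire "work tape"
        for b in bits:                  # one left-to-right scan of the input
            if b == "0":
                zeros += 1
            else:
                ones += 1
        return zeros == ones

    print(equal_zeros_and_ones("0011010110"))   # True
    print(equal_zeros_and_ones("000111000"))    # False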

Stearns describes working with Lewis to investigate the space complexity of algorithms.

These papers led to the establishment of computational complexity theory as a fundamental part of the computer science discipline. Many computer scientists have written papers extending and enhancing the theory. Although the reasoning and the results of many of these papers are highly theoretical, they have practical implications for real-world problems. Many computer scientists and software engineers have used these results in the design and implementation of practical systems.

One of the implications of the hierarchy of complexity classes is that there are many real-life problems whose complexity is so high that it is not practical to solve them exactly. The work on approximating computationally complex problems [9] showed that some problems that would take an impractically long time to solve exactly can be approximately solved by algorithms that take much less time yet yield a solution that is provably close to the optimal one.
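
One classic heuristic of the kind analyzed in this line of work is the nearest-neighbor rule for the traveling salesman problem. The sketch below is only an illustration with invented helper names, not code from the papers.

    # Illustrative sketch: the nearest-neighbor heuristic for the traveling
    # salesman problem. Computing an exactly optimal tour is impractical for
    # large inputs, but this rule runs in roughly n^2 steps and, when the
    # distances satisfy the triangle inequality, yields a tour whose length
    # can be bounded relative to the optimum.
    import math

    def nearest_neighbor_tour(points):
        """Greedily visit the closest unvisited point, starting at points[0]."""
        unvisited = set(range(1, len(points)))
        tour = [0]
        while unvisited:
            last = points[tour[-1]]
            nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
            unvisited.remove(nxt)
            tour.append(nxt)
        return tour

    cities = [(0, 0), (1, 5), (4, 4), (6, 1), (3, 0)]
    print(nearest_neighbor_tour(cities))   # [0, 4, 3, 2, 1]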

Rosenkrantz, Lewis, and Stearns published a number of papers and, in 1976, a book [2] on compiler design theory. A compiler is the program that translates programs written in a high-level programming language into the machine code of a particular computer. One compiler design problem is parsing a program in the programming language. For natural languages, parsing a sentence involves finding the subject, the verb, the object, and so on. Programming languages are usually specified by context-free grammars, and programs written in such languages must be parsed by the compiler. The computational complexity of parsing a “sentence” (program) of length n written using a general context-free grammar is n³, which is too high to be practical in a compiler. The authors defined a special class of context-free grammars, LL(k) grammars, that can be parsed in linear time, where the complexity is only n. The method they described for parsing LL(k) grammars is called top-down parsing and is now an important part of compiler design. They also showed that much of the desired translation from a programming language into machine code can be done at the same time the parsing is being done.
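
As a small illustration of top-down parsing (not taken from the book), here is a recursive-descent recognizer for a toy LL(1) grammar of arithmetic expressions; each token is examined once, so it runs in time proportional to the length of the input.

    # Illustrative sketch of top-down (recursive-descent) parsing for a tiny
    # LL(1) expression grammar:
    #   Expr   -> Term ('+' Term)*
    #   Term   -> Factor ('*' Factor)*
    #   Factor -> NUMBER | '(' Expr ')'
    def parse_expression(tokens):
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def expect(tok):
            nonlocal pos
            if peek() != tok:
                raise SyntaxError(f"expected {tok!r} at position {pos}")
            pos += 1

        def factor():
            nonlocal pos
            if peek() == "(":
                expect("(")
                expr()
                expect(")")
            elif peek() is not None and peek().isdigit():
                pos += 1                      # consume a NUMBER token
            else:
                raise SyntaxError(f"unexpected token {peek()!r}")

        def term():
            factor()
            while peek() == "*":
                expect("*")
                factor()

        def expr():
            term()
            while peek() == "+":
                expect("+")
                term()

        expr()
        if pos != len(tokens):
            raise SyntaxError("trailing input")
        return True

    print(parse_expression(["2", "*", "(", "3", "+", "4", ")"]))   # True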

After Stearns left GE in 1978 to go to the State University of New York (SUNY) at Albany (Rosenkrantz had moved to Albany in 1977), he continued working with Lewis. They did research on concurrent database systems, in which a number of transactions simultaneously read and write items in the same database. One goal of such systems is for the transactions to run as concurrently as possible, to increase the throughput of the system, but without destroying the correctness of the database. Their paper on this subject [10] shows that a necessary and sufficient condition for consistency and correctness in concurrent execution is serializability of the transactions; that is, the overall effect of the read and write requests of all the concurrently running transactions must be the same as if the transactions had been run serially (one after the other) in some order. Previously it had been known that serializability is a sufficient condition for consistency: if the execution is serializable, it is correct. This paper showed that, with the exception of certain read-only transactions, it is also a necessary condition: the execution must be serializable to be correct. This result is now a key part of all database courses and of the design of database systems.
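
A minimal sketch (invented for this article) of how serializability is commonly checked in practice, using the standard conflict-graph test: build a precedence graph over the transactions and look for a cycle; an acyclic graph means the interleaved execution is equivalent to some serial order.

    # Illustrative sketch: testing an interleaved schedule of transactions
    # for (conflict) serializability. Two operations conflict when they
    # touch the same item, come from different transactions, and at least
    # one of them is a write. The schedule is serializable in this sense
    # exactly when the resulting precedence graph has no cycle.
    def is_conflict_serializable(schedule):
        """schedule: list of (transaction_id, 'r' or 'w', item) in execution order."""
        edges = set()
        for i, (t1, op1, x1) in enumerate(schedule):
            for t2, op2, x2 in schedule[i + 1:]:
                if t1 != t2 and x1 == x2 and "w" in (op1, op2):
                    edges.add((t1, t2))        # t1's operation precedes t2's

        # Detect a cycle in the precedence graph with a depth-first search.
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {t: WHITE for t, _, _ in schedule}

        def has_cycle(u):
            color[u] = GRAY
            for a, b in edges:
                if a == u:
                    if color[b] == GRAY or (color[b] == WHITE and has_cycle(b)):
                        return True
            color[u] = BLACK
            return False

        return not any(color[t] == WHITE and has_cycle(t) for t in list(color))

    # T1 and T2 interleave their reads and writes of item 'x': not serializable.
    print(is_conflict_serializable([(1, "r", "x"), (2, "r", "x"),
                                    (1, "w", "x"), (2, "w", "x")]))   # False
    # T1 finishes with 'x' before T2 touches it: serializable.
    print(is_conflict_serializable([(1, "r", "x"), (1, "w", "x"),
                                    (2, "r", "x"), (2, "w", "x")]))   # True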

At SUNY Albany, Stearns began a long collaboration with Harry B. Hunt III. Problems of interest are often proven hard by “reductions” (mappings) from other problems known to be hard (assuming P ≠ NP). Hunt and Stearns studied deeper issues involving reductions. They looked for reductions that mapped to simple subsets of instances, because instances of practical interest might all be simple. They also looked for reductions of small size because the smaller the size, the stronger the conclusions that can be drawn about complexity. They formalized some of these ideas with the concept of a power index. Rather than looking for reductions one at a time, they looked for reduction principles which could be applied to many kinds of objects at once, particularly to various kinds of algebras. In this way, they showed that many problems were hard for the same simple reason.
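
For a flavor of what a reduction is, here is a textbook example (not one of the Hunt-Stearns reductions): a graph has an independent set of size k exactly when it has a vertex cover of size |V| - k, so any procedure for vertex cover can be reused to answer independent-set questions.

    # Illustrative sketch of a reduction between problems (a textbook
    # example, not one of the reductions from the Hunt-Stearns papers).
    # A graph has an independent set of size k exactly when it has a
    # vertex cover of size |V| - k.
    from itertools import combinations

    def reduce_independent_set_to_vertex_cover(graph, k):
        """graph: dict mapping vertex -> set of neighbours.
        Returns the corresponding vertex-cover instance (graph, |V| - k)."""
        return graph, len(graph) - k

    def has_vertex_cover(graph, size):
        """Brute-force check, used only to demonstrate the reduction."""
        edges = {(u, v) for u in graph for v in graph[u] if u < v}
        return any(all(u in cover or v in cover for u, v in edges)
                   for cover in map(set, combinations(graph, size)))

    # A triangle has a maximum independent set of size 1 (vertex cover of size 2).
    triangle = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"}}
    print(has_vertex_cover(*reduce_independent_set_to_vertex_cover(triangle, 1)))  # True
    print(has_vertex_cover(*reduce_independent_set_to_vertex_cover(triangle, 2)))  # False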

Stearns recounts his work with Harry Hunt III on computational complexity.

Hunt and Stearns studied sum-of-products problems in which the plus and times operators come from a commutative semiring. They showed that such problems can be solved with a reduced number of operations if the problem has good structure, as displayed in a structure tree. “Good structure” here means small “weighted depth” for top-down processing or small “channelwidth” for bottom-up processing. Channelwidth is a manifestation of “treewidth” when sum-of-products problems are viewed graphically, but the structure tree concept provides clarity when they are viewed algebraically. With the addition of an algebraic condition, they extended the structure tree concept to apply to quantified formulas. They also established a tight connection between sub-linear space and OR-of-AND problems whose treewidth is bounded by a sub-linear function.
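
A small illustration (invented here) of why structure reduces the operation count in a sum-of-products computation: by the distributive law of the semiring, a sum over all assignments to several variables can be factored so that variables are eliminated one at a time instead of enumerating every combination.

    # Illustrative sketch: evaluating a sum-of-products expression over a
    # commutative semiring (here ordinary + and *) in two ways. The factored
    # evaluation exploits the chain structure of the expression, in the
    # spirit of a good structure tree, and uses far fewer operations than
    # brute-force enumeration of all variable assignments.
    #
    #   value = sum over x1, x2, x3 in D of  f(x1, x2) * g(x2, x3)
    from itertools import product

    D = range(4)                       # a small domain for each variable
    f = lambda a, b: a + 2 * b         # arbitrary factors for the example
    g = lambda b, c: b * c + 1

    # Brute force: |D|**3 products.
    brute = sum(f(x1, x2) * g(x2, x3) for x1, x2, x3 in product(D, D, D))

    # Factored by the distributive law: eliminate x1 and x3 separately for
    # each value of x2, roughly |D|**2 operations instead of |D|**3.
    factored = sum(sum(f(x1, x2) for x1 in D) * sum(g(x2, x3) for x3 in D)
                   for x2 in D)

    print(brute, factored, brute == factored)   # the two values agree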

In collaboration with S. S. Ravi, Harry B. Hunt III, Daniel J. Rosenkrantz, and Madhav Marathe, Stearns has been working on dynamical systems.

In September 2000, Stearns retired to live with his wife in Slingerlands, N.Y. He still visits and collaborates with former colleagues at the University.

Author: Philip M. Lewis