BIRTH:

July 5, 1928 in Riga, Latvia

DEATH:

July 29, 2022, in Ithaca, NY

EDUCATION:

Cand. Phil., University of Marburg (1949, Physics); M.A., University of Kansas City (1951, Mathematics); Ph.D., California Institute of Technology (1955, Mathematics)

EXPERIENCE:

Instructor, Cornell University (1955-1957); Assistant Professor, Ohio State University (1957-1958); Research Mathematician, General Electric Research Laboratory (1958-1965); Professor, Computer Science Department, Cornell University (1965-); Chairman, Computer Science Department, Cornell University (1965-1971, again 1977-1982, and 1992-1993); Walter R. Read Professor of Engineering, Cornell University (from 1980)

HONORS AND AWARDS:

Association for Computing Machinery Turing Award (with R.E. Stearns), 1993; Member: National Academy of Engineering, 1989; Member: Latvian Academy of Sciences, 1990 (foreign member); Member: New York State Academy of Sciences, 1982; Fellow: Association for Computing Machinery, 1994; Fellow: American Academy of Arts & Sciences, 1992; Fellow: American Association for the Advancement of Science, 1981; Humboldt Foundation Senior US Scientist Award, 1993-94; B. Bolzano Gold Medal of the Academy of Sciences, Czech Republic, 1995; CRA Distinguished Service Award, 2000; Grand Medal, Latvian Academy of Science, 2001; he also received two honorary doctorates: University of Dortmund, Germany, 1995, and University of Missouri, Kansas City, 1999.
 

Juris Hartmanis

United States – 1993
CITATION

With Richard E. Stearns, in recognition of their seminal paper which established the foundations for the field of computational complexity theory.

Juris Hartmanis was the son of a senior Latvian army officer who was arrested when the Soviet Union occupied Latvia during World War II and subsequently died in prison.

Hartmanis discusses the death of his father following the Soviet occupation of Latvia in 1940.

At the end of the War, the remaining members of the family moved to Germany where Juris obtained an undergraduate degree in physics from the University of Marburg in 1949. He then moved to the United States to begin graduate work and eventually received an MA in 1951 and a PhD in 1955, both in mathematics.

Juris began his career with posts as an Instructor at Cornell University (1955–1957) and later as an Assistant Professor at Ohio State University (1957–1958). He then took a position as a Research Mathematician at the General Electric Research Laboratory, which he held until 1965.

In the last few years of his time at General Electric, he and Richard Stearns became interested in how much time and memory are needed to perform different computations – a field that they named computational complexity. They defined different classes of calculations according to how much computation time and/or memory space each would require, a measure of their difficulty. Their famous 1965 joint paper [7] began to interest other computer scientists in looking at complexity the same way.

Hartmanis discusses his years at General Electric, where he began his collaboration with Dick Stearns on complexity research.

In contrast to other approaches to complexity theory, Hartmanis and Stearns based their analysis on measuring the resources that algorithms use when run on specific machines. In order to have some generality, they used a Turing machine as their basic model. They were pleasantly surprised to discover a number of important mathematical theorems related to complexity analysis.
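
A toy illustration of this style of analysis, in Python (nothing here comes from the original papers; the machine encoding and the names are invented for the example, and a single tape is used for brevity where the papers analyze multi-tape machines): a minimal Turing machine simulator that counts the steps a machine takes on an input, the raw quantity that a time measure on Turing machines abstracts.

# Minimal single-tape Turing machine simulator (illustrative sketch only).
# A machine is a table mapping (state, symbol) -> (new_state, written_symbol, move),
# with move in {-1, +1}.
def run(delta, input_string, start="q0", accept="qa", reject="qr", blank="_"):
    tape = dict(enumerate(input_string))    # sparse tape; missing cells are blank
    head, state, steps = 0, start, 0
    while state not in (accept, reject):
        symbol = tape.get(head, blank)
        state, written, move = delta[(state, symbol)]
        tape[head] = written
        head += move
        steps += 1                          # the resource being measured: time
    return state == accept, steps

# Example machine: accepts strings consisting only of 0s, in time linear in the input length.
delta = {
    ("q0", "0"): ("q0", "0", +1),   # keep scanning right over 0s
    ("q0", "_"): ("qa", "_", +1),   # blank at the end of the input: accept
    ("q0", "1"): ("qr", "1", +1),   # any 1: reject
}

for w in ["", "000", "0010"]:
    accepted, steps = run(delta, w)
    print(w, accepted, steps)       # the step count grows linearly with the input length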

The full Turing Award citation for Hartmanis and Stearns notes:

In their paper On the Computational Complexity of Algorithms they provided a precise definition of the complexity measure defined by computation time on Turing machines and developed a theory of complexity classes. The paper sparked the imagination of many computer scientists and led to the establishment of complexity theory as a fundamental part of the discipline.

They introduced a basic concept called a computational complexity class. Informally, a class represents all the computations that can be done using a given amount of resources. For example, an individual class might contain all problems that use N² steps, where N is the size of the problem. Some of today’s most important theoretical problem areas are the relations between these classes, such as the relation between the class of problems that can be solved in an amount of time that can be expressed as a polynomial function of N, and those that scale more quickly than can be represented by a polynomial. The classic unsolved P=NP problem asks whether there are problems whose answers can be checked in polynomial time but will always take longer to solve, no matter how clever we are.
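
The "check versus solve" contrast can be made concrete with a small sketch (an illustration only, not drawn from the Hartmanis–Stearns papers; the subset-sum problem and the function names are chosen for the example): verifying a proposed subset takes time polynomial in the input size, while the obvious solver tries exponentially many subsets.

from itertools import combinations

# Subset sum: given a list of integers and a target, is there a subset summing to it?
def verify(numbers, target, certificate):
    # Polynomial-time check of a proposed certificate (a candidate subset).
    pool = list(numbers)
    for x in certificate:
        if x not in pool:
            return False
        pool.remove(x)
    return sum(certificate) == target

def solve(numbers, target):
    # Brute-force search: inspects up to 2^n subsets in the worst case.
    for k in range(len(numbers) + 1):
        for subset in combinations(numbers, k):
            if sum(subset) == target:
                return list(subset)
    return None

numbers, target = [3, 34, 4, 12, 5, 2], 9
witness = solve(numbers, target)             # finds e.g. [4, 5]
print(witness, verify(numbers, target, witness))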

Hartmanis discusses his work with Stearns to define complexity classes and its relationship to Turing machine experiments.

Hartmanis’ work on the foundations of complexity theory was instrumental in establishing computer science as a formal discipline distinct from mathematics, physics and electrical engineering. In the context of later work by Michael Rabin, Manuel Blum and others, Hartmanis and Stearns, in their seminal 1964 (conference) and 1965 (journal) papers [5, 6], provided the basis for the development of complexity theory as it is now studied. In particular, they showed how Turing’s work in defining the limitations of “what is computable” could be extended to a model for “what is efficiently computable”.

Many seminal results and insights were included in these two influential early papers. The multi-tape Turing machine was shown to be a precise and helpful model for sequential time complexity analysis. They showed the importance of asymptotic behavior, and used an extension of Yamada’s real-time counting functions [1] to establish the existence of a hierarchy of complexity classes.
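
In today's notation the hierarchy result is usually stated roughly as follows (a paraphrase for orientation, not a formula quoted from the papers), where DTIME(t) denotes the languages decidable by a multi-tape Turing machine in time O(t(n)) and t_2 is assumed to be time-constructible:

\[
  t_1(n)\,\log t_1(n) = o\bigl(t_2(n)\bigr)
  \;\Longrightarrow\;
  \mathrm{DTIME}\bigl(t_1(n)\bigr) \subsetneq \mathrm{DTIME}\bigl(t_2(n)\bigr).
\]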

The time-bounded complexity results were soon followed by a Lewis, Stearns and Hartmanis paper [8] establishing a similar hierarchy for space-bounded computation. In addition to the tighter hierarchy results, this paper defined a precise concept of sublinear space, and showed that problems such as context-free language recognition could be solved with O(log² n) space. The divide-and-conquer method used in this result was the starting point for Walter Savitch’s deterministic S² space simulation of any non-deterministic space S ≥ log n computation. This in turn led to studies of the tradeoff between parallel time and space, reinforcing the importance of sublinear space and of the question of whether or not non-deterministic space can be deterministically simulated without any loss of efficiency.
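
The divide-and-conquer pattern behind these space bounds can be sketched in a few lines (an illustration of the recursion only, not code from the papers; the explicit graph below stands in for the configuration graph of a space-bounded machine, and all names are invented): to decide whether v is reachable from u in at most k steps, try every vertex as a midpoint and recurse on paths of half the length, so the recursion depth is only about log k.

# Can we get from u to v in at most k steps? Recursing on a midpoint keeps the
# recursion depth near log k; applied to the configuration graph of an S-space
# machine (about 2^S configurations), this kind of recursion yields an
# S^2-space deterministic simulation.
def reachable(graph, u, v, k):
    if u == v:
        return True
    if k <= 0:
        return False
    if k == 1:
        return v in graph.get(u, ())
    half_up, half_down = (k + 1) // 2, k // 2
    return any(
        reachable(graph, u, w, half_up) and reachable(graph, w, v, half_down)
        for w in graph
    )

graph = {"a": {"b"}, "b": {"c"}, "c": {"d"}, "d": set()}   # tiny adjacency dict
print(reachable(graph, "a", "d", 4))   # True: a -> b -> c -> d
print(reachable(graph, "d", "a", 4))   # False: d has no outgoing edges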

Hartmanis discusses work on the P=NP problem and its relationship to nondeterminism carried out with Neil Immerman.

Hartmanis established complexity theory as the dominant theme in theoretical computer science with his major results concerning the structural nature of NP sets. In particular, he explored the question of whether or not all NP complete sets were isomorphic, i.e., whether all NP complete sets are basically the same set. Hartmanis and his student Leonard C. Berman showed that all natural NP complete sets are isomorphic (under polynomial time reductions), and further showed that complete sets computable in exponential time cannot be sparse.

This Berman–Hartmanis conjecture is important. As Hartmanis himself explained it:

We conjectured that they are polynomial time isomorphic, which is a strictly defined term; that in structure they are basically identical—that one is just a permutation of another. And in some sense this conjecture could be used to date what is now known as structural complexity theory. And structural complexity theory refers to relating the structure of the different complexity class to each other. So one does not ask about less specific algorithms, but more interested in what are big, more global, structural relationships, like the question of do all NP complete sets really look the same, or are there different ones. [2]
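
In modern notation the conjecture reads roughly as follows (a paraphrase for orientation, not a formula from the paper), where FP denotes the class of polynomial-time computable functions:

\[
  A, B \ \text{NP-complete}
  \;\Longrightarrow\;
  \exists\, f \colon \Sigma^{*} \to \Sigma^{*} \ \text{bijective, with } f, f^{-1} \in \mathrm{FP},
  \ \text{such that } x \in A \iff f(x) \in B.
\]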

Hartmanis’ seminal research paralleled his leadership as chair of the Cornell computer science department and his active participation in numerous advisory boards.

Hartmanis believes that computational complexity—the study of the quantitative laws that govern computation—is an essential part of the science needed to guide and exploit computer technology.

Author: Allan Borodin



[1] H. Yamada, “Real-time computation and recursive functions not real-time computable,” IRE Transactions on Electronic Computers, EC-11 (1962), pp. 753–760.

[2] From an IEEE transcript of an oral history interview of Juris Hartmanis conducted in 1991 by Andrew Goldstein, IEEE History Center, New Brunswick, NJ, USA, available from

http://www.ieeeghn.org/wiki/index.php/Oral-History:Juris_Hartmanis