
Yann LeCun

United States – 2018
Short Annotated Bibliography
  1. LeCun, Y., B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, and L. D. Jackel. “Backpropagation applied to handwritten zip code recognition,” Neural Computation, vol. 1, no. 4, December 1989, pp. 541–551.
    Written soon after LeCun’s arrival at Bell Labs, this paper describes the successful application by the Adaptive Systems Research department of the new back-propagation techniques developed by LeCun in his doctoral thesis and, independently, by his co-awardee Geoffrey Hinton. The zip code recognition described here was an early stage of the work later applied to automated check clearing.
  2. LeCun, Y., L. Bottou, Y. Bengio, and P. Haffner. “Gradient-based learning applied to document recognition,” Proceedings of the IEEE, vol. 86, 1998, pp. 2278–2324.
    Describes the pathbreaking application of convolutional neural networks to handwriting recognition by a Bell Labs group including LeCun’s co-awardee Yoshua Bengio. The paper introduced graph transformer networks, a new approach able to train networks composed of specialized modules. It uses Bell Labs’ successful check recognition system as its main example.
  3. LeCun, Y., J. S. Denker, and S. A. Solla. “Optimal Brain Damage,” Advances in Neural Information Processing Systems (NIPS), vol. 2, 1989, pp. 598–605.
    LeCun worked with his Bell Labs colleagues to devise a method to simplify neural networks by removing unnecessary connections. Done properly, this “brain damage” could produce simpler, faster networks that performed as well or better than the full-size version.
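The pruning criterion at the heart of the paper can be sketched in a few lines. In this illustrative NumPy version (the function name and toy numbers are ours, not the paper's), each weight's "saliency" is the paper's second-order estimate of the loss increase caused by deleting it, computed from the diagonal Hessian terms; in practice those terms come from an extra backward pass through the network.

```python
import numpy as np

def obd_prune(weights, hessian_diag, prune_fraction=0.3):
    """Zero out the lowest-saliency weights, Optimal Brain Damage style.

    Saliency of weight w_k with diagonal Hessian term h_kk is
    s_k = h_kk * w_k**2 / 2, the paper's second-order estimate of
    how much the training loss rises if w_k is removed.
    """
    saliency = 0.5 * hessian_diag * weights**2
    n_prune = int(prune_fraction * weights.size)
    # indices of the n_prune smallest saliencies
    prune_idx = np.argsort(saliency, axis=None)[:n_prune]
    pruned = weights.copy()
    pruned.flat[prune_idx] = 0.0
    return pruned

# Toy example: the small weight and the small-Hessian weight are cut,
# while the two high-saliency weights survive.
w = np.array([0.1, -2.0, 0.5, 0.01])
h = np.array([1.0, 1.0, 4.0, 2.0])
print(obd_prune(w, h, prune_fraction=0.5))
```

Note the contrast with naive magnitude pruning: a large weight sitting in a flat direction of the loss (small h_kk) can still be safe to remove, which is exactly what the Hessian term captures.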
  4. LeCun, Y., Y. Bengio, and G. E. Hinton. “Deep Learning,” Nature, vol. 521, 2015, pp. 436–444.
    A recent and accessible summary of the methods that LeCun and his co-winners termed “deep learning” because of their reliance on neural networks with multiple specialized layers of neurons between input and output nodes. It responded to a surge of interest in their work following the successful demonstration of these methods for object categorization, face identification, and speech recognition.