
Yann LeCun

United States – 2018
Short Annotated Bibliography
  1. LeCun, Y., B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, and L. D. Jackel. “Backpropagation applied to handwritten zip code recognition,” Neural Computation, vol. 1, no. 4, December 1989, pp. 541–551.
    Written soon after LeCun’s arrival at Bell Labs, this paper describes the Adaptive Systems Research department’s successful application of the new back-propagation techniques developed by LeCun in his doctoral thesis and, independently, by his co-awardee Geoffrey Hinton. The zip code recognition described here was an early stage of the work later applied to automated check clearing.
  2. LeCun, Y., L. Bottou, Y. Bengio, and P. Haffner. “Gradient-based learning applied to document recognition,” Proceedings of the IEEE, vol. 86, 1998, pp. 2278–2324.
    Describes the pathbreaking application of convolutional neural networks to handwriting recognition by a Bell Labs group including LeCun’s co-awardee Yoshua Bengio. The paper introduced graph transformer networks, a new approach able to train networks composed of specialized modules. It uses Bell Labs’ successful check recognition system as its main example.
  3. LeCun, Y., J. S. Denker, and S. A. Solla. “Optimal Brain Damage,” Advances in Neural Information Processing Systems (NIPS), vol. 2, 1989, pp. 598–605.
    LeCun worked with his Bell Labs colleagues to devise a method for simplifying neural networks by removing unnecessary connections. Done properly, this “brain damage” could produce smaller, faster networks that performed as well as or better than the full-size version.
  4. LeCun, Y., Y. Bengio, and G. E. Hinton. “Deep learning,” Nature, vol. 521, 2015, pp. 436–444.
    A recent and accessible summary of the methods that LeCun and his co-awardees termed “deep learning” because of their reliance on neural networks with multiple specialized layers of neurons between input and output nodes. It was written in response to a surge of interest in their work following the successful demonstration of these methods for object categorization, face identification, and speech recognition.