Natural language processing

  • Jeffrey R. Sampson
Part of the Texts and Monographs in Computer Science book series (MCS)


Despite the apparent lack of effect, people frequently talk to their machines. To replace such fruitless monologs with productive dialogs is probably the most important and most ambitious goal of artificial intelligence. Since nearly all of man’s intellectual activities involve language, a full mechanical language processing capability would seem to imply competence in most aspects of human intelligence.
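The earliest programs to sustain such dialogs, like Weizenbaum's ELIZA (see the bibliography below), did so by shallow pattern substitution rather than genuine understanding. A minimal sketch of that idea, with a hypothetical rule set of my own (not Weizenbaum's original script), might look like this:

```python
# Illustrative ELIZA-style responder (not Weizenbaum's original program).
# Each rule pairs a keyword pattern with a response template; matched
# fragments of the user's utterance are echoed back, producing the
# surface appearance of dialog without any real comprehension.
import re

# Hypothetical miniature rule set: (pattern, response template).
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r".*mother.*", "Tell me more about your family."),
]

def respond(utterance: str) -> str:
    """Return the response for the first rule whose pattern matches."""
    text = utterance.lower().strip(".!?")
    for pattern, template in RULES:
        m = re.fullmatch(pattern, text)
        if m:
            return template.format(*m.groups())
    return "Please go on."   # content-free fallback, in ELIZA's spirit

print(respond("I need a vacation"))  # Why do you need a vacation?
print(respond("I am tired"))         # How long have you been tired?
```

The trick, of course, is that nothing here models meaning; the program's apparent competence collapses as soon as an utterance falls outside its rule set, which is precisely why later systems such as those of Quillian, Schank, and Winograd turned to semantic representations.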






  1. McCalla, Gordon I., and Sampson, Jeffrey R. "MUSE: A Model to Understand Simple English." Communications of the ACM, 15, 1972, pp. 29–40. An improved version of Quillian's TLC system.
  2. Minsky, Marvin (ed.). Semantic Information Processing. MIT Press, 1968. A collection of early papers, mostly abridged versions of MIT Ph.D. theses. Many important early systems are described, including Raphael's SIR and Quillian's thesis model.
  3. Quillian, M. Ross. "The Teachable Language Comprehender: A simulation program and theory of language." Communications of the ACM, 12, 1969, pp. 459–476. The basic description of TLC.
  4. Schank, Roger C. "Conceptual Dependency: A theory of natural language understanding." Cognitive Psychology, 3, 1972, pp. 552–631. A thorough discussion of Schank's theory at an intermediate stage of its development.
  5. Schank, Roger C. The Fourteen Primitive Actions and their Inferences. Stanford Artificial Intelligence Memo AIM-183, March 1973. A full discussion of the role of ACTs in Conceptual Dependency Theory.
  6. Schank, Roger C., and Colby, Kenneth M. (eds.). Computer Models of Thought and Language. Freeman, 1973. A recent collection of papers, including major contributions by Schank, Winograd, and other workers in the field.
  7. Turing, A. M. "Computing machinery and intelligence." Mind, 59, 1950, pp. 433–460. Reprinted in the Feigenbaum and Feldman collection (see the Chapter 12 Bibliography). The Turing test, proposed and defended.
  8. Weizenbaum, Joseph. "ELIZA—A computer program for the study of natural language communication between man and machine." Communications of the ACM, 9, 1966, pp. 36–45. The original description of ELIZA.
  9. Winograd, Terry. "Understanding natural language." Cognitive Psychology, 3, 1972, pp. 1–191. The entire issue devoted to a condensation of Winograd's thesis. The same material has been published as a book (Understanding Natural Language, 1972) by Academic Press.
  10. Winograd, Terry. Five Lectures on Artificial Intelligence. Stanford Artificial Intelligence Memo AIM-246, September 1974.

Copyright information

© Springer-Verlag New York Inc. 1976

Authors and Affiliations

  • Jeffrey R. Sampson, Department of Computing Science, The University of Alberta, Edmonton, Canada
