• J. W. Lloyd
Part of the Cognitive Technologies book series (COGTECH)


In this chapter, the application of the logic to knowledge representation is studied. The main idea is the identification of a class of terms, called basic terms, suitable for representing individuals in diverse applications; for example, the class is well suited to machine-learning applications. From a (higher-order) programming language perspective, basic terms are data values. The most interesting aspect of the class of basic terms is that it includes certain abstractions and is therefore wider than the classes normally considered for knowledge representation. These abstractions allow one to model sets, multisets, and similar data types directly. Of course, there are other ways of introducing (extensional) sets, multisets, and so on, without using abstractions: for example, one can define abstract data types, or one can introduce data constructors with special equality theories. The primary advantage of the approach adopted here is that these abstractions can also be defined intensionally, as shown in Chap. 5. Techniques for defining metrics and kernels on basic terms are also investigated in this chapter.
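The two ideas in this paragraph — abstractions as data, and kernels defined on top of them — can be illustrated with a minimal Python sketch. This is not the book's notation or formalism: all names are hypothetical, a set is modelled as its characteristic function, a multiset as a multiplicity function, and the kernel is a simple cross-product (convolution-style) kernel over set elements.

```python
# Illustrative sketch only; names are hypothetical, not from the book.

def set_abstraction(elements):
    """A set modelled as its characteristic function: x -> True/False."""
    members = frozenset(elements)
    return lambda x: x in members

def multiset_abstraction(counts):
    """A multiset modelled as a function x -> multiplicity (0 by default)."""
    table = dict(counts)
    return lambda x: table.get(x, 0)

def matching_kernel(x, y):
    """Kernel on atomic terms: 1 if identical, else 0."""
    return 1 if x == y else 0

def set_kernel(s, t, k_elem=matching_kernel):
    """Cross-product (convolution-style) kernel on finite sets:
    sum of the element-level kernel over all pairs. With the
    matching kernel this equals the size of the intersection."""
    return sum(k_elem(x, y) for x in s for y in t)

is_small = set_abstraction([1, 2, 3])
print(is_small(2), is_small(9))          # True False

count = multiset_abstraction({"a": 2, "b": 1})
print(count("a"), count("z"))            # 2 0

print(set_kernel({1, 2, 3}, {2, 3, 4}))  # 2
```

With the matching kernel, `set_kernel` reduces to the intersection size; substituting a richer element kernel for `k_elem` gives kernels on sets of structured terms, in the spirit of the convolution kernels of Haussler cited in the bibliographical notes.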






Copyright information

© J. W. Lloyd 2003

Authors and Affiliations

• J. W. Lloyd
  1. Research School of Information Sciences and Engineering, Computer Sciences Laboratory, The Australian National University, Canberra, Australia
