Machine Learning and Data Mining

  • Wolfgang Ertel
Part of the Undergraduate Topics in Computer Science book series (UTICS)

Abstract

One of the major AI applications is the development of intelligent autonomous robots. Since flexibility and adaptivity are important features of truly intelligent agents, research into learning mechanisms and the development of machine learning algorithms is one of the most important branches of AI. After motivating and introducing basic concepts of machine learning such as classification and approximation, this chapter presents basic supervised learning algorithms, namely the perceptron, nearest neighbor methods, and decision tree induction. Unsupervised clustering methods and data mining software tools complete the picture of this fascinating field.
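Of the supervised algorithms named in the abstract, the perceptron is the simplest to state concretely. The following is a minimal sketch of Rosenblatt-style perceptron learning for two-class data with labels ±1; the toy data set, learning rate, and function names are illustrative assumptions, not taken from the chapter itself.

```python
def perceptron_train(samples, labels, epochs=100, lr=1.0):
    """Learn weights w and bias b so that sign(w.x + b) matches each label (+1/-1).

    Toy sketch of perceptron learning; samples are tuples of floats,
    labels are +1 or -1. Converges only for linearly separable data.
    """
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:  # misclassified: move the hyperplane towards x
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
                errors += 1
        if errors == 0:  # no mistakes in a full pass: training data is separated
            break
    return w, b

def perceptron_predict(w, b, x):
    """Classify x by the sign of the learned linear function."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Hypothetical linearly separable data: +1 roughly above the line x1 + x2 = 1.
X = [(0.0, 0.0), (1.0, 1.0), (0.2, 0.1), (0.9, 0.8)]
y = [-1, 1, -1, 1]
w, b = perceptron_train(X, y)
```

Because the toy data are linearly separable, the loop terminates after a few passes with a hyperplane that classifies all four training points correctly.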

Keywords

Decision tree, training data, learning algorithm, Bayesian network, Voronoi diagram
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

References

  1. [Alp04]
    E. Alpaydin. Introduction to Machine Learning. MIT Press, Cambridge, 2004.
  2. [BCDS08]
    A. Billard, S. Calinon, R. Dillmann, and S. Schaal. Robot programming by demonstration. In B. Siciliano and O. Khatib, editors, Handbook of Robotics, pages 1371–1394. Springer, Berlin, 2008.
  3. [BFOS84]
    L. Breiman, J. Friedman, R. A. Olshen, and C. J. Stone. Classification and Regression Trees. Wadsworth, Belmont, 1984.
  4. [Bis06]
    C. M. Bishop. Pattern Recognition and Machine Learning. Springer, New York, 2006.
  5. [Bra01]
    B. Brabec. Computergestützte regionale Lawinenprognose (Computer-aided regional avalanche forecasting). PhD thesis, ETH Zürich, 2001.
  6. [Cle79]
    W. S. Cleveland. Robust locally weighted regression and smoothing scatterplots. J. Am. Stat. Assoc., 74(368):829–836, 1979.
  7. [DHS01]
    R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. Wiley, New York, 2001.
  8. [DNM98]
    C. L. Blake, D. J. Newman, S. Hettich, and C. J. Merz. UCI repository of machine learning databases, 1998. http://www.ics.uci.edu/~mlearn/MLRepository.html.
  9. [ES99]
    W. Ertel and M. Schramm. Combining data and knowledge by MaxEnt-optimization of probability distributions. In PKDD'99 (3rd European Conference on Principles and Practice of Knowledge Discovery in Databases), LNCS, volume 1704, pages 323–328. Springer, Prague, 1999.
  10. [HTF09]
    T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, Berlin, 3rd edition, 2009. Online version: http://www-stat.stanford.edu/~tibs/ElemStatLearn/.
  11. [Jen01]
    F. V. Jensen. Bayesian Networks and Decision Graphs. Springer, Berlin, 2001.
  12. [Jor99]
    M. I. Jordan, editor. Learning in Graphical Models. MIT Press, Cambridge, 1999.
  13. [MDBM00]
    G. Melancon, I. Dutour, and G. Bousque-Melou. Random generation of DAGs for graph drawing. Technical Report INS-R0005, Dutch Research Center for Mathematical and Computer Science (CWI), 2000. http://ftp.cwi.nl/CWIreports/INS/INS-R0005.pdf.
  14. [Mit97]
    T. Mitchell. Machine Learning. McGraw–Hill, New York, 1997. www-2.cs.cmu.edu/~tom/mlbook.html.
  15. [MP69]
    M. Minsky and S. Papert. Perceptrons. MIT Press, Cambridge, 1969.
  16. [Qui93]
    J. R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, 1993. C4.5 download: http://www.rulequest.com/Personal, C5.0 download: http://www.rulequest.com/download.html.
  17. [RGH+06]
    M. Riedmiller, T. Gabel, R. Hafner, S. Lange, and M. Lauer. Die Brainstormers: Entwurfsprinzipien lernfähiger autonomer Roboter (The Brainstormers: design principles for learning autonomous robots). Inform.-Spektrum, 29(3):175–190, 2006.
  18. [Ric83]
    E. Rich. Artificial Intelligence. McGraw–Hill, New York, 1983.
  19. [Ros58]
    F. Rosenblatt. The perceptron: a probabilistic model for information storage and organization in the brain. Psychol. Rev., 65:386–408, 1958. Reprint in [AR88], pages 92–114.
  20. [RW06]
    C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, Cambridge, 2006. Online version: http://www.gaussianprocess.org/gpml/chapters/.
  21. [SA94]
    S. Schaal and C. G. Atkeson. Robot juggling: implementation of memory-based learning. IEEE Control Syst. Mag., 14(1):57–71, 1994.
  22. [Sch04]
    A. Schwartz. SpamAssassin. O'Reilly, Cambridge, 2004. SpamAssassin homepage: http://spamassassin.apache.org.
  23. [SE00]
    M. Schramm and W. Ertel. Reasoning with probabilities and maximum entropy: the system PIT and its application in LEXMED. In K. Inderfurth et al., editors, Operations Research Proceedings (SOR'99), pages 274–280. Springer, Berlin, 2000.
  24. [SE10]
    M. Schneider and W. Ertel. Robot learning by demonstration with local Gaussian process regression. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'10), 2010.
  25. [SW76]
    C. E. Shannon and W. Weaver. Mathematische Grundlagen der Informationstheorie (Mathematical foundations of information theory). Oldenbourg, Munich, 1976.
  26. [Tur50]
    A. M. Turing. Computing machinery and intelligence. Mind, 59:433–460, 1950.
  27. [WF01]
    I. Witten and E. Frank. Data Mining. Hanser, Munich, 2001. Data mining Java library WEKA: www.cs.waikato.ac.nz/~ml/weka.

Copyright information

© Springer-Verlag London Limited 2011

Authors and Affiliations

  1. FB Elektrotechnik und Informatik, Hochschule Ravensburg-Weingarten, University of Applied Sciences, Weingarten, Germany