Batch classifications with discrete finite mixtures

  • Petri Kontkanen
  • Petri Myllymäki
  • Tomi Silander
  • Henry Tirri
Multiple Models for Classification
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1398)


In this paper we study batch classification problems where multiple predictions are made simultaneously, in contrast to the standard independent classification case, where the predictions are made independently one at a time. The main contribution of this paper is to demonstrate how the standard EM algorithm for finite mixture models can be modified for the batch classification case. In the empirical part of the paper, the results obtained by the batch classification approach are compared to those obtained by independent predictions.
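The abstract refers to the standard EM algorithm for finite mixture models but does not reproduce it. As an illustration only, here is a minimal EM sketch for a finite mixture of independent categorical (discrete) distributions; the function name, the Laplace smoothing, and the synthetic data are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def em_discrete_mixture(X, n_components, n_values, n_iter=50, seed=0):
    """EM for a finite mixture of independent categorical distributions.

    X: (n_samples, n_features) integer array with entries in [0, n_values).
    Returns mixing weights, per-component categorical parameters theta,
    and the final responsibilities.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Random initialisation of the responsibilities (soft cluster assignments).
    resp = rng.dirichlet(np.ones(n_components), size=n)
    theta = np.zeros((n_components, d, n_values))
    for _ in range(n_iter):
        # M-step: update mixing weights and categorical parameters
        # (Laplace smoothing keeps the log-probabilities finite).
        weights = resp.sum(axis=0) / n
        for k in range(n_components):
            for j in range(d):
                counts = np.bincount(X[:, j], weights=resp[:, k],
                                     minlength=n_values)
                theta[k, j] = (counts + 1.0) / (counts.sum() + n_values)
        # E-step: recompute responsibilities from the current parameters.
        log_p = np.tile(np.log(weights), (n, 1))
        for k in range(n_components):
            log_p[:, k] += np.log(theta[k, np.arange(d)[None, :], X]).sum(axis=1)
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stabilisation
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
    return weights, theta, resp

# Hypothetical usage on synthetic binary data from two well-separated sources.
rng = np.random.default_rng(1)
X = np.vstack([(rng.random((50, 4)) < 0.1).astype(int),   # mostly zeros
               (rng.random((50, 4)) < 0.9).astype(int)])  # mostly ones
weights, theta, resp = em_discrete_mixture(X, n_components=2, n_values=2)
```

In the batch setting the paper studies, the test vectors would also enter the E-step so that the predictions for the whole batch are made jointly; the sketch above shows only the standard independent-data EM that the modification starts from.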


Keywords: Expectation Maximization algorithm, test vector, finite mixture model



Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Petri Kontkanen¹
  • Petri Myllymäki¹
  • Tomi Silander¹
  • Henry Tirri¹
  1. Department of Computer Science, Complex Systems Computation Group (CoSCO), Finland
