Mathematical Programming Approach to a Minimax Theorem of Statistical Discrimination Applicable to Pattern Recognition

  • Chapter
Optimization Techniques IFIP Technical Conference

Part of the book series: Lecture Notes in Computer Science (LNCS)

Abstract

Let $k_i$ ($i = 1, 2$) be two families of all possible $p$-variate distribution functions with specified mean vectors $\mu_i$ and non-degenerate variance-covariance matrices $\Sigma_i$, and let $\pi_i$ be the prior probability or weight assigned to $k_i$, with $\pi_1 + \pi_2 = 1$. We wish to discriminate whether an observation $\underline{x}$ comes from a (true) distribution $F_1 \in k_1$ or $F_2 \in k_2$. A randomized decision rule is represented by a pair of functions $\phi_1(\underline{x})$ and $\phi_2(\underline{x}) = 1 - \phi_1(\underline{x})$ with $0 \leqslant \phi_1(\underline{x}) \leqslant 1$, based on which one decides, with probability $\phi_i(\underline{x})$, that the observed value $\underline{x}$ is a sample from some $F_i$ in $k_i$ ($i = 1, 2$). If the pair $F = (F_1, F_2)$ is known, the error probability or classification error of the decision rule $\phi = (\phi_1, \phi_2)$ is clearly given by

$$e(\phi, F) = \pi_1 \int_{R^p} \phi_2(\underline{x})\, dF_1(\underline{x}) + \pi_2 \int_{R^p} \phi_1(\underline{x})\, dF_2(\underline{x}) \qquad (1.1)$$
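
To make (1.1) concrete, the following minimal sketch estimates the classification error by Monte Carlo for one hypothetical choice of $F_1$ and $F_2$ (bivariate normals with specified means and covariances) and a simple non-randomized midpoint rule for $\phi_1$. The distributions, the rule, and all parameter values are illustrative assumptions, not the chapter's construction; they only show how the error functional is evaluated for a given pair $(\phi, F)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Specified mean vectors, covariance matrices and prior weights (pi_1 + pi_2 = 1).
# These numbers, and the normal choice of F_1 and F_2, are purely illustrative.
mu1, mu2 = np.array([0.0, 0.0]), np.array([2.0, 1.0])
S1 = np.array([[1.0, 0.2], [0.2, 1.0]])
S2 = np.array([[1.5, -0.3], [-0.3, 1.0]])
pi1, pi2 = 0.5, 0.5

def phi1(x):
    """phi_1(x): probability of deciding 'class 1'; here a hard (0/1) midpoint rule."""
    m = 0.5 * (mu1 + mu2)   # midpoint between the two mean vectors
    w = mu1 - mu2           # direction separating the means
    return ((x - m) @ w > 0.0).astype(float)

# Monte Carlo estimate of (1.1):
#   e(phi, F) = pi_1 * E_{F_1}[phi_2(X)] + pi_2 * E_{F_2}[phi_1(X)],  phi_2 = 1 - phi_1
n = 100_000
x1 = rng.multivariate_normal(mu1, S1, size=n)   # samples from F_1 in k_1
x2 = rng.multivariate_normal(mu2, S2, size=n)   # samples from F_2 in k_2
err = pi1 * np.mean(1.0 - phi1(x1)) + pi2 * np.mean(phi1(x2))
print(f"estimated classification error e(phi, F): {err:.4f}")
```

A minimax analysis of the kind discussed in the chapter would instead consider the worst-case value of $e(\phi, F)$ over all $F_1 \in k_1$ and $F_2 \in k_2$ sharing the given moments, rather than a single fixed pair as in this sketch.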



Copyright information

© 1975 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Isii, K., Taga, Y. (1975). Mathematical Programming Approach to a Minimax Theorem of Statistical Discrimination Applicable to Pattern Recognition. In: Marchuk, G.I. (eds) Optimization Techniques IFIP Technical Conference. Lecture Notes in Computer Science. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-38527-2_48


  • DOI: https://doi.org/10.1007/978-3-662-38527-2_48

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-37713-0

  • Online ISBN: 978-3-662-38527-2

  • eBook Packages: Springer Book Archive
