Classification

  • Chapter

Abstract

The final step in biometric recognition, after the data have been preprocessed with normalization and feature extraction, is the application of a classifier to perform recognition over the transformed sample space. Although classification could, in principle, be applied directly to the sample set with no prior preprocessing, we opted against doing so: our preprocessors expose features to the classifier that might otherwise be obscured by overfitting, and they reduce the dimensionality of the samples' feature space to a size that all of the popular classifiers can handle effectively. With computational efficiency no longer a constraint, the goal of the classifier becomes finding the boundaries in the transformed sample space that best separate the feature spaces of the different subjects. Internally, classifiers use a number of techniques to discover these boundaries; they are also governed by initialization parameters that can be adjusted to improve recognition performance. Consequently, our classification goal, for any given classifier, can be addressed by solving an optimization problem; namely, discovering the classification parameters that optimize recognition performance.
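The abstract's framing of classification as parameter optimization can be illustrated with a minimal sketch. This is not the chapter's own method: it uses a hypothetical k-nearest-neighbour classifier on synthetic two-subject data, and sweeps the single parameter k to find the value that maximizes recognition accuracy on a held-out validation set.

```python
import random

def knn_predict(train, query, k):
    """Majority vote among the k training samples nearest to `query`."""
    by_dist = sorted(train,
                     key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], query)))
    votes = [label for _, label in by_dist[:k]]
    return max(set(votes), key=votes.count)

def accuracy(train, val, k):
    """Fraction of validation samples the classifier labels correctly."""
    return sum(knn_predict(train, x, k) == y for x, y in val) / len(val)

random.seed(0)

def sample(center, label, n):
    """Synthetic 'subject': a Gaussian cluster in a 2-D feature space."""
    return [((random.gauss(center[0], 1.0), random.gauss(center[1], 1.0)), label)
            for _ in range(n)]

train = sample((0, 0), "A", 30) + sample((3, 3), "B", 30)
val = sample((0, 0), "A", 10) + sample((3, 3), "B", 10)

# The optimization step: choose the classifier parameter (here, k) that
# maximizes recognition performance on the validation data.
best_k = max(range(1, 10, 2), key=lambda k: accuracy(train, val, k))
```

In practice the search space is larger (kernel widths, regularization weights, network topologies), but the shape of the problem is the same: an objective (recognition performance) maximized over the classifier's free parameters.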



Author information

Correspondence to James Eric Mason.

Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Mason, J.E., Traoré, I., Woungang, I. (2016). Classification. In: Machine Learning Techniques for Gait Biometric Recognition. Springer, Cham. https://doi.org/10.1007/978-3-319-29088-1_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-29086-7

  • Online ISBN: 978-3-319-29088-1