Multivariate Direction Scoring for Dimensionality Reduction in Classification Problems

  • Giorgio Biagetti
  • Paolo Crippa
  • Laura Falaschetti
  • Simone Orcioni
  • Claudio Turchetti
Conference paper, part of the Smart Innovation, Systems and Technologies book series (SIST, volume 56)


Dimensionality reduction is the process of reducing the number of features in a data set. For classification problems, the proposed formula makes it possible to sort a set of directions to be used for data projection according to a score that estimates their capability of discriminating between the data classes. The number of features can then be reduced by selecting a subset of these directions and projecting the data onto the subspace they span. The projection vectors can be derived from a spectral representation or chosen in other ways. When the vectors are eigenvectors of the data covariance matrix, the proposed score is intended to replace the eigenvalues as the criterion for ordering the eigenvectors.
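The abstract does not state the scoring formula itself, but the overall procedure it describes can be sketched as follows: take candidate directions (here, eigenvectors of the data covariance matrix, as in PCA), score each one by how well the projected data separates the classes, and keep the top-scoring subset. The Fisher-style ratio of between-class to within-class variance used below is an illustrative stand-in, not the authors' actual score; function names (`rank_directions`, `project`) are likewise hypothetical.

```python
import numpy as np

def rank_directions(X, y):
    """Rank covariance eigenvectors by a class-discriminability score.

    Illustrative sketch: the score is a Fisher-like ratio of between-class
    to within-class variance along each direction, standing in for the
    paper's proposed formula.
    """
    # Candidate directions: eigenvectors of the data covariance matrix.
    cov = np.cov(X, rowvar=False)
    _, vecs = np.linalg.eigh(cov)

    classes = np.unique(y)
    scores = []
    for v in vecs.T:
        proj = X @ v  # project all samples onto this direction
        class_means = np.array([proj[y == c].mean() for c in classes])
        within = np.mean([proj[y == c].var() for c in classes])
        # Between-class spread over within-class spread (higher = more
        # discriminative); small epsilon guards against division by zero.
        scores.append(class_means.var() / (within + 1e-12))

    order = np.argsort(scores)[::-1]  # score-based ordering, not eigenvalue-based
    return vecs[:, order], np.array(scores)[order]

def project(X, directions, k):
    """Reduce dimensionality by projecting onto the k top-scored directions."""
    return X @ directions[:, :k]
```

On data whose class separation lies along a low-variance axis, this ordering keeps the discriminative direction first, whereas eigenvalue ordering would rank it last; that is the kind of situation the score-based ordering is meant to address.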


Keywords: Principal Component Analysis, Dimensionality Reduction, Linear Discriminant Analysis, Classification Problem, Speaker Identification



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

DII—Dipartimento di Ingegneria dell'Informazione, Università Politecnica delle Marche, Ancona, Italy
