Abstract
A key factor in classifiers based on the normal (Gaussian) distribution is the modeling of covariance matrices. When the number of available training pixels is limited, as is often the case in hyperspectral image classification, it is necessary to limit the complexity of these covariance models. An alternative to reducing the complexity uniformly over the whole feature space is to form orthogonal subspaces and reduce the model complexity within each of them separately, e.g., forming full-complexity within-class (interior-class) subspace models and reduced-complexity exterior-class subspace models. We propose to create the subspaces by forming fewer, wider spectral bands instead of using the more general principal component analysis (PCA) transform, in an attempt to exploit a priori knowledge of the data and obtain more generalizable subspaces. We investigate the resulting classifiers by studying their performance on four hyperspectral data sets. On each data set, experiments were run using different training set sizes. The results indicate that the classifiers benefit from this more data-specific approach to forming subspaces.
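The core idea of forming fewer, wider spectral bands can be sketched as follows. This is a minimal illustration only, not the paper's algorithm: the paper derives its band groupings from the data (piecewise-constant spectral approximations), whereas this sketch assumes fixed, equal-width groups of contiguous bands and simply averages within each group. The function name `merge_bands` and the equal-width grouping are assumptions for illustration.

```python
import numpy as np

def merge_bands(X, n_groups):
    """Reduce spectral dimensionality by averaging contiguous narrow bands
    into fewer, wider bands (a piecewise-constant spectral approximation).

    Simplified sketch: group boundaries here are fixed and equally spaced,
    whereas a data-driven method would choose them adaptively.

    X: (n_pixels, n_bands) array of pixel spectra.
    n_groups: number of wider bands to form.
    """
    n_pixels, n_bands = X.shape
    # Split the band indices into contiguous, roughly equal-sized groups.
    edges = np.linspace(0, n_bands, n_groups + 1).astype(int)
    # Average the original narrow bands within each group.
    return np.stack(
        [X[:, a:b].mean(axis=1) for a, b in zip(edges[:-1], edges[1:])],
        axis=1,
    )

# Example: 6 pixels with 200 narrow bands reduced to 10 wide bands.
X = np.random.default_rng(0).normal(size=(6, 200))
Z = merge_bands(X, 10)
print(Z.shape)  # (6, 10)
```

Unlike a PCA projection, each new feature is a simple average of neighboring wavelengths, so the transform encodes the spectral ordering of the bands rather than directions estimated from limited training data.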
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Jensen, A.C., Loog, M. (2011). Forming Different-Complexity Covariance-Model Subspaces through Piecewise-Constant Spectra for Hyperspectral Image Classification. In: Heyden, A., Kahl, F. (eds) Image Analysis. SCIA 2011. Lecture Notes in Computer Science, vol 6688. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21227-7_18
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-21226-0
Online ISBN: 978-3-642-21227-7