
Land Use Classification via Multispectral Information

  • Cem Ünsalan
  • Kim L. Boyer
Part of the Advances in Computer Vision and Pattern Recognition book series (ACVPR)

Abstract

In the previous chapter, our land use classification approach was based on the organization of straight lines (structure) in panchromatic images. It is well known that multispectral data also carry a great deal of information for land use classification. This chapter describes an approach to combining structural information, obtained from 1 m panchromatic Ikonos images, with spectral information, obtained from the corresponding 4 m multispectral images, with application to identifying areas of significant land development. Several contributions in the literature combine spatial and spectral features for land use classification and related problems; however, none to date uses the line support region structural feature, as we do. Finally, we introduce additional spatial information, covering a broader area than the structure captured in the line support regions, by means of probabilistic relaxation. Although relaxation improves classification slightly, the improvement comes at substantial computational cost, so we recommend this approach only for applications in which the improvement is absolutely necessary.
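The sketch below is a minimal illustration, in Python with NumPy, of the two ingredients named above: a normalized-difference vegetation index as an example spectral feature, and one Rosenfeld-Hummel-Zucker style probabilistic relaxation update over per-pixel label probabilities. The array layout, the 4-neighbour averaging, and the compatibility matrix r are illustrative assumptions, not details taken from the chapter.

    # Minimal sketch, not the authors' implementation: an NDVI-style
    # vegetation index and one probabilistic relaxation labeling step.
    import numpy as np

    def ndvi(nir, red, eps=1e-6):
        """Normalized difference vegetation index from NIR and red bands."""
        return (nir - red) / (nir + red + eps)

    def relaxation_step(p, r):
        """One relaxation update of per-pixel label probabilities.

        p : (H, W, L) array of label probabilities at each pixel.
        r : (L, L) compatibility matrix between neighbouring labels,
            assumed scaled so the support term stays above -1.
        """
        # Average the label probabilities of the 4-connected neighbours
        # (wrap-around borders, for brevity).
        q_nbr = np.zeros_like(p)
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            q_nbr += np.roll(np.roll(p, dy, axis=0), dx, axis=1)
        q_nbr /= 4.0

        # Support q_i(l) = sum_k r[l, k] * mean neighbour probability of label k.
        q = q_nbr @ r.T

        # Multiplicative update followed by renormalisation.
        p_new = p * (1.0 + q)
        return p_new / p_new.sum(axis=2, keepdims=True)

Iterating relaxation_step a few times propagates spatial context beyond the immediate neighbourhood, which reflects the abstract's point that relaxation adds broader spatial information at extra computational cost.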

Keywords

Classification performance · Vegetation index · Urban region · Near neighbor · Label probability


Copyright information

© Springer-Verlag London Limited 2011

Authors and Affiliations

  1. Electrical and Electronics Engineering, Yeditepe University, Kayisdagi, Turkey
  2. Department of Electrical, Computer & Systems Engineering, Rensselaer Polytechnic Institute, Troy, USA
