Boosted Classifiers for Antitank Mine Detection in C-Scans from Ground-Penetrating Radar


Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 342))

Abstract

We investigate the problem of automatic antitank mine detection. Detection is performed on 3D images, so-called C-scans, generated by a GPR (Ground-Penetrating Radar) system of our own construction. In this paper we focus on boosting as a machine learning approach well suited to large-scale data such as GPR data. We compare five variants of weak learners with real-valued responses, all trained by the same boosting scheme. Three of the variants are single-feature learners that differ in the way they approximate class-conditional distributions. The two remaining variants are shallow decision trees with four and eight terminal nodes, respectively, which introduce joint-feature conditionals.
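To make the setup above concrete, here is a minimal, generic sketch of a single-feature weak learner with real-valued responses trained under exponential reweighting. It is not the authors' implementation; the quantile binning, the smoothing constant eps, the number of rounds, and the synthetic data are illustrative assumptions only.

```python
import numpy as np

def fit_binned_weak_learner(x, y, w, n_bins=8, eps=1e-3):
    """Single-feature, real-valued weak learner: per-bin half log-odds responses."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    responses = np.zeros(n_bins)
    for b in range(n_bins):
        w_pos = w[(idx == b) & (y == 1)].sum()   # weighted mass of positives in bin b
        w_neg = w[(idx == b) & (y == -1)].sum()  # weighted mass of negatives in bin b
        responses[b] = 0.5 * np.log((w_pos + eps) / (w_neg + eps))
    return edges, responses

def predict_weak_learner(x, edges, responses):
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, len(responses) - 1)
    return responses[idx]

# Toy boosting loop with exponential reweighting (synthetic data, illustrative only).
rng = np.random.default_rng(0)
m = 1000
X = rng.normal(size=(m, 5))
y = np.where(X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=m) > 0, 1, -1)
w = np.full(m, 1.0 / m)
F = np.zeros(m)
for t in range(20):
    # choose the single feature whose learner minimizes the weighted exponential criterion
    best = None
    for j in range(X.shape[1]):
        edges, resp = fit_binned_weak_learner(X[:, j], y, w)
        f = predict_weak_learner(X[:, j], edges, resp)
        z = np.sum(w * np.exp(-y * f))
        if best is None or z < best[0]:
            best = (z, f)
    F += best[1]
    w *= np.exp(-y * best[1])
    w /= w.sum()
print("training accuracy:", np.mean(np.sign(F) == y))
```

The per-bin response equals half the weighted log-odds, which ties directly to the logit minimizer discussed in Appendix A.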

This project has been partially financed by the Polish Ministry of Science and Higher Education; agreement no. 0091/R/TOO/2010/12 for R&D project no. 0 R00 0091 12, dated 30.11.2010, carried out by a consortium of the Military Institute of Armament Technology in Zielonka and Autocomp Management in Szczecin, Poland.


Notes

  1. Area under the Receiver Operating Characteristic curve.

  2. Yet, the authors in [14, 15] consider quite small sets of about \(20\) features.

  3. At the time of writing this manuscript, we have submitted a paper to the IEEE Trans. on Geoscience and Remote Sensing journal, proposing a technique for fast calculation of the moments (11) for successive windows via multiple integral images (the basic integral-image idea is sketched after these notes).

  4. Even about \(100\) windows for a dense image traversal, e.g., with \(1\)-pixel shifts.
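As a side illustration of the integral-image idea mentioned in note 3, the sketch below builds a 3D summed-volume table so that the sum over any cuboid window of a C-scan can be read out with eight lookups. This covers only plain window sums, not the moment computation (11) of the cited manuscript, and the array layout is an assumption made for the example.

```python
import numpy as np

def integral_volume(c_scan):
    """Cumulative 3D sum (summed-volume table), zero-padded on the leading sides."""
    ii = c_scan.cumsum(axis=0).cumsum(axis=1).cumsum(axis=2)
    return np.pad(ii, ((1, 0), (1, 0), (1, 0)))  # ii[x, y, z] = sum over c_scan[:x, :y, :z]

def window_sum(ii, x0, x1, y0, y1, z0, z1):
    """Sum of c_scan[x0:x1, y0:y1, z0:z1] via 3D inclusion-exclusion (8 lookups)."""
    return (ii[x1, y1, z1] - ii[x0, y1, z1] - ii[x1, y0, z1] - ii[x1, y1, z0]
            + ii[x0, y0, z1] + ii[x0, y1, z0] + ii[x1, y0, z0] - ii[x0, y0, z0])

# Quick check against a direct sum on random data.
rng = np.random.default_rng(1)
vol = rng.normal(size=(40, 30, 20))
ii = integral_volume(vol)
assert np.isclose(window_sum(ii, 5, 17, 3, 12, 2, 9), vol[5:17, 3:12, 2:9].sum())
```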

References

  1. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth & Brooks, Monterey (1984)

  2. Cremer, F., et al.: Feature level fusion of polarimetric infrared and GPR data for landmine detection. In: Proceedings of EUDEM2-SCOT 2003, International Conference on Requirements and Technologies for the Detection, Removal and Neutralization of Landmines and UXO, vol. 2, pp. 638–642 (2003)

  3. Friedman, J., Hastie, T., Tibshirani, R.: Additive logistic regression: a statistical view of boosting. Ann. Stat. 28(2), 337–407 (2000)

  4. Frigui, H., et al.: Context-dependent multisensor fusion and its application to land mine detection. IEEE Trans. Geosci. Remote Sens. 48(6), 2528–2543 (2010)

  5. Hamdi, A., Missaoui, O., Frigui, H.: An SVM classifier with HMM-based kernel for landmine detection using ground penetrating radar. In: IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 4196–4199 (2010)

  6. Horng, M.H.: Texture feature coding method for texture classification. Opt. Eng. 42(1), 228–238 (2002)

  7. Jol, H.M.: Ground Penetrating Radar: Theory and Applications. Elsevier, Oxford (2009)

  8. Klęsk, P., Godziuk, A., Kapruziak, M., Olech, B.: Landmine detection in 3D images from ground penetrating radar using Haar-like features. In: ICAISC 2013, Zakopane. Lecture Notes in Artificial Intelligence, vol. 7894, pp. 559–567. Springer, Berlin (2013)

  9. Ligthart, E.E., Yarovoy, A.G., Roth, F., Ligthart, L.P.: Landmine detection in high resolution 3D GPR images. In: MIKON’04, pp. 423–426 (2004)

  10. Mease, D., et al.: Boosted classification trees and class probability/quantile estimation. J. Mach. Learn. Res. 8, 409–439 (2007)

  11. Missaoui, O., Frigui, H., Gader, P.: Land-mine detection with ground-penetrating radar using multistream discrete hidden Markov models. IEEE Trans. Geosci. Remote Sens. 49(6), 2080–2099 (2011)

  12. Rasolzadeh, B., et al.: Response binning: improved weak classifiers for boosting. In: IEEE Intelligent Vehicles Symposium, pp. 344–349 (2006)

  13. Schapire, R.E.: The strength of weak learnability. Mach. Learn. 5, 197–227 (1990)

  14. Shi, Y., et al.: Landmine detection using boosting classifiers with adaptive feature selection. In: IEEE 6th International Workshop on Advanced Ground Penetrating Radar (IWAGPR), pp. 1–5 (2011)

  15. Sun, Y., Li, J.: Adaptive learning approach to landmine detection. IEEE Trans. Aerosp. Electron. Syst. 41(3), 973–985 (2005)

  16. Torrione, P., Collins, L.M.: Texture features for antitank landmine detection using ground penetrating radar. IEEE Trans. Geosci. Remote Sens. 45(7), 2374–2382 (2007)

  17. Viola, P., Jones, M.: Rapid object detection using a boosted cascade of simple features. In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR’2001), pp. 511–518 (2001)

  18. Yarovoy, A.: Landmine and unexploded ordnance detection and classification with ground penetrating radar. In: Jol, H.M. (ed.) Ground Penetrating Radar: Theory and Applications, pp. 445–478. Elsevier, Oxford (2009)

  19. Yarovoy, A.G., Kovalenko, V., Fogar, L.P.: Impact of ground clutter on buried object detection by ground penetrating radar. In: International Geoscience and Remote Sensing Symposium, pp. 755–777 (2003)


Author information

Correspondence to Przemysław Klęsk.

Appendix A: Notes on Boosting’s Connection to Logistic Regression

Following [3], we point out some important properties of boosting in the context of its connection to logistic regression and additive modeling.

Think of the unknown joint probability distribution (population) from which the data is drawn. Let \(p(\mathbf {x},y)\) denote its density function, which can also be expressed as \(p(\mathbf {x},y)=P(y|\mathbf {x})p(\mathbf {x})\). Now, consider the exponential criterion for some classifier function \(F\) defined as an expectation taken with respect to \(p\):

$$\begin{aligned} Q_p(F)=\mathbb {E}_p\bigl (e^{-yF(\mathbf {x})}\bigr )&= \int _\mathbf {x}\sum _{y\in \{-1,1\}}e^{-yF(\mathbf {x})}p(\mathbf {x},y)\,\mathbf {dx}\nonumber \\&=\int _\mathbf {x}\Bigl (P(y{=}{-1}|\mathbf {x})e^{F(\mathbf {x})}{+}P(y{=}1|\mathbf {x})e^{-F(\mathbf {x})}\Bigr )p(\mathbf {x})\,\mathbf {dx}. \end{aligned}$$
(12)

To minimize \(Q_p\) we demand that \(\partial Q_p(F)/\partial F=0\) (in fact, it suffices to minimize the inner conditional expectation) and obtain the minimizer \(F^*(\mathbf {x}) = 1/2 \log \left( P(y{=}1|\mathbf {x})/P(y{=}{-1}|\mathbf {x})\right) \), which is half the logit transform typical for logistic regression. If an algorithm could somehow immediately find the perfect function \(F^*\), the boosting procedure could be stopped after just one round. In practice, weak learners are rough approximations of \(F^*\), and multiple rounds are needed. Solving \(F^*\) for the probability leads to a sigmoid form:

$$\begin{aligned} P(y=1|\mathbf {x})=e^{2F^*(\mathbf {x})}\Big /\left( 1+e^{2F^*(\mathbf {x})}\right) =1\Big /\left( 1+e^{-2F^*(\mathbf {x})}\right) , \end{aligned}$$
(13)

and again the similarity to logistic regression can be seen. There are two minor differences: the factor of \(2\) in the exponent, and the fact that in traditional logistic regression one approximates \(F^*\) by a linear model \(F^*(\mathbf {x})\approx a_0+a_1 x_1 + \cdots + a_n x_n\), whereas in boosting it is approximated by a linear combination of weak learners, i.e., \(F^*(\mathbf {x})\approx \sum _k f_k(\mathbf {x})\), that is, by arbitrary but simple functions, each possibly depending on all variables.
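A quick numerical sanity check of (12) and (13): for a fixed \(\mathbf {x}\), the inner conditional expectation \(P(y{=}{-1}|\mathbf {x})e^{F}+P(y{=}1|\mathbf {x})e^{-F}\) is minimized at the half logit, and the sigmoid (13) recovers \(P(y{=}1|\mathbf {x})\) from it. The sketch below is only an illustration; the value of \(P(y{=}1|\mathbf {x})\) and the grid of candidate \(F\) values are assumptions.

```python
import numpy as np

p1 = 0.73                                   # assumed P(y=1|x) for a fixed x
F = np.linspace(-3, 3, 600001)              # dense grid of candidate F(x) values
q = (1 - p1) * np.exp(F) + p1 * np.exp(-F)  # inner conditional expectation of exp(-y F(x))

F_star = 0.5 * np.log(p1 / (1 - p1))        # claimed minimizer: half the logit
print(F[np.argmin(q)], F_star)              # both approx 0.4973
print(1 / (1 + np.exp(-2 * F_star)))        # sigmoid (13) recovers P(y=1|x) = 0.73
```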

Think of the error-residuals technique known from regression modeling in general. In an additive model built sequentially, each successive piece of the approximation “explains” some part of the target quantity, and that part is subtracted from the target so that future pieces can focus on the residuals. Boosting’s reweighting scheme proceeds in a similar manner. Suppose we have fixed a partial model \(F\) and would like to update it, i.e., to have \(F:=F+f\). Recall the data-based reweighting formulas (2). Let us define their population-based counterparts:

$$\begin{aligned} Z=\int _\mathbf {x}\sum _{y\in \{-1,1\}}e^{-yF(\mathbf {x})}p(\mathbf {x},y)\,\mathbf {dx},\qquad w(\mathbf {x},y)=e^{-y F(\mathbf {x})} p(\mathbf {x},y) / Z. \end{aligned}$$
(14)

Note that \(Z\) works as a normalizer, but at the same time it is the value of our exponential criterion for the model so far, i.e., \(Q_p(F)\). Now, consider the criterion at \(F+f\):

$$\begin{aligned} Q_p(F{+}f)&=\int _\mathbf {x}\sum _{y\in \{-1,1\}}e^{-y\left( F(\mathbf {x})+f(\mathbf {x})\right) }p(\mathbf {x},y)\,\mathbf {dx}\nonumber \\&=\int _\mathbf {x}\sum _{y\in \{-1,1\}}e^{-y f(\mathbf {x})}\underbrace{e^{-y F(\mathbf {x})}p(\mathbf {x},y)/Z}_{w(\mathbf {x},y)}\,\mathbf {dx}\cdot Z=Q_w(f)\cdot Q_p(F). \end{aligned}$$
(15)

Therefore, to minimize \(Q_p(F+f)\) it suffices to greedily minimize \(Q_w(f)\). The distribution \(w\) indicates which “parts” of the target quantity are already explained.
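The factorization (15) can also be checked numerically on a toy discrete population: for arbitrary \(F\) and \(f\), the criterion at \(F+f\) equals \(Q_w(f)\cdot Q_p(F)\), with \(w\) the reweighted distribution (14). The sketch below uses an assumed small discrete \(p(\mathbf {x},y)\) purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy discrete "population": 6 points x, labels y in {-1, +1}, joint probabilities p(x, y).
n = 6
p = rng.random((n, 2)); p /= p.sum()        # columns correspond to y = -1 and y = +1
y = np.array([-1.0, 1.0])

F = rng.normal(size=n)                      # current additive model, evaluated at each x
f = rng.normal(size=n)                      # candidate weak learner, evaluated at each x

def Q(dist, G):
    """Exponential criterion E_dist[exp(-y G(x))] over a discrete joint distribution."""
    return np.sum(dist * np.exp(-G[:, None] * y[None, :]))

Z = Q(p, F)                                 # normalizer, equal to Q_p(F) as in (14)
w = p * np.exp(-F[:, None] * y[None, :]) / Z  # reweighted distribution w(x, y)

print(Q(p, F + f))                          # Q_p(F + f)
print(Q(w, f) * Q(p, F))                    # Q_w(f) * Q_p(F): identical, as in (15)
```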


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Klęsk, P., Kapruziak, M., Olech, B. (2015). Boosted Classifiers for Antitank Mine Detection in C-Scans from Ground-Penetrating Radar. In: Wiliński, A., Fray, I., Pejaś, J. (eds) Soft Computing in Computer and Information Science. Advances in Intelligent Systems and Computing, vol 342. Springer, Cham. https://doi.org/10.1007/978-3-319-15147-2_2


  • DOI: https://doi.org/10.1007/978-3-319-15147-2_2


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-15146-5

  • Online ISBN: 978-3-319-15147-2

  • eBook Packages: Engineering (R0)
