Abstract
We investigate the problem of automatic antitank mine detection. Detection is performed on 3D images, so-called C-scans, generated by a ground-penetrating radar (GPR) system of our own construction. In this paper we focus on boosting, a machine learning approach well suited to large-scale data such as GPR scans. We compare five variants of weak learners with real-valued responses, all trained by the same boosting scheme. Three of the variants are single-feature learners that differ in how they approximate class-conditional distributions. The remaining two variants are shallow decision trees with four and eight terminal nodes, respectively, which introduce joint-feature conditionals.
This project has been partially financed by the Polish Ministry of Science and Higher Education: agreement no. 0091/R/TOO/2010/12 for R&D project no. 0 R00 0091 12, dated 30.11.2010, carried out by a consortium of the Military Institute of Armament Technology in Zielonka and Autocomp Management in Szczecin, Poland.
Notes
- 1. Area under the Receiver Operating Characteristic curve.
- 2.
- 3. At the time of writing this manuscript, we had submitted a paper to the IEEE Transactions on Geoscience and Remote Sensing journal, proposing a technique for fast calculation of moments (11) for successive windows via multiple integral images.
- 4. Even about \(100\) windows for a dense image traversal, e.g., with \(1\)-pixel shifts.
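Note 3 above mentions fast computation of window moments via multiple integral images. For background only, the following pure-Python sketch illustrates the classical 2D integral-image (summed-area table) idea of Viola and Jones [33]; it is not the authors' 3D moment technique, and the function names are illustrative.

```python
def integral_image(img):
    # ii has an extra zero row and column so that window sums need no
    # boundary checks; ii[r][c] equals the sum of img[0:r][0:c].
    rows, cols = len(img), len(img[0])
    ii = [[0.0] * (cols + 1) for _ in range(rows + 1)]
    for r in range(rows):
        row_sum = 0.0
        for c in range(cols):
            row_sum += img[r][c]
            ii[r + 1][c + 1] = ii[r][c + 1] + row_sum
    return ii

def window_sum(ii, r0, c0, r1, c1):
    # Sum over the window img[r0:r1][c0:c1] in O(1) via four lookups.
    return ii[r1][c1] - ii[r0][c1] - ii[r1][c0] + ii[r0][c0]

# Toy 4x4 image with values 0..15.
img = [[float(4 * r + c) for c in range(4)] for r in range(4)]
ii = integral_image(img)
assert window_sum(ii, 0, 0, 4, 4) == 120.0   # whole image
assert window_sum(ii, 1, 1, 3, 3) == 30.0    # 5 + 6 + 9 + 10
```

Once the table is built, every subsequent window query costs four lookups regardless of window size, which is what makes dense traversals affordable.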
References
Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth & Brooks, Monterey (1984)
Cremer, F., et al.: Feature level fusion of polarimetric infrared and GPR data for landmine detection. In: Proceedings of EUDEM2-SCOT 2003, International Conference on Requirements and Technologies for the Detection, Removal and Neutralization of Landmines and UXO, vol. 2, pp. 638–642 (2003)
Friedman, J., Hastie, T., Tibshirani, R.: Additive logistic regression: a statistical view of boosting. Ann. Stat. 28(2), 337–407 (2000)
Frigui, H., et al.: Context-dependent multisensor fusion and its application to land mine detection. IEEE Trans. Geosci. Remote Sens. 48(6), 2528–2543 (2010)
Hamdi, A., Missaoui, O., Frigui, H.: An SVM classifier with HMM-based kernel for landmine detection using ground penetrating radar. In: IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 4196–4199 (2010)
Horng, M.H.: Texture feature coding method for texture classification. Opt. Eng. 42(1), 228–238 (2002)
Jol, H.M.: Ground Penetrating Radar: Theory and Applications. Elsevier, Oxford (2009)
Klęsk, P., Godziuk, A., Kapruziak, M., Olech, B.: Landmine detection in 3D images from ground penetrating radar using Haar-like features. In: ICAISC 2013, Zakopane. Lecture Notes in Artificial Intelligence, vol. 7894, pp. 559–567. Springer, Berlin (2013)
Ligthart, E.E., Yarovoy, A.G., Roth, F., Ligthart, L.P.: Landmine detection in high resolution 3D GPR images. In: MIKON’04, pp. 423–426 (2004)
Mease, D., et al.: Boosted classification trees and class probability/quantile estimation. J. Mach. Learn. Res. 8, 409–439 (2007)
Missaoui, O., Frigui, H., Gader, P.: Land-mine detection with ground-penetrating radar using multistream discrete hidden Markov models. IEEE Trans. Geosci. Remote Sens. 49(6), 2080–2099 (2011)
Rasolzadeh, B., et al.: Response binning: improved weak classifiers for boosting. In: IEEE Intelligent Vehicles Symposium, pp. 344–349 (2006)
Schapire, R.E.: The strength of weak learnability. Mach. Learn. 5, 197–227 (1990)
Shi, Y., et al.: Landmine detection using boosting classifiers with adaptive feature selection. In: IEEE 6th International Workshop on Advanced Ground Penetrating Radar (IWAGPR), pp. 1–5 (2011)
Sun, Y., Li, J.: Adaptive learning approach to landmine detection. IEEE Trans. Aerosp. Electron. Syst. 41(3), 973–985 (2005)
Torrione, P., Collins, L.M.: Texture features for Antitank landmine detection using ground penetrating radar. IEEE Trans. Geosci. Remote Sens. 45(7), 2374–2382 (2007)
Viola, P., Jones, M.: Rapid object detection using a boosted cascade of simple features. In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR’2001), pp. 511–518 (2001)
Yarovoy, A.: Landmine and unexploded ordnance detection and classification with ground penetrating radar. In: Jol, H.M. (ed.) Ground Penetrating Radar: Theory and Applications, pp. 445–478. Elsevier, Oxford (2009)
Yarovoy, A.G., Kovalenko, V., Fogar, L.P.: Impact of ground clutter on buried object detection by ground penetrating radar. In: International Geoscience and Remote Sensing Symposium, pp. 755–777 (2003)
Appendix A: Notes on Boosting’s Connection to Logistic Regression
Following [3], we point out some important properties of boosting in the context of its connection to logistic regression and additive modeling.
Think of the unknown joint probability distribution (the population) from which the data are drawn. Let \(p(\mathbf {x},y)\) denote its density function, which can also be expressed as \(p(\mathbf {x},y)=P(y|\mathbf {x})p(\mathbf {x})\). Now, consider the exponential criterion for some classifier function \(F\), defined as an expectation taken with respect to \(p\):
\[
Q_p(F) = \mathbb{E}_p\!\left[e^{-yF(\mathbf{x})}\right] = \mathbb{E}_{\mathbf{x}}\!\left[P(y{=}1|\mathbf{x})\,e^{-F(\mathbf{x})} + P(y{=}{-1}|\mathbf{x})\,e^{F(\mathbf{x})}\right].
\]
To minimize \(Q_p\) we require \(\partial Q_p(F)/\partial F=0\) (in fact it suffices to minimize the inner conditional expectation) and obtain the formula for the minimizer \(F^*(\mathbf {x}) = \frac{1}{2} \log \left( P(y{=}1|\mathbf {x})/P(y{=}{-1}|\mathbf {x})\right)\), which is half the logit transform, typical for logistic regression. If an algorithm could somehow immediately find the perfect function \(F^*\), the boosting procedure could be stopped after just one round. In practice, weak learners are rough approximations of \(F^*\) and multiple rounds are needed. Solving \(F^*\) for the probability leads to a form of sigmoid:
\[
P(y{=}1|\mathbf{x}) = \frac{1}{1 + e^{-2F^*(\mathbf{x})}},
\]
and again the similarity to logistic regression can be seen. There are two minor differences: the factor of \(2\) in the exponent, and the fact that in traditional logistic regression one approximates \(F^*\) by a linear model \(F^*(\mathbf {x})\approx a_0+a_1 x_1 + \cdots + a_n x_n\), whereas in boosting it is a linear combination of weak learners, i.e., \(F^*(\mathbf {x})\approx \sum _k f_k(\mathbf {x})\), that is, arbitrary but simple functions, each possibly depending on all the variables.
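The optimality of half the logit transform can be checked numerically. The Python sketch below (illustrative only, not part of the original derivation) evaluates the pointwise criterion \(P(y{=}1|\mathbf{x})\,e^{-F} + P(y{=}{-1}|\mathbf{x})\,e^{F}\) for a fixed conditional probability and confirms that the closed-form minimizer beats nearby perturbations.

```python
import math

def exp_criterion(F, p1):
    # Pointwise exponential criterion E[e^{-yF} | x], where
    # P(y=1|x) = p1 and P(y=-1|x) = 1 - p1.
    return p1 * math.exp(-F) + (1.0 - p1) * math.exp(F)

def minimizer(p1):
    # Closed-form minimizer: half the logit transform.
    return 0.5 * math.log(p1 / (1.0 - p1))

p1 = 0.8
F_star = minimizer(p1)
# F_star achieves a lower criterion value than small perturbations of it.
assert exp_criterion(F_star, p1) < exp_criterion(F_star + 0.01, p1)
assert exp_criterion(F_star, p1) < exp_criterion(F_star - 0.01, p1)
```

At the minimizer the criterion equals \(2\sqrt{p_1(1-p_1)}\), e.g., \(0.8\) for \(p_1=0.8\), which the sketch can also be used to verify.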
Think of the error-residuals technique, known from regression modeling in general. In an additive model built sequentially, each successive piece of the approximation “explains” some part of the target quantity, and that part is subtracted from the target so that future pieces can focus on the residuals. Boosting’s reweighting scheme proceeds in a similar manner. Suppose we have fixed a partial model \(F\) and would like to update it, i.e., to have \(F:=F+f\). Recall the data-based reweighting formulas (2). Let us define their population-based counterparts:
\[
w(\mathbf{x},y) = \frac{e^{-yF(\mathbf{x})}\,p(\mathbf{x},y)}{Z}, \qquad Z = \mathbb{E}_p\!\left[e^{-yF(\mathbf{x})}\right].
\]
Note that \(Z\) works as a normalizer but simultaneously is the value of our exponential criterion for the model so far, \(Q_p(F)\). Now, consider the criterion at \(F+f\):
\[
Q_p(F+f) = \mathbb{E}_p\!\left[e^{-y\left(F(\mathbf{x})+f(\mathbf{x})\right)}\right]
= Z\,\mathbb{E}_w\!\left[e^{-yf(\mathbf{x})}\right]
= Q_p(F)\,Q_w(f).
\]
Therefore, to minimize \(Q_p(F+f)\) it suffices to greedily minimize \(Q_w(f)\). The distribution \(w\) downweights the “parts” of the target quantity that are already explained by \(F\), so that the next weak learner focuses on what remains.
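The factorization of the criterion under reweighting can be verified on a toy sample. The Python sketch below (illustrative only; all sample values are made up) computes the empirical exponential criterion directly at \(F+f\) and checks that it equals the product of the criterion at \(F\) and the weighted criterion of the update \(f\).

```python
import math

# Toy sample: labels y_i in {-1, +1}, current model scores F(x_i),
# and a candidate update f(x_i); the feature vectors themselves
# are irrelevant for this identity.
ys = [1, -1, 1, 1, -1]
Fs = [0.3, -0.2, 0.5, -0.1, 0.4]
fs = [0.2, 0.1, -0.3, 0.4, -0.2]

n = len(ys)
# Empirical exponential criterion of the partial model, Q(F) = Z.
Z = sum(math.exp(-y * F) for y, F in zip(ys, Fs)) / n
# Boosting weights: w_i proportional to e^{-y_i F(x_i)}, summing to 1.
w = [math.exp(-y * F) / (n * Z) for y, F in zip(ys, Fs)]
# Weighted criterion of the update alone, Q_w(f).
Q_w_f = sum(wi * math.exp(-y * f) for wi, y, f in zip(w, ys, fs))
# Criterion of the updated model F + f, computed directly.
Q_Ff = sum(math.exp(-y * (F + f)) for y, F, f in zip(ys, Fs, fs)) / n
# The factorization Q(F + f) = Q(F) * Q_w(f) holds exactly.
assert abs(Q_Ff - Z * Q_w_f) < 1e-12
```

Because \(e^{-y(F+f)} = e^{-yF} e^{-yf}\), the identity holds for any choice of sample values, which is why greedily minimizing \(Q_w(f)\) at each round minimizes the overall criterion.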
Copyright information

© 2015 Springer International Publishing Switzerland

Cite this chapter

Klęsk, P., Kapruziak, M., Olech, B. (2015). Boosted Classifiers for Antitank Mine Detection in C-Scans from Ground-Penetrating Radar. In: Wiliński, A., Fray, I., Pejaś, J. (eds) Soft Computing in Computer and Information Science. Advances in Intelligent Systems and Computing, vol 342. Springer, Cham. https://doi.org/10.1007/978-3-319-15147-2_2

Print ISBN: 978-3-319-15146-5. Online ISBN: 978-3-319-15147-2.