Linearly and Quadratically Separable Classifiers Using Adaptive Approach

  • Mohamed Abdel-Kawy Mohamed Ali Soliman (email author)
  • Rasha M. Abo-Bakr


This paper presents a fast adaptive iterative algorithm for solving linearly separable classification problems in \( {R^n} \). In each iteration, a subset of the sample data (n points, where n is the number of features) is adaptively chosen, and a hyperplane is constructed that separates the chosen n points at a margin ϵ while best classifying the remaining points. The classification problem is formulated and the details of the algorithm are presented. The algorithm is then extended to solve quadratically separable classification problems; the basic idea is to map the physical space to a larger one in which the problem becomes linearly separable. Numerical illustrations show that a few iteration steps suffice for convergence when the classes are linearly separable. For nonlinearly separable data, given a specified maximum number of iteration steps, the algorithm returns the best hyperplane encountered during those steps, i.e., the one that minimizes the number of misclassified points. Comparisons with other machine learning algorithms on practical and benchmark datasets demonstrate the performance of the proposed algorithm.
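The paper's full algorithm is given in its body, not in this abstract; the core idea, however, can be illustrated with a minimal sketch. The code below uses a plain perceptron-style iterative update (a stand-in, not the authors' adaptive n-point selection) together with a degree-2 monomial feature map, showing how a quadratically separable problem becomes linearly separable in the lifted space and how the best hyperplane seen within an iteration budget is retained. All function names here are illustrative.

```python
def quadratic_features(x):
    """Map a point in R^n to the lifted space of all monomials of
    degree <= 2, where a quadratically separable problem is linear."""
    feats = list(x)
    feats += [x[i] * x[j] for i in range(len(x)) for j in range(i, len(x))]
    return feats

def train_iterative(points, labels, max_iter=1000):
    """Iteratively adjust a hyperplane (w, b) until every point is
    correctly classified or the iteration budget runs out; return the
    hyperplane with the fewest misclassifications seen (as the abstract
    describes for nonseparable data)."""
    dim = len(points[0])
    w, b = [0.0] * dim, 0.0
    best_w, best_b, best_errors = list(w), b, len(points) + 1
    for _ in range(max_iter):
        errors = 0
        for x, y in zip(points, labels):  # labels are +1 / -1
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                # Misclassified: nudge the hyperplane toward this point.
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                errors += 1
        if errors < best_errors:
            best_w, best_b, best_errors = list(w), b, errors
        if errors == 0:
            break
    return best_w, best_b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```

For example, points separated by the circle \( x^2 + y^2 = 2 \) are not linearly separable in \( R^2 \), but after `quadratic_features` lifts them to \( (x, y, x^2, xy, y^2) \) the hyperplane with weights \( (0, 0, 1, 0, 1) \) and bias \( -2 \) separates them exactly.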


Keywords: linear classification, quadratic classification, iterative approach, adaptive technique

Supplementary material

11390_2011_188_MOESM1_ESM.pdf (PDF, 79.9 KB)



Copyright information

© Springer Science+Business Media, LLC & Science Press, China 2011

Authors and Affiliations

  • Mohamed Abdel-Kawy Mohamed Ali Soliman (1, email author)
  • Rasha M. Abo-Bakr (2)
  1. Department of Computer and Systems Engineering, Faculty of Engineering, Zagazig University, Zagazig, Egypt
  2. Department of Mathematics, Faculty of Science, Zagazig University, Zagazig, Egypt
