
Abstract

The Support Vector Machine is one of the classical machine learning techniques that can still help solve big data classification problems. In particular, it can help multidomain applications in a big data environment. However, the support vector machine is mathematically complex and computationally expensive. The main objective of this chapter is to simplify this approach using process diagrams and data flow diagrams so that readers can understand the theory and implement it successfully. To achieve this objective, the chapter is divided into three parts: (1) modeling of a linear support vector machine; (2) modeling of a nonlinear support vector machine; and (3) the Lagrangian support vector machine algorithm and its implementations. The Lagrangian support vector machine is also implemented, with simple examples, using the R programming platform on Hadoop and non-Hadoop systems.
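As a quick illustration of the third part, the following base-R sketch implements the Lagrangian support vector machine (LSVM) iteration of Mangasarian and Musicant for a linear two-class problem. It is a minimal sketch, not the chapter's code: the names (lsvm, A, y, nu, alpha) and the toy data are illustrative assumptions, and no Hadoop integration is shown.

# Minimal Lagrangian SVM (LSVM) sketch in base R.
# Assumes labels y in {-1, +1}; A is an m x n matrix of training points.
lsvm <- function(A, y, nu = 1, tol = 1e-5, maxit = 100) {
  m <- nrow(A)
  D <- diag(y)                          # diagonal matrix of class labels
  H <- D %*% cbind(A, -1)               # H = D [A  -e]
  Q <- diag(m) / nu + H %*% t(H)        # Q = I/nu + H H'
  Qinv <- solve(Q)                      # fine for small m; use a low-rank update for large m
  alpha <- 1.9 / nu                     # step size; convergence requires 0 < alpha < 2/nu
  e <- rep(1, m)
  u <- Qinv %*% e                       # initial Lagrange multipliers
  for (i in seq_len(maxit)) {           # fixed-point iteration on the multipliers
    u_new <- Qinv %*% (e + pmax((Q %*% u - e) - alpha * u, 0))
    if (sum(abs(u_new - u)) < tol) { u <- u_new; break }
    u <- u_new
  }
  list(w = t(A) %*% D %*% u,            # normal of the separating plane
       gamma = -sum(D %*% u))           # plane offset: classify by sign(x'w - gamma)
}

# Toy usage on two linearly separable point clouds.
set.seed(1)
A <- rbind(matrix(rnorm(20, mean = 2), ncol = 2),
           matrix(rnorm(20, mean = -2), ncol = 2))
y <- c(rep(1, 10), rep(-1, 10))
fit <- lsvm(A, y)
predictions <- sign(A %*% fit$w - fit$gamma)

After the one-time setup of Q and its inverse, each iteration costs only matrix-vector products, which is what makes the Lagrangian formulation attractive for large data sets.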

Keywords

Support Vector Machine; Feature Space; Support Vector Machine Classifier; Data Domain; Linear Support Vector Machine


Acknowledgements

I would like to thank Professor Vaithilingam (Jeya) Jeyakumar of the University of New South Wales, Australia, for giving me the opportunity to work with him and his research team on support vector machine problems and their implementation in different applications. I also participated in research focused on enhancing the support vector machine technique, and we published our theory, results, and findings. That research contributed to this chapter.


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Shan Suthaharan, Department of Computer Science, UNC Greensboro, Greensboro, USA
