
Using Locally Weighted Learning to Improve SMOreg for Regression

Conference paper
PRICAI 2006: Trends in Artificial Intelligence (PRICAI 2006)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4099)

Abstract

Shevade et al. [1] successfully extended several improvements to Smola and Scholkopf's SMO algorithm [2] for solving regression problems; the resulting algorithm is known simply as SMOreg. In this paper, we use SMOreg in exactly the same way that linear regression (LR) is used in locally weighted linear regression (LWLR) [5]: a local SMOreg is fit to a subset of the training instances in the neighborhood of the test instance whose target function value is to be predicted. The training instances in this neighborhood are weighted, with less weight assigned to instances further from the test instance. A regression prediction is then obtained from SMOreg by taking the attribute values of the test instance as input. We call our improved algorithm locally weighted SMOreg, or LWSMOreg for short. We conduct an extensive empirical comparison of the related algorithms in two groups, in terms of relative mean absolute error, using all 36 regression data sets obtained from various sources and recommended by Weka [3]. In the first group, we compare SMOreg [1] with naive Bayes (NB) [4], k-nearest-neighbor with distance weighting (KNNDW) [5], and LR. In the second group, we compare LWSMOreg with SMOreg, LR, and LWLR. Our experimental results show that SMOreg performs well on regression problems and that LWSMOreg significantly outperforms all the other algorithms in the comparison.

This work was supported by the Excellent Youth Foundation of China University of Geosciences (No. CUGQNL0505).
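To make the procedure concrete, the following is a minimal sketch in Python of the locally weighted scheme described in the abstract. It substitutes scikit-learn's SVR for Weka's SMOreg solver (both train an epsilon-SVR, but this is an illustration rather than the authors' implementation), and the neighborhood size k and the linear distance-weighting function are assumed choices for illustration, not parameters taken from the paper.

import numpy as np
from sklearn.svm import SVR
from sklearn.neighbors import NearestNeighbors

def lw_smoreg_predict(X_train, y_train, x_test, k=30):
    """Predict the target value for one test instance with a locally
    weighted SVR, in the spirit of LWSMOreg.

    A local model is fit only to the k training instances nearest the
    test instance, with less weight given to instances further away.
    Both k and the linear weighting scheme below are illustrative
    assumptions, not the paper's settings.
    """
    # Find the k nearest training instances to the test instance.
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    dist, idx = nn.kneighbors(x_test.reshape(1, -1))
    dist, idx = dist[0], idx[0]

    # Linear decay: weight ~1 at the test point, ~0 at the k-th neighbor.
    weights = 1.0 - dist / (dist.max() + 1e-12)

    # Fit an epsilon-SVR (stand-in for SMOreg) on the weighted neighborhood.
    local_model = SVR(kernel="linear", C=1.0, epsilon=0.1)
    local_model.fit(X_train[idx], y_train[idx], sample_weight=weights)

    # Predict from the attribute values of the test instance.
    return local_model.predict(x_test.reshape(1, -1))[0]

# Example usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
print(lw_smoreg_predict(X, y, np.array([1.5])))

Because a separate local model is trained per query, prediction cost grows with the number of test instances; this trade of training time for locality is inherent to locally weighted learning.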


References

  1. Shevade, S.K., Keerthi, S.S., Bhattacharyya, C., Murthy, K.R.K.: Improvements to SMO Algorithm for SVM Regression. Technical Report CD-99-16, Control Division, Dept. of Mechanical and Production Engineering, National University of Singapore, Singapore (1999)

  2. Smola, A.J., Scholkopf, B.: A Tutorial on Support Vector Regression. NeuroCOLT2 Technical Report Series NC2-TR-1998-030, ESPRIT Working Group on Neural and Computational Learning Theory (NeuroCOLT 2) (1998)

  3. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. Morgan Kaufmann, San Francisco (2000), http://prdownloads.sourceforge.net/weka/datasets-numeric.jar

  4. Frank, E., Trigg, L., Holmes, G., Witten, I.H.: Naive Bayes for Regression. Machine Learning 41, 5–15 (2000)

  5. Mitchell, T.M.: Instance-Based Learning. In: Machine Learning, ch. 8. McGraw-Hill, New York (1997)

  6. Platt, J.: Fast Training of Support Vector Machines Using Sequential Minimal Optimization. In: Scholkopf, B., Burges, C.J.C., Smola, A.J. (eds.) Advances in Kernel Methods: Support Vector Learning, pp. 185–208. MIT Press, Cambridge (1999)

  7. Keerthi, S.S., Shevade, S.K., Bhattacharyya, C., Murthy, K.R.K.: Improvements to Platt's SMO Algorithm for SVM Classifier Design. Technical Report CD-99-14, Dept. of Mechanical and Production Engineering, National University of Singapore, Singapore (1999)

  8. Atkeson, C.G., Moore, A.W., Schaal, S.: Locally Weighted Learning. Artificial Intelligence Review 11(1–5), 11–73 (1997)

  9. Frank, E., Hall, M., Pfahringer, B.: Locally Weighted Naive Bayes. In: Proceedings of the Conference on Uncertainty in Artificial Intelligence, pp. 249–256. Morgan Kaufmann, San Francisco (2003)

  10. Nadeau, C., Bengio, Y.: Inference for the Generalization Error. Advances in Neural Information Processing Systems 12, 307–313 (1999)


Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, C., Jiang, L. (2006). Using Locally Weighted Learning to Improve SMOreg for Regression. In: Yang, Q., Webb, G. (eds) PRICAI 2006: Trends in Artificial Intelligence. PRICAI 2006. Lecture Notes in Computer Science, vol 4099. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-36668-3_41

  • DOI: https://doi.org/10.1007/978-3-540-36668-3_41

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-36667-6

  • Online ISBN: 978-3-540-36668-3

  • eBook Packages: Computer Science (R0)
