Saliency Analysis of Support Vector Machines for Feature Selection in Financial Time Series Forecasting

Chapter in: Computational Intelligence in Economics and Finance

Part of the book series: Advanced Information Processing (AIP)

Abstract

This chapter deals with the application of saliency analysis to Support Vector Machines (SVMs) for feature selection. The importance of each feature is ranked by evaluating the sensitivity of the network output to that feature's input in terms of the partial derivative. A systematic approach to removing irrelevant features based on this sensitivity is developed. Two simulated non-linear time series and five real financial time series are examined in the experiments. The simulation results show that saliency analysis is effective in SVMs for identifying important features.
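The ranking described above can be made concrete for a regression SVM with an RBF kernel, where the partial derivative of the output with respect to each input has a closed form. The sketch below is illustrative only, assuming scikit-learn's SVR, synthetic data in which only the first two inputs are relevant, and an arbitrary pruning threshold; it is not the chapter's exact procedure. Features whose mean absolute derivative over the sample is small contribute little to the fitted function and are candidates for removal before retraining.

    # A minimal sketch of partial-derivative saliency analysis for an RBF-kernel SVR.
    # The synthetic data and the pruning threshold are illustrative assumptions.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)

    # Synthetic regression problem: only the first two inputs matter.
    X = rng.normal(size=(300, 5))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=300)

    gamma = 0.1  # fixed so the kernel derivative below uses a known value
    model = SVR(kernel="rbf", C=10.0, gamma=gamma).fit(X, y)

    alpha = model.dual_coef_.ravel()   # signed dual coefficients, one per support vector
    sv = model.support_vectors_        # support vectors, shape (n_SV, n_features)

    def saliency(X_eval):
        """Mean absolute partial derivative of the SVR output w.r.t. each input.

        For f(x) = sum_i alpha_i * exp(-gamma * ||x - x_i||^2) + b the derivative is
        df/dx_j = sum_i alpha_i * 2 * gamma * (x_{i,j} - x_j) * exp(-gamma * ||x - x_i||^2).
        """
        diffs = sv[None, :, :] - X_eval[:, None, :]        # (n_eval, n_SV, n_feat)
        k = np.exp(-gamma * np.sum(diffs ** 2, axis=2))    # (n_eval, n_SV)
        grads = 2.0 * gamma * np.einsum("s,ns,nsj->nj", alpha, k, diffs)
        return np.abs(grads).mean(axis=0)                  # (n_feat,)

    s = saliency(X)
    ranking = np.argsort(s)[::-1]
    print("saliency per feature:", np.round(s, 4))
    print("features ranked by importance:", ranking)

    # One pass of the pruning idea: drop features whose saliency falls well below
    # the most salient one, then refit on the reduced input set.
    keep = s > 0.1 * s.max()   # illustrative threshold
    model_reduced = SVR(kernel="rbf", C=10.0, gamma=gamma).fit(X[:, keep], y)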



Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Cao, L., Tay, F.E.H. (2004). Saliency Analysis of Support Vector Machines for Feature Selection in Financial Time Series Forecasting. In: Chen, SH., Wang, P.P. (eds) Computational Intelligence in Economics and Finance. Advanced Information Processing. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-06373-6_7

  • DOI: https://doi.org/10.1007/978-3-662-06373-6_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-07902-3

  • Online ISBN: 978-3-662-06373-6
