
Applying REC Analysis to Ensembles of Sigma-Point Kalman Filters

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 4132))

Abstract

Sigma-Point Kalman Filters (SPKF) are a family of filters that achieve very good performance when applied to time series. Most current research on time-series forecasting uses Sigma-Point Kalman Filters; however, it does not use ensembles of them, which could achieve better performance. REC analysis is a powerful technique for the visualization and comparison of regression models. The objective of this work is to advocate the use of REC curves to compare SPKFs and ensembles of them, and to select the best model.
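As a minimal sketch of the REC analysis mentioned above (following Bi and Bennett's definition, assuming NumPy; the function name and inputs are illustrative): an REC curve plots, for each error tolerance, the fraction of samples whose absolute prediction error falls within that tolerance.

```python
import numpy as np

def rec_curve(y_true, y_pred):
    """Regression Error Characteristic curve: for each error tolerance,
    the fraction of samples whose absolute error is within it.
    Returns (tolerances, accuracy), both of length n + 1."""
    errors = np.sort(np.abs(np.asarray(y_true, dtype=float)
                            - np.asarray(y_pred, dtype=float)))
    n = len(errors)
    # Step the curve at each observed error value, starting from tolerance 0.
    tolerances = np.concatenate(([0.0], errors))
    accuracy = np.arange(n + 1) / n
    return tolerances, accuracy
```

A model whose curve rises faster (smaller area over the curve) dominates; plotting the curves of each SPKF and of the ensemble on one axis gives the visual comparison the paper advocates.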





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

de Pina, A.C., Zaverucha, G. (2006). Applying REC Analysis to Ensembles of Sigma-Point Kalman Filters. In: Kollias, S., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4132. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840930_16


  • DOI: https://doi.org/10.1007/11840930_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-38871-5

  • Online ISBN: 978-3-540-38873-9

  • eBook Packages: Computer Science, Computer Science (R0)
