Industrial Prediction Intervals with Data Uncertainty

Data-Driven Prediction for Industrial Processes and Their Applications

Part of the book series: Information Fusion and Data Science ((IFDS))

Abstract

Prediction interval (PI) construction is a comprehensive prediction technique that provides not only point estimates of industrial variables but also the reliability of the prediction results, indicated by an interval. After reviewing conventional PI construction methods (e.g., the delta method, the mean-and-variance-based estimation method, the Bayesian method, and the bootstrap technique), this chapter presents several recently developed approaches. First, a bootstrapping-based ESN ensemble (BESNE) model is proposed to produce reliable PIs for industrial time series, for which a simultaneous training method based on Bayesian linear regression is developed. Second, to cope with the error accumulation caused by the traditional iterative mode of time series prediction, a non-iterative granular ESN is reported for PI construction, in which the network connections are represented by interval-valued information granules. Third, a mixed Gaussian kernel-based regression model is presented for constructing PIs, and a gradient descent algorithm is derived to optimize the hyper-parameters of the mixed Gaussian kernel. Fourth, to tackle the problem of incomplete testing inputs, a kernel-based high-order dynamic Bayesian network (DBN) model for industrial time series is proposed, which deals directly with the missing points in the inputs. Finally, case studies are provided to verify the effectiveness of these approaches.
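As a concrete illustration of the bootstrap technique that underlies the BESNE approach, the sketch below constructs pairs-bootstrap prediction intervals for a generic regressor. This is a minimal sketch, not the chapter's method: the ridge model is a hypothetical stand-in for an ESN readout, and the names `bootstrap_pis`, `n_models`, and `alpha` are illustrative. Model uncertainty is estimated from the ensemble spread, the noise variance from the residuals of the ensemble mean, and a Gaussian quantile gives the interval half-width.

```python
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import Ridge  # hypothetical stand-in for an ESN readout


def bootstrap_pis(X, y, X_test, n_models=100, alpha=0.05, seed=0):
    """Pairs-bootstrap ensemble PIs: ensemble spread estimates model
    (epistemic) uncertainty; residuals of the ensemble mean estimate
    the data-noise variance."""
    rng = np.random.default_rng(seed)
    n = len(X)
    test_preds = np.empty((n_models, len(X_test)))
    train_preds = np.empty((n_models, n))
    for b in range(n_models):
        idx = rng.integers(0, n, size=n)          # resample (x, y) pairs with replacement
        model = Ridge(alpha=1.0).fit(X[idx], y[idx])
        test_preds[b] = model.predict(X_test)
        train_preds[b] = model.predict(X)
    y_hat = test_preds.mean(axis=0)               # ensemble point estimate
    var_model = test_preds.var(axis=0, ddof=1)    # model variance at test points
    resid = y - train_preds.mean(axis=0)          # residuals of the ensemble mean
    # Crude noise estimate: residual variance minus average model variance
    var_noise = max(resid.var(ddof=1) - train_preds.var(axis=0, ddof=1).mean(), 0.0)
    z = norm.ppf(1.0 - alpha / 2.0)               # e.g. about 1.96 for 95% PIs
    half_width = z * np.sqrt(var_model + var_noise)
    return y_hat - half_width, y_hat, y_hat + half_width


# Toy usage on a noisy sine series
X = np.linspace(0.0, 6.0, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.default_rng(1).normal(size=200)
lower, point, upper = bootstrap_pis(X, y, X[:10])
```

The chapter's BESNE differs in that its ensemble members are echo state networks whose output weights are trained simultaneously via Bayesian linear regression; the decomposition of the total predictive variance into a model component and a noise component is, however, the same idea.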




Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter

Cite this chapter

Zhao, J., Wang, W., Sheng, C. (2018). Industrial Prediction Intervals with Data Uncertainty. In: Data-Driven Prediction for Industrial Processes and Their Applications. Information Fusion and Data Science. Springer, Cham. https://doi.org/10.1007/978-3-319-94051-9_5

  • DOI: https://doi.org/10.1007/978-3-319-94051-9_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-94050-2

  • Online ISBN: 978-3-319-94051-9

  • eBook Packages: Computer Science, Computer Science (R0)
