
Generating Random Variates via Kernel Density Estimation and Radial Basis Function Based Neural Networks

  • Cristian Candia-García
  • Manuel G. Forero
  • Sergio Herrera-Rivera
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11401)

Abstract

When modeling phenomena that cannot be studied by deterministic analytical approaches, one of the main tasks is to generate random variates. Widely used techniques such as the inverse transformation, convolution, and acceptance-rejection methods involve a significant amount of statistical work and do not provide satisfactory results when the data do not conform to known probability density functions. This study proposes an alternative nonparametric method for generating random variates that combines kernel density estimation (KDE) and radial basis function based neural networks (RBFBNNs). We evaluated the method's performance on Poisson, triangular, and exponential probability distributions and assessed its utility for unknown distributions. The results show that the model's effectiveness depends substantially on selecting an appropriate bandwidth value for the KDE and on a certain minimum number of data points to train the algorithm. The proposed method achieved an \( R^{2} \) value between 0.91 and 0.99 for the analyzed distributions.
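The abstract does not spell out how the KDE and the RBF network are coupled, so the following is only a minimal sketch of one plausible reading of the pipeline: a KDE (scikit-learn's KernelDensity, with a Silverman rule-of-thumb bandwidth) estimates the density of the observed sample, its numerically integrated CDF is inverted by a GRNN-style RBF regressor, and uniform draws are mapped through that regressor to produce new variates. The bandwidth choice, the sigma of the GRNN, and the inverse-CDF formulation are all assumptions for illustration, not the authors' published implementation.

```python
# Sketch of a KDE + GRNN (RBF-network) random-variate generator.
# Assumptions: Gaussian KDE with Silverman bandwidth, numerical CDF on a grid,
# and a Nadaraya-Watson / GRNN regressor approximating the inverse CDF.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(42)

# 1. Observed data whose distribution is treated as unknown
#    (an exponential sample is used here only as a test case).
data = rng.exponential(scale=2.0, size=1000)

# 2. Kernel density estimate; the bandwidth is the critical hyperparameter
#    highlighted in the abstract (Silverman's rule of thumb used here).
h = 1.06 * data.std() * len(data) ** (-1 / 5)
kde = KernelDensity(kernel="gaussian", bandwidth=h).fit(data[:, None])

# 3. Numerical CDF of the KDE on a grid; the pairs (cdf_i, x_i) serve as
#    training data for the inverse-CDF regression.
grid = np.linspace(data.min(), data.max(), 512)
pdf = np.exp(kde.score_samples(grid[:, None]))
cdf = np.cumsum(pdf)
cdf /= cdf[-1]

def grnn_predict(x_train, y_train, x_query, sigma=0.01):
    """GRNN (general regression neural network) prediction:
    a Gaussian-kernel-weighted average of the training targets."""
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# 4. Generate new variates: uniform draws mapped through the learned inverse CDF.
u = rng.uniform(size=5000)
samples = grnn_predict(cdf, grid, u)

print("original mean/std :", data.mean(), data.std())
print("generated mean/std:", samples.mean(), samples.std())
```

Under this reading, goodness of fit can then be judged by comparing the empirical distribution of `samples` against the original data, e.g. via the \( R^{2} \) of a quantile-quantile regression, which is consistent with the evaluation metric reported in the abstract.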

Keywords

General regression neural network · Probabilistic neural network · Kernel density estimation · Random variable · Probability distribution


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Faculty of Engineering, Escuela Colombiana de Ingeniería Julio Garavito, Bogotá, Colombia
  2. Faculty of Engineering, Universidad de Ibagué, Ibagué, Colombia
