Introduction to the Theory of Randomized Machine Learning

  • Yuri S. Popkov
  • Yuri A. Dubnov
  • Alexey Y. Popkov
Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 756)

Abstract

We propose a new machine learning concept called Randomized Machine Learning, in which model parameters are assumed to be random and data are assumed to contain random errors. This approach differs from "classical" machine learning in that optimal estimation deals with the probability density functions of the random parameters and with the "worst" probability density of the random data errors. As the optimality criterion of estimation, Randomized Machine Learning employs the generalized information entropy, maximized on a set described by a system of empirical balances. We apply the approach to text classification and dynamic regression problems; the results illustrate its capabilities.
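In schematic form (the notation below is illustrative, not taken verbatim from the chapter), let \(\hat{y}_k(w)\) denote the output of a model with random parameters \(w \in W\) for the \(k\)-th observation, and let \(\xi_k \in \Xi\) denote the random measurement error. The Randomized Machine Learning estimate may then be read as the pair of densities \((P^*, Q^*)\) solving an entropy-maximization problem of the form

$$
\max_{P,\,Q}\; \mathcal{H}[P,Q] \;=\; -\int_{W} P(w)\ln P(w)\,dw \;-\; \sum_{k=1}^{m}\int_{\Xi} Q_k(\xi)\ln Q_k(\xi)\,d\xi
$$

subject to the normalization conditions

$$
\int_{W} P(w)\,dw = 1, \qquad \int_{\Xi} Q_k(\xi)\,d\xi = 1, \qquad k = 1,\dots,m,
$$

and the system of empirical balances

$$
\int_{W} \hat{y}_k(w)\,P(w)\,dw \;+\; \int_{\Xi} \xi\,Q_k(\xi)\,d\xi \;=\; y_k, \qquad k = 1,\dots,m,
$$

which requires the model's mean output, corrupted by the mean error, to reproduce the observed data \(y_k\). Maximizing entropy over \(Q\) corresponds to estimating the "worst" (least informative) error density consistent with the balances.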


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Yuri S. Popkov (1, 2, 3)
  • Yuri A. Dubnov (1, 2, 3)
  • Alexey Y. Popkov (1, 2)

  1. Institute for Systems Analysis of Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences, Moscow, Russia
  2. Moscow Institute of Physics and Technology, Moscow, Russia
  3. National Research University "Higher School of Economics", Moscow, Russia