Automation and Remote Control, Volume 80, Issue 9, pp. 1653–1670

Randomized Machine Learning Procedures

Yu. S. Popkov

Topical Issue


A new concept of machine learning based on the computer simulation of entropy-optimal randomized models is proposed. Procedures of randomized machine learning (RML) with "hard" and "soft" randomization are considered: the former require exact reproduction of the empirical balances, while the latter require only their approximate reproduction under an accepted approximation criterion. RML algorithms are formulated as functional entropy-linear programming problems. Applications of RML procedures to text classification and to the randomized forecasting of migratory interaction of regional systems are presented.
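To make the "hard" randomization setting concrete, the following is a minimal finite-dimensional sketch (not the paper's functional formulation): find the entropy-optimal distribution p over a finite support x that exactly reproduces a single empirical balance, here the mean. By the standard maximum-entropy argument, the solution has the Gibbs form p_i ∝ exp(λ x_i), and the multiplier λ can be found by bisection, since the constrained mean is monotone in λ. All names and the toy data below are illustrative assumptions.

```python
import math

def gibbs(x, lam):
    """Gibbs-form distribution p_i proportional to exp(lam * x_i)."""
    w = [math.exp(lam * xi) for xi in x]
    z = sum(w)
    return [wi / z for wi in w]

def mean(x, p):
    """Empirical balance: the mean of x under p."""
    return sum(pi * xi for pi, xi in zip(p, x))

def entropy_optimal(x, m, lo=-50.0, hi=50.0, tol=1e-10):
    """Entropy-optimal p with the hard balance mean(x, p) = m.

    mean(x, gibbs(x, lam)) is strictly increasing in lam,
    so simple bisection on the multiplier converges.
    """
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(x, gibbs(x, mid)) < m:
            lo = mid
        else:
            hi = mid
    return gibbs(x, 0.5 * (lo + hi))

# Toy example: entropy-optimal distribution over {0, 1, 2, 3}
# whose mean exactly reproduces the empirical balance m = 1.0.
x = [0, 1, 2, 3]
p = entropy_optimal(x, m=1.0)
```

The "soft" variant would replace the equality constraint by an approximation criterion (e.g., a norm bound on the balance residual); the Gibbs structure of the solution is what makes entropy-linear programming tractable in this toy case.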


Keywords: randomization; hard and soft randomization procedures; uncertainty; entropy; matrix norms; empirical balances; text classification; dynamic regression





This work was supported by the Russian Foundation for Basic Research, project no. 17-29-02115.



Copyright information

© Pleiades Publishing, Ltd. 2019

Authors and Affiliations

  1. Federal Research Center for Information Science and Control, Russian Academy of Sciences, Moscow, Russia
  2. Trapeznikov Institute of Control Sciences, Russian Academy of Sciences, Moscow, Russia
  3. Braude College of Haifa University, Karmiel, Israel
  4. Yugra Research Institute of Information Technologies, Khanty-Mansiysk, Russia
  5. Moscow Institute of Physics and Technology, Dolgoprudny, Russia
