Optical Memory and Neural Networks

Volume 27, Issue 3, pp. 152–160

A New Type of a Wavelet Neural Network

  • A. Efitorov
  • S. Dolenko


The wavelet transform uses a special basis widely known for its unique properties, the most important of which are compactness and multiresolution (wavelet functions are produced from the mother wavelet by translation and dilation). Wavelet neural networks (WNN) use wavelet functions to decompose the approximated function. However, for a standard wavelet basis with fixed translation and dilation coefficients, the decomposition may not be optimal. If no inverse transform is needed, the values of the translation and dilation coefficients may be determined during network training, and the windows corresponding to different wavelet functions may overlap. In this study, we suggest a new type of WNN, the Adaptive Window WNN (AWWNN), designed primarily for signal processing, in which window positions and wavelet levels are determined by a special iterative procedure. Two modifications of this new type of WNN are tested against a linear model and a multi-layer perceptron on the Mackey–Glass benchmark problem.
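To make the idea of trainable translation and dilation coefficients concrete, the following minimal sketch (not the authors' AWWNN procedure; the "Mexican hat" mother wavelet, the network size, and all hyperparameters are assumptions for illustration) trains a one-dimensional wavelet network by gradient descent for one-step-ahead prediction of a Mackey–Glass series:

```python
import numpy as np

def mackey_glass(n=1200, tau=17, beta=0.2, gamma=0.1, dt=1.0, x0=1.2):
    # Euler integration of the Mackey-Glass delay equation
    # dx/dt = beta*x(t-tau)/(1 + x(t-tau)^10) - gamma*x(t)
    x = np.zeros(n + tau)
    x[:tau] = x0                          # constant initial history
    for t in range(tau, n + tau - 1):
        x[t + 1] = x[t] + dt * (beta * x[t - tau] / (1 + x[t - tau] ** 10)
                                - gamma * x[t])
    return x[tau:]

def psi(u):                               # "Mexican hat" mother wavelet
    return (1 - u ** 2) * np.exp(-u ** 2 / 2)

def dpsi(u):                              # its derivative with respect to u
    return u * (u ** 2 - 3) * np.exp(-u ** 2 / 2)

rng = np.random.default_rng(0)
series = mackey_glass()
x, y = series[:-1], series[1:]            # one-step-ahead prediction pairs

K = 8                                     # number of wavelons (assumed)
w = rng.normal(0, 0.1, K)                 # output weights
b = rng.uniform(x.min(), x.max(), K)      # translations (window positions)
a = np.full(K, 0.3)                       # dilations (window widths)

mse0 = np.mean((psi((x[:, None] - b) / a) @ w - y) ** 2)  # error before training

lr = 0.05
for epoch in range(300):
    u = (x[:, None] - b) / a              # shape (N, K)
    h = psi(u)
    e = h @ w - y                         # prediction error
    # gradients of 0.5 * mean(e^2) w.r.t. weights, translations, dilations
    gw = (e[:, None] * h).mean(axis=0)
    gb = (e[:, None] * w * dpsi(u) * (-1 / a)).mean(axis=0)
    ga = (e[:, None] * w * dpsi(u) * (-u / a)).mean(axis=0)
    w -= lr * gw
    b -= lr * gb
    a = np.maximum(a - lr * ga, 0.05)     # keep dilations positive

mse = np.mean((psi((x[:, None] - b) / a) @ w - y) ** 2)
```

Here the window positions `b` and widths `a` are free parameters updated together with the output weights `w`, which is exactly the freedom the abstract contrasts with a fixed wavelet basis; the AWWNN replaces this naive joint descent with an iterative procedure for choosing window positions and wavelet levels.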


Keywords: approximation, wavelet neural networks, wavelet analysis, group method of data handling, spectroscopy



This study has been carried out with the financial support of the Ministry of Education and Science of the Russian Federation, Agreement no. 14.604.21.0163, project identifier RFMEFI60417X0163.



Copyright information

© Allerton Press, Inc. 2018

Authors and Affiliations

  1. Skobeltsyn Institute of Nuclear Physics, Moscow State University, Moscow, Russia
