Convergence in Orlicz spaces by means of the multivariate max-product neural network operators of the Kantorovich type and applications
In this paper, convergence results in a multivariate setting are proved for a family of neural network operators of the max-product type. In particular, the coefficients, expressed by Kantorovich-type means, allow us to treat the theory in the general frame of Orlicz spaces, which include the \(L^p\)-spaces as a particular case. Examples of sigmoidal activation functions for the above operators are discussed in different cases of Orlicz spaces. Finally, concrete applications to real-world problems are presented in both the univariate and the multivariate settings; in particular, the reconstruction and enhancement of biomedical (vascular) images is discussed in detail.
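To make the construction concrete, the following sketch implements a univariate max-product neural network operator of the Kantorovich type, in the spirit of the operators studied here: the usual sums over the grid are replaced by maxima, and the sample values \(f(k/n)\) are replaced by the Kantorovich-type means \(n\int_{k/n}^{(k+1)/n} f(u)\,du\). The density function `phi` built from the logistic sigmoidal function, the grid choice, and the quadrature rule are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def logistic(x):
    # Logistic sigmoidal function sigma(x) = 1 / (1 + e^{-x}).
    return 1.0 / (1.0 + np.exp(-x))

def phi(x):
    # Density function generated by the sigmoidal function sigma:
    # phi(x) = (1/2) * (sigma(x + 1) - sigma(x - 1)).
    return 0.5 * (logistic(x + 1.0) - logistic(x - 1.0))

def max_product_kantorovich(f, x, n, a=0.0, b=1.0, quad_pts=50):
    # Max-product Kantorovich-type operator (univariate sketch):
    #   K_n(f)(x) = max_k [ phi(n*x - k) * n * int_{k/n}^{(k+1)/n} f(u) du ]
    #               / max_k phi(n*x - k),
    # with k ranging over the integer grid covering [a, b].
    ks = np.arange(int(np.floor(n * a)), int(np.ceil(n * b)))
    num, den = -np.inf, -np.inf
    for k in ks:
        # Kantorovich mean of f on [k/n, (k+1)/n], approximated by
        # averaging f on a fine grid of the subinterval.
        u = np.linspace(k / n, (k + 1) / n, quad_pts)
        mean_f = float(np.mean(f(u)))
        w = phi(n * x - k)
        num = max(num, w * mean_f)
        den = max(den, w)
    return num / den
```

For instance, `max_product_kantorovich(lambda u: u, 0.5, 100)` returns a value close to 0.5, illustrating pointwise convergence as \(n\) grows; the multivariate operators of the paper replace the single index \(k\) by a multi-index and `phi` by a product of densities.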
Keywords: Sigmoidal function · Multivariate max-product neural network operator · Orlicz space · Modular convergence · Neurocomputing process · Data modeling · Image processing
Mathematics Subject Classification: 41A25 · 41A05 · 41A30 · 47A58
The authors would like to thank the referees for their useful suggestions which led us to insert the section devoted to real-world applications.
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
Ethical approval was waived, since the CT images analyzed were anonymized and the results did not influence any clinical judgment.