Robust stability analysis of quaternion-valued neural networks via LMI approach
Abstract
This paper is concerned with the issue of robust stability for quaternion-valued neural networks (QVNNs) with leakage, discrete and distributed delays by employing a linear matrix inequality (LMI) approach. Based on the homeomorphic mapping theorem, the quaternion matrix theorem and the Lyapunov theorem, some criteria are developed in the form of real-valued LMIs for guaranteeing the existence, uniqueness, and global robust stability of the equilibrium point of the delayed QVNNs. Two numerical examples are provided to demonstrate the effectiveness of the obtained results.
Keywords
Quaternion-valued neural networks; Delay effects; Global robust stability; Linear matrix inequality; Modulus inequality technique
1 Introduction
The quaternions are members of a noncommutative division algebra invented independently by Carl Friedrich Gauss in 1819 and William Rowan Hamilton in 1843 [1]. Quaternions provide a concise mathematical method for representing the automorphisms of three- and four-dimensional spaces, and quaternion representations are more compact and quicker to compute than matrix representations [2]. For this reason, an increasing number of applications based on quaternions have appeared in various fields, such as computer graphics, quantum mechanics, attitude control, signal processing, and orbital mechanics [3, 4, 5]. For example, in attitude-control systems, describing rotations by Euler angles leads to the well-known problem of “gimbal lock”; quaternions offer an alternative representation that does not suffer from this problem [6].
On the other hand, over the past three decades, neural networks (NNs) have been applied throughout science and engineering, in areas such as signal processing, image processing, pattern recognition, associative memory and optimization [7, 8, 9, 10, 11, 12, 13, 14, 15, 16]. Real-valued NNs (RVNNs) and complex-valued NNs (CVNNs) have been extensively investigated and a great number of results have been reported [17, 18, 19, 20, 21, 22]. Recently, quaternion-valued neural networks (QVNNs) have drawn a great deal of attention [23, 24]. Owing to the compact representation of quaternions and their efficiency in handling multidimensional data, QVNNs have demonstrated better performance than CVNNs and RVNNs in a wide range of applications [25, 26, 27, 28, 29, 30, 31]. For example, in image compression [25, 26], a color is synthesized from the three primary colors in certain proportions, so storing it requires three real- or complex-valued neurons, whereas a single quaternion-valued neuron can represent the color via the three channels ı, ȷ and κ of the QVNN; this leads to a significant reduction in the dimension of the system and a substantial gain in computational efficiency. In many practical applications, the designed system is required to have a unique equilibrium point that is globally stable. Therefore, the dynamics of QVNNs have become an active research topic [32, 33, 34, 35, 36]. In [32, 33], some μ-stability criteria in the form of linear matrix inequalities (LMIs) were provided for QVNNs with time-varying delays. In [34], several sufficient conditions were presented to check the global exponential stability of QVNNs with time-varying delays. In [35], several sufficient criteria were derived to ensure the existence, uniqueness, and global robust stability of the equilibrium point for delayed QVNNs with parameter uncertainties. In [36], some algebraic conditions on the global dissipativity of QVNNs with time-varying delays were devised.
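As a minimal illustration of this dimension reduction (using a plain componentwise storage scheme, not necessarily the exact encoding of [25, 26]), one color pixel can be packed into a single quaternion:

% One RGB pixel packed into a single quaternion q = 0 + R*i + G*j + B*k.
rgb = [0.8; 0.2; 0.5];        % red, green and blue intensities
q   = [0; rgb];               % quaternion stored as [q0; q1; q2; q3]
% A real-valued network needs three neurons to carry this pixel,
% whereas a single quaternion-valued neuron (state q) suffices.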
Moreover, when neural networks are implemented, time delays are unavoidably encountered, owing to transmission-line delays, partial element equivalent circuits, integration and communication. Time delays in neural networks frequently lead to undesirable complex dynamical behaviors [37, 38, 39]. As pointed out by Gopalsamy [40], time delays in the negative feedback terms, known as leakage or forgetting delays, have a tendency to destabilize a system. Moreover, many biological and artificial neural networks contain inherent discrete time delays in signal transmission, which may cause oscillation and instability. Furthermore, since neural networks usually have a spatial nature due to the presence of many parallel pathways with a variety of axon sizes and lengths, it is desirable to model them with distributed delays. In [40, 41], the stability problem was investigated for RVNNs with leakage delays. In [42, 43], some dynamical behaviors of RVNNs with distributed delays were studied. In [44], the multistability issue of competitive RVNNs with discrete and distributed delays was investigated. In [45, 46], the authors considered the effects of leakage and discrete delays in CVNNs.
Strongly motivated by the above discussions, in the present paper we consider the robust stability problem of QVNNs with leakage, discrete and distributed delays. The current research poses two main challenges. The first is how to construct a proper Lyapunov–Krasovskii functional for the considered QVNNs. The second is how to ensure that the obtained criteria depend on the upper and lower bounds of the system parameters. For the first, we construct the Lyapunov–Krasovskii functional from quaternion self-conjugate and positive definite matrices, so that the QVNNs can be treated directly, without any decomposition. For the second, we use a modulus inequality technique to estimate the derivative of the Lyapunov–Krasovskii functional, so that the obtained criteria are not only real-valued but also related to the bounds of the parameters.
Notations
Throughout this paper, \(\mathbb{R}\), \(\mathbb{C}\) and \(\mathbb{H}\) denote the real field, the complex field and the skew field of quaternions, respectively. \(\mathbb{R}^{n}\), \(\mathbb{C}^{n}\) and \(\mathbb{H}^{n}\) denote n-dimensional vectors with entries from \(\mathbb{R}\), \(\mathbb{C}\) and \(\mathbb{H}\), respectively. \(\mathbb{R}^{n\times m}\), \(\mathbb{C}^{n\times m}\) and \(\mathbb{H}^{n\times m}\) denote \(n\times m\) matrices with entries from \(\mathbb{R}\), \(\mathbb{C}\) and \(\mathbb{H}\), respectively. In particular, \(\mathbb{R}^{n\times n}_{d}\) denotes the set of \(n\times n\) real diagonal matrices. The notations Ā, \(A^{T}\) and \(A^{*}\) stand for the conjugate, the transpose and the conjugate transpose, respectively, of the matrix A. For \(A=(a_{ij})_{n\times n}\in\mathbb{C}^{n\times n}\), let \(\|A\|=\sqrt{\sum_{i=1}^{n}\sum_{j=1}^{n} |a_{ij}|^{2}}\) denote the norm of A. The notation \(X\geq Y\) (respectively, \(X>Y\)) means that \(X-Y\) is positive semi-definite (respectively, positive definite). For a positive definite Hermitian matrix \(P\in\mathbb{C}^{n\times n}\), \(\lambda_{\max}(P)\) and \(\lambda_{\min}(P)\) denote the largest and the smallest eigenvalues of P, respectively. In the four-dimensional algebra \(\mathbb{H}\), the four basis elements are denoted by 1, ı, ȷ, κ, which obey the following multiplication rules: \(\imath ^{2}=\jmath ^{2}=\kappa ^{2}=-1\), \(\imath \jmath =-\jmath \imath =\kappa \), \(\jmath \kappa =-\kappa \jmath =\imath \), \(\kappa \imath =-\imath \kappa =\jmath \), and \(1\cdot a=a\cdot1=a\) for every quaternion a. For a quaternion \(a=a_{0}+a_{1}\imath +a_{2}\jmath +a_{3}\kappa \in\mathbb{H}\), we call \(a_{0}\), \(a_{1}\), \(a_{2}\) and \(a_{3}\) the first, second, third and fourth parts of the quaternion, respectively. Let \(a^{*}=a_{0}-a_{1}\imath -a_{2}\jmath -a_{3}\kappa \) be the conjugate of a, and \(|a|=\sqrt{a_{0}^{2}+a_{1}^{2}+a_{2}^{2}+a_{3}^{2}}\) be the modulus of a. For \(q=(q_{1},q_{2},\ldots,q_{n})^{T}\in\mathbb{H}^{n}\), let \(|q|=(|q_{1}|,|q_{2}|,\ldots,|q_{n}|)^{T}\) be the modulus of q, and \(\|q\|=\sqrt{\sum_{i=1}^{n}|q_{i}|^{2}}\) be the norm of q. For \(a,b\in\mathbb{H}\), \(a\preceq b\) denotes \(a_{i}\leq b_{i}\), \(i=0,1,2,3\), where \(a=a_{0}+a_{1}\imath +a_{2}\jmath +a_{3}\kappa \) and \(b=b_{0}+b_{1}\imath +b_{2}\jmath +b_{3}\kappa \). For \(A, B\in\mathbb{H}^{n\times n}\), \(A\preceq B\) denotes \(a_{ij}\preceq b_{ij}\), \(i,j=1,2,\ldots,n\), where \(A=(a_{ij})_{n\times n}\) and \(B=(b_{ij})_{n\times n}\). In addition, the symbol ⋆ always denotes the conjugate transpose of a suitable block in a Hermitian matrix.
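For readers who wish to experiment numerically, the multiplication rules above translate into the following componentwise product; this is a generic helper written for this article (quaternions stored as 4-vectors \([a_{0},a_{1},a_{2},a_{3}]^{T}\)), not code taken from the paper:

% Quaternion product, conjugate and modulus written out from the
% multiplication rules (i^2 = j^2 = k^2 = -1, ij = k, jk = i, ki = j).
qmul  = @(a,b) [a(1)*b(1) - a(2)*b(2) - a(3)*b(3) - a(4)*b(4); ...
                a(1)*b(2) + a(2)*b(1) + a(3)*b(4) - a(4)*b(3); ...
                a(1)*b(3) - a(2)*b(4) + a(3)*b(1) + a(4)*b(2); ...
                a(1)*b(4) + a(2)*b(3) - a(3)*b(2) + a(4)*b(1)];
qconj = @(a) [a(1); -a(2:4)];  % a* = a0 - a1*i - a2*j - a3*k
qmod  = @(a) norm(a);          % |a| = sqrt(a0^2 + a1^2 + a2^2 + a3^2)
i_ = [0;1;0;0]; j_ = [0;0;1;0];
qmul(i_, j_)                   % returns k = [0;0;0;1]
qmul(j_, i_)                   % returns -k, illustrating noncommutativity
qmod(qconj(i_))                % |a*| = |a| = 1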
2 Problem formulation and preliminaries
- (A1)
- The parameters D, A, B, C, J in QVNNs (1) are assumed to belong to the following sets, respectively:$$\begin{aligned}& D_{I}= \bigl\{ D\in\mathbb{R}^{n\times n}_{d}: 0< \check{D}\preceq D\preceq\hat {D} \bigr\} , \\& A_{I}= \bigl\{ A\in\mathbb{H}^{n\times n}: \check{A}\preceq A\preceq \hat {A} \bigr\} , \\& B_{I}= \bigl\{ B\in\mathbb{H}^{n\times n}: \check{B}\preceq B\preceq \hat {B} \bigr\} , \\& C_{I}= \bigl\{ C\in\mathbb{H}^{n\times n}: \check{C}\preceq C\preceq \hat {C} \bigr\} , \\& J_{I}= \bigl\{ J\in\mathbb{H}^{n}: \check{J}\preceq J\preceq \hat{J} \bigr\} , \end{aligned}$$where \(\check{D}, \hat{D}\in\mathbb{R}^{n\times n}_{d}\), \(\check{A}, \hat{A}, \check{B}, \hat{B}, \check{C}, \hat{C}\in\mathbb {H}^{n\times n}\), and \(\check{J}, \hat{J}\in\mathbb{H}^{n}\). Moreover, let \(\check{A}=(\check{a}_{ij})_{n\times n}\), \(\hat{A}=(\hat{a}_{ij})_{n\times n}\), \(\check{B}=(\check{b}_{ij})_{n\times n}\), \(\hat{B}=(\hat{b}_{ij})_{n\times n}\), \(\check{C}=(\check{c}_{ij})_{n\times n}\), and \(\hat{C}=(\hat{c}_{ij})_{n\times n}\). Then we define \(\tilde{A}=(\tilde{a}_{ij})_{n\times n}\), \(\tilde{B}=(\tilde{b}_{ij})_{n\times n}\) and \(\tilde{C}=(\tilde{c}_{ij})_{n\times n}\), where \(\tilde{a}_{ij}=\max\{|\check{a}_{ij}|,|\hat{a}_{ij}|\}\), \(\tilde{b}_{ij}=\max\{|\check{b}_{ij}|,|\hat{b}_{ij}|\}\) and \(\tilde{c}_{ij}=\max\{|\check{c}_{ij}|,|\hat{c}_{ij}|\}\). (A small numerical sketch of this construction is given after the assumptions.)
- (A2)
- For \(i=1,2,\ldots,n\), the neuron activation function \(f_{i}\) is continuous and satisfies$$ \bigl\vert f_{i}(s_{1})-f_{i}(s_{2}) \bigr\vert \leq\gamma_{i}|s_{1}-s_{2}|,\quad \forall s_{1},s_{2}\in \mathbb{H}, $$where \(\gamma_{i}\) is a real constant. Moreover, define \(\Gamma=\operatorname{diag}(\gamma_{1},\gamma_{2},\ldots ,\gamma_{n})\).
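To make these assumptions concrete, two short illustrations follow; both are written for this article and use placeholder data, so they are sketches rather than material from the paper's own examples.

First, the majorant matrices \(\tilde{A}\), \(\tilde{B}\) and \(\tilde{C}\) of (A1) are formed entrywise from the interval bounds; the snippet below stores a quaternion matrix componentwise as an n-by-n-by-4 real array, which is a convention chosen here only for illustration:

% Entrywise quaternion modulus and the majorant matrix Atilde of (A1):
% atilde_ij = max(|acheck_ij|, |ahat_ij|).
n = 2;
Acheck = randn(n, n, 4);                   % placeholder lower interval bound
Ahat   = Acheck + rand(n, n, 4);           % placeholder upper bound (Acheck <= Ahat componentwise)
qmodM  = @(A) sqrt(sum(A.^2, 3));          % entrywise modulus |a_ij|
Atilde = max(qmodM(Acheck), qmodM(Ahat));  % real n-by-n majorant matrix

Second, a standard activation satisfying the Lipschitz condition of (A2) (though not necessarily the one used later in the numerical examples) is the split-type function
$$ f_{i}(q)=\tanh(q_{0})+\tanh(q_{1})\imath +\tanh(q_{2})\jmath +\tanh(q_{3})\kappa ,\quad q=q_{0}+q_{1}\imath +q_{2}\jmath +q_{3}\kappa , $$
for which
$$ \bigl\vert f_{i}(s_{1})-f_{i}(s_{2}) \bigr\vert ^{2}=\sum_{k=0}^{3} \bigl(\tanh(s_{1,k})-\tanh(s_{2,k}) \bigr)^{2}\leq\sum_{k=0}^{3}(s_{1,k}-s_{2,k})^{2}= \vert s_{1}-s_{2} \vert ^{2}, $$
since tanh is 1-Lipschitz; hence (A2) holds with \(\gamma_{i}=1\).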
Definition 1
The QVNNs defined by (1), with the parameter ranges defined by (A1), are said to be globally asymptotically robustly stable if the unique equilibrium point q̌ of QVNNs (1) is globally asymptotically stable for all \(D\in D_{I}\), \(A\in A_{I}\), \(B\in B_{I}\), \(C\in C_{I}\) and \(J\in J_{I}\).
Lemma 1
([35])
Lemma 2
([35])
- (i)
\(H(z)\) is injective on \(\mathbb{H}^{n}\),
- (ii)
\(\lim_{\|z\|\to\infty}\|H(z)\|=\infty\),
Lemma 3
([35])
In the following, we provide some modulus inequalities of quaternions, which play a major role in analyzing the problem in this paper.
Lemma 4
Proof
It should be noted that \(|a+b|\leq|a|+|b|\) for any \(a,b\in\mathbb{H}\). By the Cauchy–Schwarz inequality, the modulus inequalities (2) and (3) can be obtained. We omit the details because the proof is direct. □
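For completeness, the elementary facts that such modulus estimates combine are the triangle inequality \(|a+b|\leq|a|+|b|\), the multiplicativity \(|ab|=|a||b|\) of the quaternion modulus, and the real Cauchy–Schwarz inequality; together they yield, for \(a_{i},b_{i}\in\mathbb{H}\),
$$ \Biggl\vert \sum_{i=1}^{n}a_{i}b_{i} \Biggr\vert \leq\sum_{i=1}^{n} \vert a_{i} \vert \vert b_{i} \vert \leq \sqrt{\sum_{i=1}^{n} \vert a_{i} \vert ^{2}} \sqrt{\sum_{i=1}^{n} \vert b_{i} \vert ^{2}}. $$
Bounds of this type are what allow quaternion-valued terms to be dominated by real-valued expressions in the stability analysis.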
Remark 1
Lemma 5
(Schur complement [47])
- (i)
\(S_{22}<0\) and \(S_{11}-S_{12}S_{22}^{-1}S_{21}<0\),
- (ii)
\(S_{11}<0\) and \(S_{22}-S_{21}S_{11}^{-1}S_{12}<0\).
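For reference, the standard form of the Schur complement lemma (cf. [47]) states that, for a Hermitian matrix
$$ S= \begin{pmatrix} S_{11} & S_{12} \\ S_{12}^{*} & S_{22} \end{pmatrix} , $$
the condition \(S<0\) holds if and only if condition (i) holds, which is in turn equivalent to condition (ii).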
3 Main results
In this section, we first analyze the existence and uniqueness of the equilibrium point of the delayed QVNNs under Assumptions (A1) and (A2). Then we investigate the global robust stability of the equilibrium point of the delayed QVNNs.
Theorem 1
Proof
In what follows, we further consider the global robust stability of the equilibrium point based on Theorem 1.
Theorem 2
Proof
The proof is divided into two steps. First, we prove that QVNNs (1) have a unique equilibrium point under LMI (9), based on Theorem 1. Second, we prove that the equilibrium point is globally robustly stable by means of the Lyapunov theorem.
Remark 2
It should be noted that RVNNs and CVNNs are special cases of QVNNs, so the results of this paper can also be applied to RVNNs and CVNNs of the form (1).
4 Numerical examples
In this section, two numerical examples will illustrate the effectiveness of the proposed results.
Example 1
Remark 3
Although the NNs (1) are quaternion-valued, the stability criteria are expressed in the form of the LMIs (4) and (9), which are real-valued. In Example 1, these LMIs can be checked directly with the mathematical software MATLAB, as sketched below.
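A minimal sketch of this workflow, assuming YALMIP and the SDPT3 solver are installed in MATLAB, is shown on a simple Lyapunov-type LMI; the LMIs (4) and (9) involve more blocks and the interval data of (A1), but they are declared and checked in exactly the same way. The matrix A below is illustrative placeholder data, not taken from Example 1.

% Feasibility check of an LMI with YALMIP/SDPT3 (stand-in for (4), (9)).
A = [-2 1; 0 -3];                                 % placeholder system matrix
n = size(A, 1);
P = sdpvar(n, n);                                 % symmetric decision matrix
Constraints = [P >= 1e-6*eye(n), A'*P + P*A <= -1e-6*eye(n)];
diagnostics = optimize(Constraints, [], sdpsettings('solver', 'sdpt3'));
if diagnostics.problem == 0
    disp('LMI feasible');  Psol = value(P)        % extract a feasible P
else
    disp(diagnostics.info)                        % e.g. infeasible or solver issue
end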
Remark 4
In [35], the authors considered the robust stability of QVNNs with both leakage and discrete delays but without distributed delay. The criteria obtained in [35] cannot be applied to check the robust stability of the system in Example 1, since the system has distributed delay.
Remark 5
The first part of the state trajectories for system (1) with \(\delta=4.6\)
The first part of the state trajectories for system (1) with \(\delta=5.5\)
The first part of the state trajectories for system (1) with \(\delta=8\)
Example 2
Remark 6
In Example 2, the system parameters are complex-valued, so the system with these parameters can be viewed as CVNNs. Moreover, since \(\delta=0\) and \(\check{C}=\hat{C}=0\), the CVNNs have neither leakage delay nor distributed delay. We then attempt to apply the criteria in [22] to check the robust stability of the CVNNs. Using YALMIP with the SDPT3 solver in MATLAB, no feasible solution of the LMIs in [22] can be found. Therefore, the results obtained in [22] cannot be applied to check the robust stability of the CVNNs.
5 Conclusion
In this paper, the robust stability problem of parametrically uncertain QVNNs with leakage, discrete and distributed delays has been investigated. Based on the homeomorphic mapping theorem and the Lyapunov theorem, some criteria have been obtained to check the existence, uniqueness, and global robust stability of the equilibrium point of the delayed QVNNs. Owing to the LMI approach and the modulus inequality technique, the obtained criteria are presented in the form of real-valued LMIs, which can be solved directly with the mathematical software MATLAB. In addition, two numerical examples have been provided to substantiate the effectiveness of the proposed LMI conditions. It should be noted that the activation functions considered in this paper are continuous. Since neural networks with discontinuous activations form an important class of dynamical systems, our future work will address the stability problem of QVNNs with discontinuous activations.
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China under Grant 61773004, in part by the Natural Science Foundation of Chongqing Municipal Education Commission under Grant KJ1705138 and Grant KJ1705118, and in part by the Natural Science Foundation of Chongqing under Grant cstc2017jcyjAX0082.
Authors’ contributions
XC conceived, designed and performed the experiments. XC, LL and ZL wrote the paper. All authors read and approved the final manuscript.
Competing interests
The authors declare that they have no competing interests.
References
- 1. Simmons, G.F.: Calculus Gems: Brief Lives and Memorable Mathematics. McGraw-Hill, New York (1992)
- 2. Conway, J.H., Smith, D.A.: On Quaternions and Octonions: Their Geometry, Arithmetic, and Symmetry. AK Peters, Natick (2003)
- 3. Matsui, N., Isokawa, T., Kusamichi, H., Peper, F., Nishimura, H.: Quaternion neural network with geometrical operators. J. Intell. Fuzzy Syst. 15(3–4), 149–164 (2004)
- 4. Adler, S.L.: Quaternionic Quantum Mechanics and Quantum Fields. Oxford University Press, New York (1995)
- 5. Ujang, B.C., Took, C.C., Mandic, D.P.: Quaternion-valued nonlinear adaptive filtering. IEEE Trans. Neural Netw. 22(8), 1193–1206 (2011)
- 6. Mazinan, A.H., Pasand, M., Soltani, B.: Full quaternion based finite-time cascade attitude control approach via pulse modulation synthesis for a spacecraft. ISA Trans. 58, 567–585 (2015)
- 7. Zeng, Z., Wang, J.: Design and analysis of high-capacity associative memories based on a class of discrete-time recurrent neural networks. IEEE Trans. Syst. Man Cybern., Part B, Cybern. 38(6), 1525–1536 (2008)
- 8. Lu, J., Ho, D.W.C., Wu, L.: Exponential stabilization of switched stochastic dynamical networks. Nonlinearity 22(4), 889–911 (2009)
- 9. Tanaka, G., Aihara, K.: Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction. IEEE Trans. Neural Netw. 20, 1463–1473 (2009)
- 10. Lu, J., Ho, D.W.C.: Stabilization of complex dynamical networks with noise disturbance under performance constraint. Nonlinear Anal., Real World Appl. 12(4), 1974–1984 (2011)
- 11. Zhang, W., Tang, Y., Miao, Q., Du, W.: Exponential synchronization of coupled switched neural networks with mode-dependent impulsive effects. IEEE Trans. Neural Netw. Learn. Syst. 24(8), 1316–1326 (2013)
- 12. Zhang, W., Tang, Y., Wu, X., Fang, J.A.: Synchronization of nonlinear dynamical networks with heterogeneous impulses. IEEE Trans. Circuits Syst. I, Regul. Pap. 61(4), 1220–1228 (2014)
- 13. Yang, R., Wu, B., Liu, Y.: A Halanay-type inequality approach to the stability analysis of discrete-time neural networks with delays. Appl. Math. Comput. 265, 696–707 (2015)
- 14. Wang, J.L., Wu, H.N., Huang, T., Ren, S.Y., Wu, J.: Pinning control for synchronization of coupled reaction–diffusion neural networks with directed topologies. IEEE Trans. Syst. Man Cybern. Syst. 46(8), 1109–1120 (2016)
- 15. Sun, C., He, W., Ge, W., Chang, C.: Adaptive neural network control of biped robots. IEEE Trans. Syst. Man Cybern. Syst. 47(2), 315–326 (2017)
- 16. Zhang, W., Tang, Y., Huang, T., Kurths, J.: Sampled-data consensus of linear multi-agent systems with packet losses. IEEE Trans. Neural Netw. Learn. Syst. 28(11), 2516–2527 (2017)
- 17. Rakkiyappan, R., Udhayakumar, K., Velmurugan, G., Cao, J., Alsaedi, A.: Stability and Hopf bifurcation analysis of fractional-order complex-valued neural networks with time delays. Adv. Differ. Equ. 2017(1), 225 (2017)
- 18. Zhang, X., Li, C., Huang, T.: Impacts of state-dependent impulses on the stability of switching Cohen–Grossberg neural networks. Adv. Differ. Equ. 2017(1), 316 (2017)
- 19. Chen, X., Zhao, Z., Song, Q., Hu, J.: Multistability of complex-valued neural networks with time-varying delays. Appl. Math. Comput. 294, 18–35 (2017)
- 20. Shen, H., Zhu, Y., Zhang, L., Park, J.H.: Extended dissipative state estimation for Markov jump neural networks with unreliable links. IEEE Trans. Neural Netw. Learn. Syst. 28(2), 346–358 (2017)
- 21. Shi, Y., Cao, J., Chen, G.: Exponential stability of complex-valued memristor-based neural networks with time-varying delays. Appl. Math. Comput. 313, 222–234 (2017)
- 22. Tan, Y., Tang, S., Yang, J., Liu, Z.: Robust stability analysis of impulsive complex-valued neural networks with time delays and parameter uncertainties. J. Inequal. Appl. 2017, 215 (2017)
- 23. Liu, Y., Xu, P., Lu, J., Liang, J.: Global stability of Clifford-valued recurrent neural networks with time delays. Nonlinear Dyn. 84(2), 767–777 (2016)
- 24. Zhang, D., Kou, K.I., Liu, Y., Cao, J.: Decomposition approach to the stability of recurrent neural networks with asynchronous time delays in quaternion field. Neural Netw. 94, 55–66 (2017)
- 25. Isokawa, T., Kusakabe, T., Matsui, N., Peper, F.: Quaternion neural network and its application. In: Proc. 7th Int. Conf. KES, Oxford, UK, pp. 318–324 (2003)
- 26. Luo, L., Feng, H., Ding, L.: Color image compression based on quaternion neural network principal component analysis. In: Proc. Int. Conf. Multimedia Technol., pp. 1–4 (2010)
- 27. Kusamichi, H., Isokawa, T., Matsui, N., Ogawa, Y., Maeda, K.: A new scheme for color night vision by quaternion neural network. In: Proc. 2nd Int. Conf. Auton. Robots Agents, pp. 101–106 (2004)
- 28. Isokawa, T., Nishimura, H., Kamiura, N., Matsui, N.: Associative memory in quaternionic Hopfield neural network. Int. J. Neural Syst. 18(2), 135–145 (2008)
- 29. Minemoto, T., Isokawa, T., Nishimura, H., Matsui, N.: Quaternionic multistate Hopfield neural network with extended projection rule. Artif. Life Robot. 21(1), 106–111 (2016)
- 30. Chen, X., Song, Q., Li, Z.: Design and analysis of quaternion-valued neural networks for associative memories. IEEE Trans. Syst. Man Cybern. Syst. (2017). http://ieeexplore.ieee.org/document/7970154/. https://doi.org/10.1109/TSMC.2017.2717866
- 31. Liu, Y., Zhang, D., Lou, J., Lu, J., Cao, J.: Stability analysis of quaternion-valued neural networks: decomposition and direct approaches. IEEE Trans. Neural Netw. Learn. Syst. (2017). http://ieeexplore.ieee.org/document/8088357/. https://doi.org/10.1109/TNNLS.2017.2755697
- 32. Liu, Y., Zhang, D., Lu, J., Cao, J.: Global μ-stability criteria for quaternion-valued neural networks with unbounded time-varying delays. Inf. Sci. 360, 273–288 (2016)
- 33. Shu, H., Song, Q., Liu, Y., Zhao, Z., Alsaadi, F.E.: Global μ-stability of quaternion-valued neural networks with non-differentiable time-varying delays. Neurocomputing 247, 202–212 (2017)
- 34. Liu, Y., Zhang, D., Lu, J.: Global exponential stability for quaternion-valued recurrent neural networks with time-varying delays. Nonlinear Dyn. 87(1), 553–565 (2017)
- 35. Chen, X., Li, Z., Song, Q., Hu, J., Tan, Y.: Robust stability analysis of quaternion-valued neural networks with time delays and parameter uncertainties. Neural Netw. 91, 55–65 (2017)
- 36. Tu, Z., Cao, J., Alsaedi, A., Hayat, T.: Global dissipativity analysis for delayed quaternion-valued neural networks. Neural Netw. 89, 97–104 (2017)
- 37. Liu, Y., Wang, Z., Liu, X.: Global exponential stability of generalized recurrent neural networks with discrete and distributed delays. Neural Netw. 19(5), 667–675 (2006)
- 38. Zeng, Z., Huang, T., Zheng, W.X.: Multistability of recurrent neural networks with time-varying delays and the piecewise linear activation function. IEEE Trans. Neural Netw. 21(8), 1371–1377 (2010)
- 39. Liu, X., Chen, T.: Global exponential stability for complex-valued recurrent neural networks with asynchronous time delays. IEEE Trans. Neural Netw. Learn. Syst. 27(3), 593–606 (2016)
- 40. Gopalsamy, K.: Leakage delays in BAM. J. Math. Anal. Appl. 325(2), 1117–1132 (2007)
- 41. Li, X., Fu, X., Balasubramaniam, P., Rakkiyappan, R.: Existence, uniqueness and stability analysis of recurrent neural networks with time delay in the leakage term under impulsive perturbations. Nonlinear Anal., Real World Appl. 11(5), 4092–4108 (2010)
- 42. Song, Q., Zhao, Z., Li, Y.: Global exponential stability of BAM neural networks with distributed delays and reaction–diffusion terms. Phys. Lett. A 335(2), 213–225 (2005)
- 43. Zhou, J., Li, S., Yang, Z.: Global exponential stability of Hopfield neural networks with distributed delays. Appl. Math. Model. 33(3), 1513–1520 (2009)
- 44. Nie, X., Cao, J.: Multistability of competitive neural networks with time-varying and distributed delays. Nonlinear Anal., Real World Appl. 10(2), 928–942 (2009)
- 45. Chen, X., Song, Q.: Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales. Neurocomputing 121, 254–264 (2013)
- 46. Song, Q., Zhao, Z.: Stability criterion of complex-valued neural networks with both leakage delay and time-varying delays on time scales. Neurocomputing 171, 179–184 (2016)
- 47. Boyd, S., Ghaoui, L.E., Feron, E., Balakrishnan, V.: Linear Matrix Inequalities in System and Control Theory. SIAM, Philadelphia (1994)
Copyright information
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.