New Criteria for Input-to-State Stability of Cohen-Grossberg Neural Networks with Markovian Switching
Li Zhihong, He Xiuli
(College of Science, Hohai University, Nanjing, Jiangsu 210098, China)
Abstract: For the stability problem of Cohen-Grossberg neural networks with Markovian switching, new criteria for input-to-state stability of the networks are established by means of Lyapunov function theory and the Halanay inequality technique, and a numerical example is given to verify the correctness and effectiveness of the results.
Keywords: Cohen-Grossberg neural networks; input-to-state stability; Markovian switching; Halanay differential inequality
In 1983, Cohen and Grossberg proposed a neural network model [1], now known as the Cohen-Grossberg neural network. Because of its wide range of applications in fields such as pattern recognition, memory and signal processing, image processing and computing, it has received considerable attention. In practical applications, however, inputs and switching are unavoidable, and these factors often cause the whole system to oscillate or even become unstable. Research on Cohen-Grossberg neural networks with inputs [2-4] and switching [5-6] has therefore attracted growing interest. The most common type of switching is Markovian switching [7,10], so studying the stability of neural networks with Markovian switching is of great importance. This paper generalizes the Halanay inequality [8] and constructs a more general Lyapunov function; the main results improve and extend some conclusions in the existing literature [9-10]. We not only derive new criteria for exponential input-to-state stability of the model, but also give an estimate of the exponential convergence rate of the solutions.
1 Model and Preliminaries
Consider the Cohen-Grossberg neural network model
x′k(t) = hk(xk(t))[-dk(r(t),xk(t)) + ∑_{l=1}^{n} akl(r(t))fl(xl(t)) + ∑_{l=1}^{n} bkl(r(t))fl(xl(t-τ(t))) + uk(r(t),t)],
(1)
where k = 1,2,…,n, n is the number of neurons in the network, xk(t) is the state of the k-th neuron at time t, hk is the amplification function, dk is an appropriately behaved function, A(i) = (akl(i))n×n and B(i) = (bkl(i))n×n are the connection weight matrix and the delayed connection weight matrix, respectively, fl is the neuron activation function, uk(i,t) is an external input of the network, {r(t), t ≥ 0} is the switching signal, a right-continuous Markov chain taking values in a finite set S, and τ(t) is the time-varying delay satisfying 0 ≤ τ(t) ≤ τ, τ′(t) ≤ ρ < 1, where τ and ρ are positive constants.
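To make the dynamics of model (1) concrete, the following is a minimal simulation sketch of a two-neuron instance under Markovian switching; the amplification functions, behaved functions, activation, weight matrices, transition rates, inputs and delay used here are illustrative assumptions, not values taken from the paper.

```python
# Minimal simulation sketch of model (1): Euler scheme for a two-neuron
# Cohen-Grossberg network with Markovian switching and a bounded delay.
# All concrete values below (h, d, f, A(l), B(l), the transition-rate matrix,
# the input u and the delay) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

dt, T, tau = 0.001, 20.0, 0.8            # step size, horizon, constant delay
n_delay = int(tau / dt)                  # delay expressed in steps

# Two switching modes l = 0, 1 with assumed weight matrices A(l), B(l).
A = [np.array([[-0.3, 0.2], [0.1, -0.4]]),
     np.array([[-0.2, 0.1], [0.3, -0.5]])]
B = [np.array([[0.1, -0.2], [0.2, 0.1]]),
     np.array([[-0.1, 0.2], [0.1, -0.2]])]
Gamma = np.array([[-1.0, 1.0], [2.0, -2.0]])    # generator of the Markov chain r(t)

h = lambda x: np.array([0.9 + 0.1 * np.sin(x[0]), 0.7 + 0.3 * np.cos(x[1])])
d = lambda l, x: 2.0 * x                        # assumed behaved function d(l, x)
f = np.tanh                                     # assumed activation function
u = lambda l, t: np.array([0.5, -0.5]) if l == 0 else np.array([0.2, 0.3])

steps = int(T / dt)
x = np.zeros((steps + 1, 2))
x[:n_delay + 1] = np.array([0.5, -0.3])         # constant initial history on [-tau, 0]
mode = 0

for i in range(n_delay, steps):
    t = (i - n_delay) * dt
    # Markovian switching: leave the current mode with rate -Gamma[mode, mode].
    if rng.random() < -Gamma[mode, mode] * dt:
        mode = 1 - mode
    xd = x[i - n_delay]                          # delayed state x(t - tau), tau constant here
    rhs = -d(mode, x[i]) + A[mode] @ f(x[i]) + B[mode] @ f(xd) + u(mode, t)
    x[i + 1] = x[i] + dt * h(x[i]) * rhs         # Euler step of model (1)

print("state at t = T:", x[-1])
```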
For V ∈ C^{1,1}(Rn × R+ × S; R+), the operator LV associated with model (1) is defined by
LV(x(t),t,i) = Vt(x(t),t,i) + Vx(x(t),t,i){H(x(t))[-D(i,x(t)) + A(i)f(x(t)) + B(i)f(x(t-τ(t))) + u(i,t)]} + ∑_{j∈S} πij V(x(t),t,j),
where H(x(t)) = diag{h1(x1(t)),…,hn(xn(t))}, D(i,x(t)) = (d1(i,x1(t)),…,dn(i,xn(t)))^T, f(x(t)) = (f1(x1(t)),…,fn(xn(t)))^T, and (πij) is the transition rate matrix of the Markov chain r(t) (see [12]).
The neural network model (1) is required to satisfy the following assumptions.
Definition 1[9] If there exist a KL-function β: R+ × R+ → R+ and a K-function γ(·) such that x(t;ξ,u(t)) satisfies
|x(t;ξ,u(t))| ≤ β(t,‖ξ‖τ) + γ(‖u‖∞),
then system (1) is said to be input-to-state stable.
Definition 2[9] If there exist a constant λ > 0 and two K-functions β, γ such that x(t;ξ,u(t)) satisfies
|x(t;ξ,u(t))| ≤ β(‖ξ‖τ)exp(-λt) + γ(‖u‖∞),
then system (1) is said to be exponentially input-to-state stable.
Definition 3[11] Let g be a continuous function from R to R. The Dini derivative of g is given by
D+g(t) = lim sup_{h→0+} [g(t+h) - g(t)]/h.
2 Stability Analysis
Lemma 1 Let α1 > α2 > 0 and write γ_t = γ(|u|_{[0,t]})/(α1 - α2). If, along the solution of system (1),
D+EV(r(t),x(t)) ≤ -α1EV(r(t),x(t)) + α2EV(r(t-τ),x(t-τ)) + γ(|u|_{[0,t]}),  t ≥ 0,
(2)
then
EV(r(t),x(t)) ≤ V(r(0),x(0))exp(-λ*t) + γ_t,  t ≥ 0,
(3)
where λ* > 0 is any constant satisfying α1 - α2exp(λ*τ) > λ*; such a constant exists because α1 > α2.
Proof Set W(r(t),x(t)) = EV(r(t),x(t)) - γ_t and H(t) = V(r(0),x(0))exp(-λ*t). Then (3) is equivalent to W(r(t),x(t)) ≤ H(t) for all t ≥ 0. If this does not hold, then there must exist t* = inf{t ≥ 0: W(r(t),x(t)) > H(t)} such that
W(r(s),x(s)) ≤ H(s),  s ≤ t*,
(4)
W(r(t*),x(t*)) = H(t*),
(5)
D+W(r(t*),x(t*)) ≥ D+H(t*).
(6)
By Definition 3 and (2), we have
D+W(r(t*),x(t*)) ≤ -α1EV(r(t*),x(t*)) + α2EV(r(t*-τ),x(t*-τ)) + (α1-α2)γ_{t*}.
(7)
By (5), EV(r(t*),x(t*)) - γ_{t*} = V(r(0),x(0))exp(-λ*t*). By (4), W(r(t*-τ),x(t*-τ)) ≤ H(t*-τ) = V(r(0),x(0))exp(-λ*(t*-τ)) = V(r(0),x(0))exp(λ*τ)exp(-λ*t*), that is, EV(r(t*-τ),x(t*-τ)) - γ_{t*-τ} ≤ V(r(0),x(0))exp(λ*τ)exp(-λ*t*). Substituting these two relations into (7) gives
D+W(r(t*),x(t*)) ≤ -α1[EV(r(t*),x(t*)) - γ_{t*}] + α2[EV(r(t*-τ),x(t*-τ)) - γ_{t*}]
≤ -α1V(r(0),x(0))exp(-λ*t*) + α2V(r(0),x(0))exp(λ*τ)exp(-λ*t*)
= (-α1 + α2exp(λ*τ))V(r(0),x(0))exp(-λ*t*)
< -λ*V(r(0),x(0))exp(-λ*t*) = D+H(t*),
which contradicts (6). Hence (3) holds.
This completes the proof.
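The admissible decay rate λ* in Lemma 1 can be determined numerically. The following is a minimal sketch (an illustration only, not part of the original proof) that uses bisection to find the positive root of λ = α1 - α2exp(λτ); any λ* strictly below this root satisfies α1 - α2exp(λ*τ) > λ* and can be used in (3).

```python
# Bisection for the positive root of g(lam) = alpha1 - alpha2*exp(lam*tau) - lam.
# Every lambda* strictly below this root satisfies the condition of Lemma 1
# (alpha1 - alpha2*exp(lambda*tau) > lambda*); requires alpha1 > alpha2 > 0.
import math

def decay_rate(alpha1: float, alpha2: float, tau: float, tol: float = 1e-10) -> float:
    if not (alpha1 > alpha2 > 0):
        raise ValueError("Lemma 1 requires alpha1 > alpha2 > 0")
    g = lambda lam: alpha1 - alpha2 * math.exp(lam * tau) - lam
    lo, hi = 0.0, alpha1        # g(0) = alpha1 - alpha2 > 0 and g(alpha1) < 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(mid) > 0 else (lo, mid)
    return lo

# Constants of the numerical example in Section 3: alpha1 = 1.87, alpha2 = 0.32, tau = 1.
print(decay_rate(1.87, 0.32, 1.0))
```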
Theorem 1 Under Assumptions 1–4, if there exist positive definite matrices S(l) and positive diagonal matrices P(l) = diag{p1(l),p2(l),…,pn(l)}, Q(l) = diag{q1(l),q2(l),…,qn(l)} such that
and c1δ² > c2ητ², then system (1) is input-to-state stable and its convergence rate is at least λ/2, where
Proof Consider the following Lyapunov function
(8)
From (8) it follows that
c1|x|² ≤ V(x(t),l) ≤ c2|x|²,
(9)
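As an illustration only: if the quadratic part of the Lyapunov function (8) is x^T(t)S(l)x(t), which is consistent with the 2x^T(t)S(l)[…] terms in the estimate of LV below, then c1 and c2 in (9) may be taken as the smallest and largest eigenvalues of the matrices S(l) over all modes. A short sketch with placeholder matrices:

```python
# Illustrative computation of c1, c2 in (9), assuming the quadratic part of the
# Lyapunov function is x^T S(l) x; the matrices S(l) below are placeholders,
# not the ones used in the paper.
import numpy as np

S = [np.array([[1.0, 0.1], [0.1, 1.1]]),   # assumed S(1)
     np.array([[1.2, 0.0], [0.0, 1.0]])]   # assumed S(2)

eigs = np.concatenate([np.linalg.eigvalsh(Sl) for Sl in S])
c1, c2 = eigs.min(), eigs.max()            # c1*|x|^2 <= x^T S(l) x <= c2*|x|^2 for every mode l
print(c1, c2)
```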
By Assumptions 1–4 and the definition of LV, we have
LV(x(t),l) = 2x^T(t)S(l)[H(x(t))(-D(l,x(t)) + A(l)f(x(t)) + B(l)f(x(t-τ(t))) + u(l,t))] +
2f^T(x(t))Q(l)[H(x(t))(-D(l,x(t)) + A(l)f(x(t)) + B(l)f(x(t-τ(t))) + u(l,t))] + …
= 2x^T(t)S(l)[H(x(t))(-D(l,x(t)) + A(l)f(x(t)) + B(l)f(x(t)) + u(l,t))] +
2f^T(x(t))Q(l)[H(x(t))(-D(l,x(t)) + A(l)f(x(t)) + B(l)f(x(t)) + u(l,t))] +
2x^T(t)S(l)H(x(t))B(l)(f(x(t-τ(t))) - f(x(t))) + 2f^T(x(t))Q(l)H(x(t))B(l)(f(x(t-τ(t))) - f(x(t))) + …
≤ (x^T(t), f^T(x(t)))W(l)(x^T(t), f^T(x(t)))^T + δ|x(t)|² + 2δ|f(x(t))|² + …
From (1) we obtain
Substituting the above expression into (10) and applying the generalized Itô formula yields
By Lemma 1, we obtain
This completes the proof.
3 Numerical Example
Consider the following two-dimensional delayed Cohen-Grossberg neural network system
h1(x1(t)) = 0.9 + 0.1sin x1(t),  h2(x2(t)) = 0.7 + 0.3cos x2(t),
Take τ(t) = 0.8 ≤ τ = 1; then one can compute
c1 = 2, c2 = 2.22, δ = 4.15, α1 = 1.87, α2 = 0.32.
Clearly α1 > α2, so all the conditions of Lemma 1 are satisfied. The simulations show that the trajectories in Fig. 1 (with input) are input-to-state stable, and those in Fig. 2 (without input) are also stable; therefore system (1) is input-to-state stable.
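Since the system matrices of this example are not reproduced above, the following sketch only mirrors the qualitative content of Fig. 1 and Fig. 2: it simulates a two-neuron system of the same form with the amplification functions h1, h2 given above, once with a bounded time-varying input and once with zero input; the behaved function, weight matrices and input are assumed for illustration.

```python
# Sketch mirroring Figs. 1-2: the two-neuron example with the amplification
# functions given above, simulated with a bounded input and with zero input.
# The behaved function d, the weights A, B and the input u are assumptions.
import numpy as np

dt, T, tau = 0.001, 15.0, 0.8
nd, steps = int(tau / dt), int(T / dt)

h = lambda x: np.array([0.9 + 0.1 * np.sin(x[0]), 0.7 + 0.3 * np.cos(x[1])])
A = np.array([[-0.5, 0.2], [0.1, -0.6]])   # assumed connection weights
B = np.array([[0.2, -0.1], [0.1, 0.2]])    # assumed delayed connection weights

def run(with_input: bool) -> np.ndarray:
    x = np.zeros((steps + 1, 2))
    x[:nd + 1] = np.array([1.0, -1.0])     # constant initial history on [-tau, 0]
    for i in range(nd, steps):
        t = (i - nd) * dt
        u = np.array([0.4 * np.sin(t), 0.4 * np.cos(t)]) if with_input else np.zeros(2)
        rhs = -2.0 * x[i] + A @ np.tanh(x[i]) + B @ np.tanh(x[i - nd]) + u
        x[i + 1] = x[i] + dt * h(x[i]) * rhs
    return x

x_in, x_free = run(True), run(False)
print("|x(T)| with input   :", np.linalg.norm(x_in[-1]))
print("|x(T)| without input:", np.linalg.norm(x_free[-1]))
```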
References:
[1] Cohen M A, Grossberg S. Absolute stability of global pattern formation and parallel memory storage by competitive neural networks [J]. IEEE Trans Syst Man Cybern, 1983, SMC-13(4): 815-826.
[2] Yang Z C, Zhou W S, Huang T W. Exponential input-to-state stability of recurrent neural networks with multiple time-varying delays [J]. Cogn Neurodyn, 2014, 8(9): 47-54.
[3] Zhu Q X, Cao J D, Rakkiyappan R. Exponential input-to-state stability of stochastic Cohen-Grossberg neural networks with mixed delays [J]. Nonlinear Dyn, 2015, 79(2): 1085-1098.
[4] Zhu S, Shen Y. Two algebraic criteria for input-to-state stability of recurrent neural networks with time-varying delays [J]. Neural Comput Applic, 2013, 22(9): 1163-1169.
[5] Ahn C K. Passive learning and input-to-state stability of switched Hopfield neural networks with time-delay [J]. Information Sciences, 2010, 180: 4582-4594.
[6] Xu Y, Luo W W, Zhong K, et al. Mean square input-to-state stability of a general class of stochastic recurrent neural networks with Markovian switching [J]. Neural Comput Applic, 2014, 25(2): 1657-1663.
[7] Huang H, Ho D W C, Qu Y Z. Robust stability of stochastic delayed additive neural networks with Markovian switching [J]. Neural Netw, 2007, 20(5): 799-809.
[8] Halanay A. Differential equations: stability, oscillation, time lags [M]. New York: Academic Press, 1966.
[9] Zhou W S, Teng L Y, Xu D Y. Mean-square exponentially input-to-state stability of stochastic Cohen-Grossberg neural networks with time-varying delays [J]. Neurocomputing, 2015, 153(4): 54-61.
[10] Shen Y, Wang J. Almost sure exponential stability of recurrent neural networks with Markovian switching [J]. IEEE Transactions on Neural Networks, 2009, 20(5): 840-855.
[11] Liao X X, Luo Q, Zeng Z G, et al. Global exponential stability in Lagrange sense for recurrent neural networks with time delays [J]. Nonlinear Analysis: Real World Applications, 2008, 9(3): 1535-1557.
[12] Mao X, Yuan C. Stochastic differential equations with Markovian switching [M]. London: Imperial College Press, 2006.
New Criteria for Input-to-State Stability of Cohen-Grossberg Neural Networks with Markovian Switching
Li Zhihong, He Xiuli
(College of Science, Hohai University, Nanjing 210098, China)
Abstract: Aiming at the stability of Cohen-Grossberg neural networks with Markovian switching, the Lyapunov function method and the Halanay differential inequality technique were used to establish new criteria for input-to-state stability of the networks. Finally, numerical examples were given to illustrate the correctness and effectiveness of the results.
Keywords:input to state stability; Cohen-Grossberg neural networks; Halanay differential inequalities; Markovian switching
Received: 2015-11-04
Funding: the Fundamental Research Funds for the Central Universities, Young Teachers' Scientific Research and Innovation Ability Cultivation Project (A) (2015B19814)
About the authors: Li Zhihong (1992-), female, born in Taizhou, Jiangsu; master's student (enrolled 2014) at Hohai University; research interest: stability analysis of neural networks; E-mail: 1353285605@qq.com. Corresponding author: He Xiuli (1979-), female, born in Huanggang, Hubei; doctoral candidate and lecturer; research interest: stochastic dynamical systems; E-mail: hexiu00@163.com
Article ID: 1004-1729(2016)01-0012-07
CLC number: O175
Document code: A  DOI: 10.15886/j.cnki.hdxbzkb.2016.0003