680 IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS—I: FUNDAMENTAL THEORY AND APPLICATIONS, VOL. 48, NO. 6, JUNE 2001
Convergence Analysis of Cellular Neural Networks with Unbounded Delay
Zhang Yi, Pheng Ann Heng, and Kwong Sak Leung, Senior Member, IEEE
Abstract—Cellular neural networks (CNNs) have been successfully applied in many areas such as classification of patterns, image processing, associative memories, etc. Since they are inherently local in nature, they can be easily implemented in very large scale integration. In the processing of static images, CNNs without delay are often applied, whereas in the processing of moving images, CNNs with delay have been found more suitable. This paper proposes a more general model of CNNs with unbounded delay, which may have potential applications in processing such motion-related phenomena as moving images, and studies the global convergence properties of this model. The dynamic behaviors of CNNs, especially their convergence properties, play important roles in applications. This paper: 1) introduces a class of CNNs with unbounded delay; 2) gives some interesting properties of a network's output function; 3) establishes relationships between a network's state stability and its output stability; and 4) obtains simple and easily checkable conditions for global convergence by functional differential equation methods.
Index Terms—Cellular neural networks, convergence, delay, stability, unbounded delay.
I. INTRODUCTION
NOW WELL-KNOWN, cellular neural networks (CNNs) were first proposed by L. O. Chua and L. Yang in 1988,
see [4], [5]. Since then, they have been widely studied both in theory and applications. They have been successfully applied in signal processing, pattern recognition, and associative memories, especially in processing static images. They are inherently local in nature and are easy to implement in very large scale integration (VLSI).
To process moving images, one must introduce delays in the signals transmitted among the cells. This leads to the model of CNNs with delay (DCNNs), introduced in [15], [16]. They have found applications in different areas such as classification of patterns and reconstruction of moving images. Neural networks with delays have attracted wide interest from many authors in recent years; see, for example, [1], [6], [7], [14]–[18], [20]–[23].
In CNN applications, convergence plays an important role. For example, in the application of associative memories [8] and
Manuscript received October 5, 1999; revised September 18, 2000. This work was supported in part by the Research Grants Council of Hong Kong under Grant CUHK 4306/98E, and in part by the National Science Foundation of China under Grant 69871005. This paper was recommended by Associate Editor P. Szolgay.
The authors are with the Department of Computer Science and Engineering,Chinese University of Hong Kong, Shatin, New Territories, Hong Kong.
Publisher Item Identifier S 1057-7122(01)04287-8.
classification of patterns, the networks must have the global convergence property; in image processing, they are required to converge to the total saturation region. The convergence of CNNs has been investigated by many authors. Their complete stability (every trajectory tends to an equilibrium) was proved in [4] to hold under the condition that the templates are symmetric. A more rigorous proof of complete stability of CNNs was given in [20]. Some stability results for CNNs with nonsymmetric templates can be found in [10], [13], [18], [21].
The convergence properties of CNNs with and without delay are essentially different. How does delay affect the convergence of neural networks? For Hopfield neural networks [3], [14], as in many other dynamical systems, it is well known that delays may result in instability. They can also affect convergence. In [6], it was shown that delays added to a stable CNN can make it unstable. So, to study CNNs with delay, one must address the problem of how to remove this destabilizing effect.
To date, most research on DCNNs has been restricted to simple cases of constant delays. Few papers consider variable or unbounded delay. Though delays arise frequently in practical applications, it is difficult to measure them precisely. In CNNs, it is clear that a constant delay is only a special case. In most situations, delays are variable and, in fact, unbounded; that is, the entire history affects the present. Such delay terms, more suitable for practical neural networks, are called unbounded delays.
Our model contains both variable and unbounded delays. To our knowledge, this is the first paper to study CNNs with unbounded delay. Since many motion-related phenomena can be represented and/or modeled by DCNNs [16], we hope that the model may have applications in processing moving images and associative memories.
From the mathematical point of view, systems with constant delay are different from those with variable and/or unbounded delays, and known mathematical methods do not directly apply. This provides new challenges. Here, we develop methods from functional differential equations to study the convergence of CNNs with unbounded delay.
This paper is organized as follows. Preliminaries are given in Section II. Section III discusses some interesting and important properties of the output function of CNNs with unbounded delay. Section IV establishes relationships between a network's state stability and its output stability. In Section V, we study the global convergence properties of CNNs with unbounded delay. Examples are given in Section VI, and conclusions follow in Section VII.
Fig. 1. Output function f.
II. PRELIMINARIES
We study the model of CNNs with unbounded delay as
(1)
where , and are constants, are continuous functions, and
are nonnegative continuous functions with , for some constant . The n-tuple
denotes the state of the network at time t. The network's output function is defined as f(s) = ½(|s + 1| − |s − 1|)
for all s (Fig. 1). Evidently, |f(s)| ≤ 1 always. We use f(x(t)) to denote the output of the network
at time t.
Model (1) is very general. If , then (1) becomes a model of CNNs with variable delay. Moreover, if the are constants, (1) reduces to the DCNN model that has been widely studied. If
and , then (1) becomes the standard CNN without delay.
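The display equations of model (1) did not survive the conversion of this transcript. For orientation only, a common form of a CNN model with time-varying (and possibly unbounded) delay in the literature is sketched below; the symbols a_{ij}, b_{ij}, u_i, and τ_{ij}(t) are illustrative assumptions and not necessarily the paper's own notation:

```latex
\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{n} a_{ij}\, f\bigl(x_j(t)\bigr)
             + \sum_{j=1}^{n} b_{ij}\, f\bigl(x_j(t - \tau_{ij}(t))\bigr) + u_i,
\qquad i = 1, \dots, n,
```

where f is the output function; the delay is unbounded when τ_{ij}(t) is not required to stay below a fixed constant, so arbitrarily old values of the trajectory may enter the right-hand side.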
For any , the initial condition of (1) is assumed to be

where the are continuous functions. We say is an equilibrium of (1) if it satisfies

for .
Lemma 1: The network (1) always has equilibria.
Proof: Define a mapping as

for . Since , then for . Obviously, the mapping
maps any bounded closed set into a bounded closed set. Then, by the Brouwer fixed-point theorem [9], the map has at least one fixed point , with , i.e.,

giving an equilibrium for the network (1), as required.
For any , we denote
Definition 1: Suppose network (1) has an equilibrium . The equilibrium is called uniformly stable if, for any , there is a strictly positive constant , independent of , such that

for all .
It is clear that this definition of uniform stability is stronger than the usual one, which requires only that for any and any there exists a such that implies that
for all .
Definition 2: An equilibrium for network (1) is globally uniformly asymptotically stable if it is uniformly stable and, for any and , there exists such that for any , implies that

for all .
Definition 3: An equilibrium for network (1) is uniformly output stable if there is a constant such that for any ,

for all and .
Definition 4: An equilibrium for network (1) is globally uniformly asymptotically output stable if it is uniformly output stable and, for any and , there exists such that for any , implies that

for all .
In the above definitions, Definitions 1 and 2 concern a network's state stability, and Definitions 3 and 4 its output stability. In many applications, it is more common to use output stability than state stability.
Throughout this paper, we will use D+ to denote the Dini derivative. For any continuous function v(t), the Dini derivative of v is defined as

D+ v(t) = lim sup (h → 0+) [v(t + h) − v(t)] / h.

It is easy to see that if v is locally Lipschitz, then D+ v(t) is finite.
III. SOME PROPERTIES OF THE OUTPUT FUNCTION
To study network convergence, we need to study the output function, whose properties are crucial for analyzing the convergence of (1).
The piecewise-linear output function is defined as

f(s) = ½(|s + 1| − |s − 1|)

for all s. It makes (1) into a class of nonlinear functional differential systems.
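As a quick sketch, the standard piecewise-linear saturation output used by CNNs, f(s) = ½(|s + 1| − |s − 1|), can be written and checked in a few lines of Python (a hypothetical helper, not from the paper):

```python
def f(s: float) -> float:
    # Piecewise-linear CNN output: identity on [-1, 1],
    # saturated at -1 below and at +1 above.
    return 0.5 * (abs(s + 1.0) - abs(s - 1.0))

# Inside the linear region the output equals the state;
# outside it the output is clamped, so |f(s)| <= 1 always.
print(f(-5.0), f(0.25), f(5.0))  # -> -1.0 0.25 1.0
```

The clamping at ±1 is what makes the "total saturation region" mentioned in the Introduction meaningful: once every cell state leaves [−1, 1], the output is a fixed ±1 pattern.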
Lemma 2: If both x and y are in the same interval among (−∞, −1], [−1, 1], or [1, +∞), then

implies that

for all .
Proof: If x and y are both in (−∞, −1] or both in [1, +∞), then we have f(x) = f(y). In these cases the result follows easily.
Next, we suppose that both x and y are in [−1, 1]; then f(x) = x and f(y) = y. We consider three cases.
Case 1: If , we have , then

Case 2: If , we have , then

Case 3: If , we have , then

The proof is complete.
This is an interesting and important property of the outputfunction. It will be used in Section V to analyze the convergenceof the network.
Lemma 3: We have

|f(x) − f(y)| ≤ |x − y|

for all x and y.
Proof: The result follows from the definition of f.
Lemma 4: For all x and y, we have

sign(x − y) · [f(x) − f(y)] ≥ 0.

Proof: Since f is nondecreasing, the result follows directly.
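The exact displayed inequalities of Lemmas 3 and 4 were garbled in this transcript; the underlying properties — that the output function is globally 1-Lipschitz and nondecreasing — can be spot-checked numerically. The following is a sketch, assuming the standard piecewise-linear f:

```python
import random

def f(s: float) -> float:
    # Standard piecewise-linear CNN output function.
    return 0.5 * (abs(s + 1.0) - abs(s - 1.0))

random.seed(1)
pts = [random.uniform(-4.0, 4.0) for _ in range(300)]
for x in pts:
    for y in pts:
        # Lemma 3 style bound: output deviation never exceeds state deviation.
        assert abs(f(x) - f(y)) <= abs(x - y) + 1e-12
        # Lemma 4 style sign condition: f is nondecreasing.
        assert (x - y) * (f(x) - f(y)) >= 0.0
print("1-Lipschitz and monotonicity hold on all sampled pairs")
```

These two properties are what let the proofs in Sections IV and V bound output deviations by state deviations and control the signs of the coupling terms.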
IV. RELATIONSHIP BETWEEN STABILITY AND OUTPUT STABILITY

In this section, we study the relationships between a network's state stability and its output stability.
At any time t, the state of (1) is x(t) and the output of the network is f(x(t)). The network's output at a given time is the image of its state at that time under the mapping f. That is,

Since f is nonlinear, the relation between the dynamic behaviors of the network's state and its output is not simple. In many practical applications, the dynamic behaviors of the network's output are both more interesting and easier to measure than those of the state. In any case, it is useful to understand the relationships between a network's state stability and its output stability. We derive such properties below.
Theorem 1: An equilibrium of network (1) is uniformly stable if and only if it is uniformly output stable.
Proof: By Lemma 3, we have

for all . Thus, necessity follows easily. Next, we prove sufficiency.
Denote
for . Rewrite (1) as
(2)
Then, from (2) we have
(3)
Since the equilibrium is uniformly output stable, for any there is such that
for all . From (3), we get
for all , where , and so
(4)
for all , where
The equilibrium is thus uniformly stable, and the proof iscomplete.
Theorem 2: An equilibrium of network (1) is globally uniformly asymptotically stable if and only if it is globally uniformly asymptotically output stable.
Proof: The necessity is easy to prove, as in Theorem 1. Since Theorem 1 shows that uniform stability of the equilibrium is equivalent to its uniform output stability, to complete the proof of sufficiency we only need to prove that for any , there exists such that for any , any ,
implies that
for all .
It is easy to see from (4) that implies
for all .
Let , and choose
By the assumption that the equilibrium is globally uniformly asymptotically output stable, there exists such that for any , implies
for all .
Let us choose a constant so large that
Then, for any , we have from (3) that
and then
for all .
Choose such that , and denote
, then
for all and . This completes the proof.
The network's state stability is thus equivalent to its output stability.
V. GLOBAL CONVERGENCE ANALYSIS
In this section, we study the global convergence of the CNN (1) with unbounded delay. Convergence analysis aims to find relatively weak, simple, and checkable criteria for a network's convergence, which play an important role in many applications.
The conditions in Theorem 3 are quite simple, and are easy tocheck. They are totally independent of any delays in the network(1).
Theorem 3: Suppose

for , where

if
if .

Then, the network (1) has only one equilibrium, which is globally uniformly asymptotically output stable.
Proof: Let us first derive an inequality that will be used many times in the proof. By Lemma 1, (1) has at least one equilibrium . Rewrite (1) in the form (2). From (2), by Lemmas 3 and 4, we have

(5)
The proof of the theorem is in two parts. In the first part, we prove that the equilibrium is uniformly output stable. That is, we prove that for any

(6)

for all . The proof of this assertion is as follows.
Suppose that the equilibrium is not uniformly output stable. Then there exist some and and such that
Fig. 2. |f(x(t)) − f(x*)|.
and is strictly monotonically increasing on the interval . (See Fig. 2 for an intuitive explanation.) Moreover,

for all , while from (5) it follows that

where . By continuity, there must exist a such that

for all . Moreover, by continuity of the solutions of network (1), we can choose so close to that for any
, both and are in one of the three intervals (−∞, −1], [−1, 1], and [1, +∞). Then, by Lemma 2, we have

for all . This contradicts the fact that the function is strictly monotonically increasing on the interval , and so (6) holds. The proof of uniform output stability is complete.
In the second part of the proof, we show that for any and , there is a such that for any
, implies

(7)

for all . This requires several steps.
First, it is clear from (6) that implies
, for and all . By Theorem 1, there exists a constant such that ,
for and all .
Define
and
and choose a constant such that
Let be the first nonnegative integer such that ,and take
where is a nonnegative integer and
Now, we are in a position to prove (7). We first use mathematical induction to prove that implies

(8)

for all . Clearly, this will imply (7).
Obviously, (8) holds for . Suppose (8) holds for some
, i.e.,
(9)
for all . We shall prove in two steps that
(10)
for all .Firstly, we prove there exists a such that
(11)
Otherwise, there exists some such that
(12)
for all . Then, from (9) and (12), we have
(13)
for all and . From (5), (12), and (13), it follows for all that
(14)
and so
This contradiction shows that (11) holds.
To complete the proof of (10), we prove
(15)
for all . If not, there must exist a and some such that
and
(16)
for all . Then, from (9)
for all . By a calculation similar to that of (14), we can show that . By continuity of the solutions of (1), there
must exist a such that both and are in one of the three intervals (−∞, −1], [−1, 1], and [1, +∞), and
By Lemma 2, we have
which contradicts (16). This proves (15).
Note that . Then (10) follows from (15), and so (8) holds.
Taking in (8), we obtain that implies

for all , where is independent of . This proves (7).
The above proof shows that the equilibrium is globally uniformly asymptotically output stable. In this situation, the network cannot have any other equilibrium. The proof is complete.
Corollary: Consider a CNN with variable delays
(17)
for and , where the delays are assumed to be any bounded continuous functions. If
(with as before) we have
then the network (17) has only one equilibrium, and this equilibrium is globally uniformly asymptotically stable.
Obviously, the above stability conditions are independent of the delays. This property is very useful in practical applications, since the delays in neural networks cannot easily be known exactly in practice. Recently, in [11], [12], the stability of neural networks with a variable and differentiable delay was studied. The stability conditions given in [11], [12] depend on this delay.
VI. EXAMPLES
In this section, we give some examples to illustrate our theory. Let us first consider a case with variable delays.
Fig. 3. Three-dimensional phase space (x_1, x_2, x_3).
Example 1: Consider a CNN with variable delays
(18)
where the delays (i = 1, 2, 3) are assumed to be any bounded continuous functions. Such delays could be uncertain. Using the Corollary above, it is clear that this network has only one equilibrium, which is globally uniformly asymptotically stable. It can be calculated to be at (0.5, 1.5, 0.5). However, the global convergence of (18) cannot be checked by any previously known criteria, say, the stability conditions in [11], [12]. Simulations of the convergence of (18) in the case of
(i = 1, 2, 3) are shown in Fig. 3.
Example 2: Consider a two-dimensional CNN with unbounded delay
for . Noting that

the stability conditions in Theorem 3 are satisfied. Hence, this network has a unique equilibrium, and this equilibrium is globally uniformly asymptotically output stable. The equilibrium can be found to be .
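The concrete templates of the examples were lost in this transcript, but the delay-independence claim can be illustrated numerically. The following Python sketch simulates a small two-cell delayed CNN with hypothetical templates (chosen so that each row satisfies a smallness condition in the spirit of the delay-independent criteria here), using forward-Euler integration with a history buffer; two different constant initial histories settle at the same equilibrium:

```python
import numpy as np

def f(x):
    # Standard piecewise-linear CNN output, applied elementwise:
    # f(s) = 0.5 * (|s + 1| - |s - 1|).
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def simulate(A, B, u, tau, hist, h=0.01, steps=20000):
    """Forward-Euler integration of a small delayed CNN of the (assumed) form
    x'(t) = -x(t) + A f(x(t)) + B f(x(t - tau)) + u,
    started from the constant initial history x(s) = hist for s <= 0."""
    d = int(round(tau / h))                 # delay measured in steps
    x = np.asarray(hist, dtype=float)
    buf = [x.copy() for _ in range(d + 1)]  # rolling history buffer
    for _ in range(steps):
        x_delayed = buf[0]                  # state tau time units ago
        x = x + h * (-x + A @ f(x) + B @ f(x_delayed) + u)
        buf.pop(0)
        buf.append(x.copy())
    return x

# Hypothetical templates: each row satisfies sum_j (|A[i,j]| + |B[i,j]|) < 1.
A = np.array([[0.2, 0.1], [0.1, 0.2]])
B = np.array([[0.1, 0.05], [0.05, 0.1]])
u = np.array([0.5, -0.5])

x_a = simulate(A, B, u, tau=1.0, hist=[2.0, 2.0])
x_b = simulate(A, B, u, tau=1.0, hist=[-3.0, 1.0])
print(np.allclose(x_a, x_b, atol=1e-4))  # both histories reach the same point
```

Changing `tau` leaves the limit point unchanged, which is the practical content of delay-independent convergence conditions: the criterion is checked on the coefficients alone, with no knowledge of the delay.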
VII. CONCLUSIONS
It is well known that delays often appear in artificial neural networks, though identifying them is not easy: an apparently constant delay is only an ideal and simplified case. In most situations, delays are variable and may extend over all the past.
Since delays may change dynamic behavior, it is important to analyze the convergence of networks with delay. In this paper, we have studied a model of CNNs that contains variable, unbounded delays and is thus more general. It may have applications in the processing of motion-related phenomena such as moving images and associative memories. We have established relationships between a network's state stability and its output stability, deriving simple, checkable conditions for global convergence. These stability conditions depend only on the network's coefficients and are totally independent of the delays. We are currently investigating applications of this model.
ACKNOWLEDGMENT
The authors wish to thank the reviewers for their many helpful comments and suggestions, and Dr. T. Poston for his careful proofreading and editing of the manuscript.
REFERENCES
[1] S. Arik and V. Tavsanoglu, "Equilibrium analysis of delayed CNNs," IEEE Trans. Circuits Syst. I, vol. 45, pp. 168–171, Feb. 1998.
[2] ——, "On the global asymptotic stability of delayed cellular neural networks," IEEE Trans. Circuits Syst. I, vol. 47, pp. 571–574, Apr. 2000.
[3] P. Baldi and A. F. Atiya, "How delays affect neural dynamics and learning," IEEE Trans. Neural Networks, vol. 5, pp. 612–621, Apr. 1994.
[4] L. O. Chua and L. Yang, "Cellular neural networks: Theory," IEEE Trans. Circuits Syst., vol. 35, pp. 1257–1272, Oct. 1988.
[5] ——, "Cellular neural networks: Applications," IEEE Trans. Circuits Syst., vol. 35, pp. 1273–1290, Oct. 1988.
[6] P. P. Civalleri, M. Gilli, and L. Pandolfi, "On the stability of cellular neural networks with delay," IEEE Trans. Circuits Syst. I, vol. 40, pp. 157–165, Mar. 1993.
[7] M. Gilli, "Stability of cellular neural networks and delayed neural networks with nonpositive templates and nonmonotonic output functions," IEEE Trans. Circuits Syst. I, vol. 41, pp. 518–528, Aug. 1994.
[8] G. Grassi, "A new approach to design cellular neural networks for associative memories," IEEE Trans. Circuits Syst. I, vol. 44, pp. 835–838, Sept. 1998.
[9] M. W. Hirsch, Differential Topology, Graduate Texts in Mathematics, vol. 33. New York: Springer-Verlag, 1976.
[10] M. Forti and A. Tesi, "New conditions for global stability of neural networks with application to linear and quadratic programming problems," IEEE Trans. Circuits Syst. I, vol. 42, pp. 354–366, Sept. 1995.
[11] M. Joy, "On the global convergence of a class of functional differential equations with applications in neural network theory," J. Math. Anal. Appl., vol. 232, no. 1, pp. 61–81, 1999.
[12] ——, "Results concerning the absolute stability of delayed neural networks," Neural Netw., vol. 13, no. 6, pp. 613–616, 2000.
[13] M. P. Joy and V. Tavsanoglu, "An equilibrium analysis of CNNs," IEEE Trans. Circuits Syst. I, vol. 45, pp. 94–98, Jan. 1998.
[14] C. M. Marcus and R. M. Westervelt, "Stability of analog neural networks with delay," Phys. Rev. A, vol. 39, pp. 347–359, 1989.
[15] T. Roska and L. O. Chua, "Cellular neural networks with delay type template elements and nonuniform grids," Int. J. Circuit Theory Applicat., vol. 20, no. 4, pp. 469–481, 1992.
[16] T. Roska, C. W. Wu, M. Balsi, and L. O. Chua, "Stability and dynamics of delay-type general and cellular neural networks," IEEE Trans. Circuits Syst. I, vol. 39, pp. 487–490, June 1992.
[17] N. Takahashi, "A new sufficient condition for complete stability of cellular neural networks with delay," IEEE Trans. Circuits Syst. I, vol. 47, pp. 793–799, June 2000.
[18] N. Takahashi and L. O. Chua, "A new sufficient condition for nonsymmetric CNNs to have a stable equilibrium point," IEEE Trans. Circuits Syst. I, vol. 44, pp. 1092–1095, Nov. 1998.
[19] P. L. Venetianer and T. Roska, "Image compression by delayed CNNs," IEEE Trans. Circuits Syst. I, vol. 45, pp. 205–215, Mar. 1998.
[20] C. W. Wu and L. O. Chua, "A more rigorous proof of complete stability of cellular neural networks," IEEE Trans. Circuits Syst. I, vol. 44, pp. 370–371, Apr. 1997.
[21] Z. Yi, P. A. Heng, and A. W. Fu, "Estimate of exponential convergence rate and exponential stability for neural networks," IEEE Trans. Neural Networks, vol. 10, no. 6, pp. 1487–1493, 1999.
[22] Z. Yi, "Global exponential stability and periodic solutions of delay Hopfield neural networks," Int. J. Syst. Sci., vol. 27, no. 2, pp. 227–231, 1996.
[23] Z. Yi, S. M. Zhong, and Z. L. Li, "Periodic solutions and stability of Hopfield neural networks with variable delays," Int. J. Syst. Sci., vol. 27, no. 9, pp. 895–901, 1996.
Zhang Yi received the B.S. degree in mathematics from Sichuan Normal University, Chengdu, China, in 1983, the M.S. degree in mathematics from Hebei Normal University, Shijiazhuang, China, in 1986, and the Ph.D. degree in mathematics from the Institute of Mathematics, The Chinese Academy of Sciences, Beijing, China, in 1994.
From 1989 to 1990, he was a Senior Visiting Scholar in the Department of Automatic Control and Systems Engineering, The University of Sheffield, Sheffield, U.K. He is currently a Professor at the Department of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China, and is involved in research in the Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong. He is an author or co-author of more than 70 journal papers. His current research interests include computational intelligence and data mining.
Pheng Ann Heng received the B.Sc. degree from the National University of Singapore, Singapore, in 1985, and the M.Sc. degree in computer science, the M.A. degree in applied mathematics, and the Ph.D. degree in computer science, all from Indiana University, Bloomington, in 1987, 1988, and 1992, respectively.
He is an Associate Professor at the Department of Computer Science and Engineering at The Chinese University of Hong Kong, Hong Kong, where he is also the Director of the Virtual Reality, Visualization and Imaging Research Centre. His research interests include virtual reality applications in medicine, scientific visualization, 3-D medical imaging, user interfaces, rendering and modeling, and interactive graphics and animation.
Kwong Sak Leung (M'77–SM'89) received the B.Sc. (Eng.) and Ph.D. degrees from the University of London, Queen Mary College, London, U.K., in 1977 and 1980, respectively.
He was a Senior Engineer on contract R&D at ERA Technology, Leatherhead, U.K., and later joined the Central Electricity Generating Board, London, U.K., to work on nuclear power station simulators. In 1985, he joined the Computer Science and Engineering Department at The Chinese University of Hong Kong, Hong Kong, where he is currently Professor and Chairman of the department. His research interests are in soft computing, including evolutionary computation, neural computation, probabilistic search, information fusion and data mining, and fuzzy data and knowledge engineering. He has published over 140 papers and two books on fuzzy logic and evolutionary computation.
Dr. Leung has been Chair and member of many program and organizing committees of international conferences, and is on the Editorial Board of Fuzzy Sets and Systems. He is a Chartered Engineer, a member of the Institution of Electrical Engineers and the Association for Computing Machinery, and a Fellow of the Hong Kong Computer Society.