ORIGINAL ARTICLE
Analysis of associative memories based on stability of cellular neural networks with time delay
Qi Han • Xiaofeng Liao • Chuandong Li
Received: 29 October 2011 / Accepted: 5 January 2012 / Published online: 20 January 2012
© Springer-Verlag London Limited 2012
Abstract In this paper, associative memories based on cellular neural networks with time delay are presented. In some previous papers, the relations imposed between the cloning templates are close and strong; therefore, methods are proposed here to loosen those relations. First, some results on the stability of cellular neural networks are given. Then, associative memories based on cellular neural networks are constructed on the basis of these results. In addition, a design procedure for the associative memories is introduced. Finally, some examples are given to verify the theoretical results and the design procedure.
Keywords Cellular neural networks · Associative memories · Cloning template · Time delay
1 Introduction
Cellular neural networks (CNNs) were first introduced in 1988 [1, 2]. CNNs are well suited to very large-scale integration implementations due to their local interconnection property and have found applications in a variety of areas, such as image processing [3], pattern recognition [4], medical diagnosis [5], and associative memories [6]. In this paper, we mainly discuss the application of CNNs with time delay to associative memories. At its simplest, an associative memory is a system that stores mappings from specific input patterns to specific output patterns. That is to say, a system that "associates" two patterns is one in which, when one of the two patterns is presented, the other can be reliably recalled. There are two kinds of associative memory: auto-associative memories and hetero-associative memories.
Since Liu and Michel [7] reported that CNNs are effective as an associative memory medium, associative memories have received a great deal of interest. Next, we introduce some papers about associative memories in chronological order. Sparsely interconnected neural networks for associative memories were presented in [8], where a sparse synthesis technique was applied to the design of a class of CNNs. A design algorithm for CNNs with space-invariant cloning templates, with applications to associative memories, was presented in [9]. A synthesis procedure for associative memories using discrete-time CNNs (DTCNNs) with learning and forgetting capabilities was presented in [10]. A synthesis procedure of CNNs for associative memories was introduced in [11], where the method assured the global asymptotic stability of the equilibrium point. DTCNNs with a globally asymptotically stable equilibrium point were designed to behave as associative memories in [12]. In the last 10 years, associative memories have been achieved through the local stability of equilibrium points of CNNs. In [13–15], the number of memory patterns of CNNs that are locally exponentially stable was obtained, and associative memories based on CNNs were designed. A design method for synthesizing associative memories based on discrete-time recurrent neural networks was presented in [16]. In [17], a new design procedure for synthesizing associative memories based on CNNs with time delays, characterized by input and output matrices, was introduced. In addition, in [18, 19], associative memories based on neural networks are presented.
Q. Han (✉) · X. Liao · C. Li
State Key Laboratory of Power Transmission Equipment and System Security, College of Computer Science, Chongqing University, Chongqing 400030, China
e-mail: yiding1981@yahoo.com.cn
Neural Comput & Applic (2013) 23:237–244
DOI 10.1007/s00521-012-0826-4
From the above introduction to associative memories, it is easy to see that the stability of CNNs plays an important role in associative memories. In this paper, the outputs of CNNs are regarded as the memory patterns of associative memories. Only when a CNN is stable are its outputs fixed; if a CNN is not stable, its outputs are not stable either, and memory patterns cannot be obtained from the CNN. Therefore, we should obtain conditions for the stability of CNNs in order to achieve associative memories. There has been abundant research on the stability of CNNs [20]. Some sufficient conditions for CNNs to be stable were obtained by constructing Lyapunov functionals [21–28], and these conditions generally make the equilibrium point globally asymptotically stable. However, some authors presented conditions that make equilibrium points locally stable, in which case there are generally multiple equilibrium points [13–15, 29–31]. In addition, during hardware implementation, time delays occur due to the finite switching speed of the amplifiers and the communication time. Time delay may lead to oscillation and, furthermore, to instability of the networks. Therefore, the study of the stability of CNNs with time delay is practically required, and many papers on CNNs with time delay exist [13–15, 17, 22–32].
In previous papers, the research on associative memories based on CNNs is not very comprehensive. For example, the bias vectors were computed from only one of the memory patterns in [7–9], and the relations between the cloning templates were strong and close in [13–17]. Thus, we give a new method to achieve associative memories, in which the relations between the cloning templates are weakened by resetting the initial states of all cells of a CNN to zero. Our methods can be used for auto-associative memories and hetero-associative memories. The design procedures of associative memories based on CNNs are given on the basis of some new results.
The remaining parts of this paper are organized as follows. In Sect. 2, a class of CNNs with time delay is given. In Sect. 3, the main results are shown: first, a theorem and its corollary are obtained; then, some methods of designing associative memories are given. In Sect. 4, a design procedure for associative memories is given. In Sect. 5, some examples are given to verify the theoretical results and design procedures. Some conclusions are finally drawn in Sect. 6.
2 Preliminaries

Consider CNNs whose cells are arranged on a rectangular array composed of N rows and M columns, where the CNNs are defined by the following delay differential equations:
$$
\left\{
\begin{aligned}
\dot{y}_{ij}(t) &= -\bar{c}_{ij} y_{ij}(t) + \sum_{k=k_1(i,r)}^{k_2(i,r)} \sum_{l=l_1(j,r)}^{l_2(j,r)} \bar{a}_{kl}\, g_{i+k,j+l}(y(t)) \\
&\quad + \sum_{k=k_1(i,r)}^{k_2(i,r)} \sum_{l=l_1(j,r)}^{l_2(j,r)} \bar{b}_{kl}\, g_{i+k,j+l}(y(t-\tau)) + g_{ij}, \qquad t \ge 0, \\
g_{ij} &= \sum_{k=k_1(i,r)}^{k_2(i,r)} \sum_{l=l_1(j,r)}^{l_2(j,r)} \bar{d}_{kl}\, \bar{u}_{kl} + \bar{v}_{ij}, \\
y(t) &= \varphi(t), \qquad -\tau \le t < 0,
\end{aligned}
\right. \tag{1}
$$
where $y_{ij}(t) \in \mathbb{R}$ denotes the state, $\bar{c}_{ij}$ is a positive parameter, $r$ is a positive integer denoting the neighborhood radius, $\bar{A} = (\bar{a}_{kl})_{(2r+1)\times(2r+1)} \ne 0$ is the feedback cloning template, $\bar{B} = (\bar{b}_{kl})_{(2r+1)\times(2r+1)}$ is the delay feedback cloning template, $\tau < \infty$ is the time delay, $\bar{D} = (\bar{d}_{kl})_{(2r+1)\times(2r+1)}$ is the input cloning template, $\bar{u}_{kl}$ is the input, $\bar{v}_{ij}$ is the bias, $k_1(i,r) = \max\{1-i, -r\}$, $k_2(i,r) = \min\{N-i, r\}$, $l_1(j,r) = \max\{1-j, -r\}$, $l_2(j,r) = \min\{M-j, r\}$, and $g(\cdot)$ is the activation function defined by
$$g(y) = (|y+1| - |y-1|)/2.$$
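This saturating activation can be written directly in code; the following NumPy one-liner is an illustrative sketch (the function name `g` mirrors the text):

```python
import numpy as np

def g(y):
    """Piecewise-linear CNN activation: g(y) = (|y + 1| - |y - 1|) / 2.
    Identity on [-1, 1], saturating at -1 and +1 outside that interval."""
    return (np.abs(y + 1.0) - np.abs(y - 1.0)) / 2.0
```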
Let $r = 1$; then the templates $\bar{A}$ and $\bar{B}$ take the form
$$
\bar{A} = \begin{bmatrix} \bar{a}_{-1,-1} & \bar{a}_{-1,0} & \bar{a}_{-1,1} \\ \bar{a}_{0,-1} & \bar{a}_{0,0} & \bar{a}_{0,1} \\ \bar{a}_{1,-1} & \bar{a}_{1,0} & \bar{a}_{1,1} \end{bmatrix}
\quad\text{and}\quad
\bar{B} = \begin{bmatrix} \bar{b}_{-1,-1} & \bar{b}_{-1,0} & \bar{b}_{-1,1} \\ \bar{b}_{0,-1} & \bar{b}_{0,0} & \bar{b}_{0,1} \\ \bar{b}_{1,-1} & \bar{b}_{1,0} & \bar{b}_{1,1} \end{bmatrix}.
$$
Choose $n = N \times M$; then system (1) can be put in the vector form
$$\dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t-\tau)) + DU + V, \tag{2}$$
where $x = (x_1, x_2, \ldots, x_n)^T = (y_{11}, y_{12}, \ldots, y_{1M}, \ldots, y_{NM})^T$, the coefficient matrices $A$, $B$, and $D$ are obtained from the templates $\bar{A}$, $\bar{B}$, and $\bar{D}$, $C = \mathrm{diag}(c_1, \ldots, c_n)$, the input vector $U = (u_1, \ldots, u_n)^T$, the bias vector $V = (v_1, \ldots, v_n)^T$, and the activation function $f(x) = (g(y_1), \ldots, g(y_n))^T$. The $k$th cell in (2) is denoted by $O_k$ ($k = iN + j$, where $1 \le i \le N$, $1 \le j \le M$, $i$ denotes the $i$th row and $j$ the $j$th column of the CNN). The matrix $A = (a_{ij})_{n \times n}$ defined by (2) and composed of the template has the form
$$
A = \begin{bmatrix}
A_1 & A_2 & 0 & 0 & \cdots & 0 & 0 \\
A_3 & A_1 & A_2 & 0 & \cdots & 0 & 0 \\
0 & A_3 & A_1 & A_2 & \cdots & 0 & 0 \\
0 & 0 & A_3 & A_1 & \cdots & 0 & 0 \\
\vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & 0 & 0 & \cdots & A_1 & A_2 \\
0 & 0 & 0 & 0 & \cdots & A_3 & A_1
\end{bmatrix}_{n \times n},
$$
where
$$
A_1 = \begin{bmatrix}
\bar{a}_{00} & \bar{a}_{01} & 0 & \cdots & 0 & 0 \\
\bar{a}_{0,-1} & \bar{a}_{00} & \bar{a}_{01} & \cdots & 0 & 0 \\
0 & \bar{a}_{0,-1} & \bar{a}_{00} & \cdots & 0 & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & 0 & \cdots & \bar{a}_{00} & \bar{a}_{01} \\
0 & 0 & 0 & \cdots & \bar{a}_{0,-1} & \bar{a}_{00}
\end{bmatrix}_{M \times M},
\qquad
A_2 = \begin{bmatrix}
\bar{a}_{10} & \bar{a}_{11} & 0 & \cdots & 0 & 0 \\
\bar{a}_{1,-1} & \bar{a}_{10} & \bar{a}_{11} & \cdots & 0 & 0 \\
0 & \bar{a}_{1,-1} & \bar{a}_{10} & \cdots & 0 & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & 0 & \cdots & \bar{a}_{10} & \bar{a}_{11} \\
0 & 0 & 0 & \cdots & \bar{a}_{1,-1} & \bar{a}_{10}
\end{bmatrix}_{M \times M}
$$
and
$$
A_3 = \begin{bmatrix}
\bar{a}_{-1,0} & \bar{a}_{-1,1} & 0 & \cdots & 0 & 0 \\
\bar{a}_{-1,-1} & \bar{a}_{-1,0} & \bar{a}_{-1,1} & \cdots & 0 & 0 \\
0 & \bar{a}_{-1,-1} & \bar{a}_{-1,0} & \cdots & 0 & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & 0 & \cdots & \bar{a}_{-1,0} & \bar{a}_{-1,1} \\
0 & 0 & 0 & \cdots & \bar{a}_{-1,-1} & \bar{a}_{-1,0}
\end{bmatrix}_{M \times M}.
$$
The definitions of the matrices $B = (b_{ij})_{n \times n}$ and $D = (d_{ij})_{n \times n}$ are similar to that of $A$.
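The expansion of a $3 \times 3$ cloning template into the full $n \times n$ coefficient matrix above can be sketched as follows (a hypothetical helper, not the authors' code; cells are ordered row by row as in the vector form (2)):

```python
import numpy as np

def template_to_matrix(T, N, M):
    """Expand a 3x3 cloning template T (the r = 1 case) into the n x n
    coefficient matrix, n = N*M. Cell (i, j) is coupled to cell (i+k, j+l)
    with weight T[k+1, l+1]; couplings across the array boundary vanish."""
    n = N * M
    A = np.zeros((n, n))
    for i in range(N):
        for j in range(M):
            p = i * M + j                      # linear index of cell (i, j)
            for k in (-1, 0, 1):
                for l in (-1, 0, 1):
                    ii, jj = i + k, j + l
                    if 0 <= ii < N and 0 <= jj < M:
                        A[p, ii * M + jj] = T[k + 1, l + 1]
    return A
```

With an identity-like template (center entry only), the result is a diagonal matrix, matching the block-Toeplitz structure shown above.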
Let $\alpha = (\alpha_1, \alpha_2, \ldots, \alpha_n)^T \in \Omega^n = \{x \in \mathbb{R}^n \mid x_i = 1 \text{ or } x_i = -1,\ i = 1, 2, \ldots, n\}$ and $C(\alpha) = \{x \in \mathbb{R}^n \mid x_i \alpha_i > 1,\ i = 1, 2, \ldots, n\}$. Then, for $x(t) \in C(\alpha')$ and $x(t-\tau) \in C(\alpha'')$, (2) can be rewritten as
$$\dot{x}(t) = -Cx(t) + A\alpha' + B\alpha'' + DU + V. \tag{3}$$
If $\beta$ is an equilibrium point of (3), then we have
$$\beta = C^{-1}((A+B)\alpha + DU + V) \in C(\alpha), \tag{4}$$
where $\alpha \in \Omega^n$.
Lemma 1 [7] Suppose $\alpha = (\alpha_1, \alpha_2, \ldots, \alpha_n)^T \in \Omega^n$. If $\beta = (\beta_1, \beta_2, \ldots, \beta_n)^T = C^{-1}((A+B)\alpha + DU + V) \in C(\alpha)$, then $\beta$ is an asymptotically stable equilibrium point of (2).

Proof Equation (3) has a unique equilibrium point at $x_e = C^{-1}((A+B)\alpha + DU + V)$, and $x_e = \beta \in C(\alpha)$ by assumption. This equilibrium is also asymptotically stable, since (3) has all its $n$ eigenvalues at $-c_i$, $i = 1, 2, \ldots, n$. □
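Lemma 1 suggests a direct numerical check; the sketch below (function and argument names are illustrative) computes $\beta$ and tests the membership $\beta \in C(\alpha)$:

```python
import numpy as np

def is_stable_memory(alpha, A, B, C, D, U, V):
    """Check the condition of Lemma 1: beta = C^{-1}((A + B) alpha + D U + V)
    lies in C(alpha), i.e. beta_i * alpha_i > 1 for every i. If so, beta is an
    asymptotically stable equilibrium point of the delayed network (2)."""
    beta = np.linalg.solve(C, (A + B) @ alpha + D @ U + V)
    return bool(np.all(beta * alpha > 1.0)), beta
```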
3 Main result
In this section, we first give some results on the stability of CNNs with time delay. Then, on the basis of these results, some methods for associative memories based on CNNs are obtained.
3.1 Stability of CNNs with time delay
Equation (3) can be rewritten componentwise as
$$\dot{x}_i = -c_i x_i + \sum_{j=1}^{n} a_{ij} \alpha'_j + \sum_{j=1}^{n} b_{ij} \alpha''_j + \sum_{j=1}^{n} d_{ij} u_j + v_i. \tag{5}$$
Theorem 1 In (5), let $x_i(0) = 0$.

(i) If $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j + v_i > c_i$ for $\alpha \in \Omega^n$, then (5) converges to a positive stable equilibrium point, and this equilibrium point is greater than 1.
(ii) If $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j + v_i < -c_i$ for $\alpha \in \Omega^n$, then (5) converges to a negative stable equilibrium point, and this equilibrium point is less than $-1$.

Proof Equation (5) has a unique equilibrium point
$$\beta_i = \left( \sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j + v_i \right) \Big/ c_i. \tag{6}$$

(i) If $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j + v_i > c_i$ and $x_i(0) = 0$, we have $\beta_i > 1$ in (6). Therefore, when $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j + v_i > c_i$, (5) converges to a positive stable equilibrium point by Lemma 1, and this equilibrium point is greater than 1.
(ii) If $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j + v_i < -c_i$ and $x_i(0) = 0$, we have $\beta_i < -1$ in (6). Therefore, when $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j + v_i < -c_i$, (5) converges to a negative stable equilibrium point by Lemma 1, and this equilibrium point is less than $-1$. □
If we choose vi = 0 in Theorem 1, then we can get the
following corollary.
Corollary 1 In (5), let $x_i(0) = 0$ and $v_i = 0$.

(i) If $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j > c_i$, then (5) converges to a positive equilibrium point, and this equilibrium point is greater than 1.
(ii) If $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j < -c_i$, then (5) converges to a negative equilibrium point, and this equilibrium point is less than $-1$. □
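The sign conditions of Corollary 1 can be checked numerically; the sketch below (names illustrative) assumes the reading that the condition must hold for every pattern $\alpha$ in the given set $\Omega$:

```python
import itertools
import numpy as np

def corollary1_sign(i, A, B, D, U, c, Omega):
    """Evaluate the Corollary 1 conditions for cell i (v_i = 0, x_i(0) = 0):
    returns +1 if sum_j (a_ij + b_ij) alpha_j + sum_j d_ij u_j > c_i for every
    alpha in Omega, -1 if it is always < -c_i, and 0 otherwise."""
    w = A[i] + B[i]
    vals = [float(w @ alpha + D[i] @ U) for alpha in Omega]
    if all(v > c[i] for v in vals):
        return 1
    if all(v < -c[i] for v in vals):
        return -1
    return 0

# All +/-1 patterns of length 2, i.e. the set Omega^n for n = 2
Omega = [np.array(p, dtype=float) for p in itertools.product([1, -1], repeat=2)]
```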
Remark 1 In Theorem 1 and its corollary, although the initial states of a CNN have to be reset to zero, our methods do not strictly limit the relations between the cloning templates of the CNN. For example, $\sum_{j=1}^{n} |a_{ij} + b_{ij}| < c_i < 1$ and $\sum_{i=-1}^{1} \sum_{j=-1}^{1} |\bar{a}_{ij} + \bar{b}_{ij}| < 1$ are required in [15, 16]; our results do not have these limitations.
3.2 Notations
Suppose that there exists a set of memory patterns, written as a matrix $\Gamma = (\alpha^1, \alpha^2, \ldots, \alpha^m)$, where $\alpha^i = (\alpha^i_1, \alpha^i_2, \ldots, \alpha^i_n)^T \in \Omega^n$ and $\alpha^i$ is a set of outputs (a memory pattern) of all cells of a CNN. Therefore, $\Gamma$ has $n$ rows and $m$ columns. Choose a set of input patterns, written as a matrix $U = (U^1, U^2, \ldots, U^m)$ corresponding to the set of memory patterns, where $U^i = (u^i_1, u^i_2, \ldots, u^i_n)^T$ and $u^i_j$ denotes the input of the $j$th cell of the CNN in the $i$th memory pattern. Note that when the inputs of the CNN are $U^i$, the memory pattern is $\alpha^i$; namely, if $U^i$ is the set of inputs of the CNN, then $\alpha^i$ is the corresponding set of outputs, and we use the pair $(u^i_j, \alpha^i_j)$ to describe this relationship.

Then, we divide all cells of the CNN into three sets on the basis of the memory patterns $\Gamma$. When all outputs of a cell in the different memory patterns $\alpha^i$ are 1, we classify the cell into the set $P = \{O_a, O_b, \ldots\}$, where $O_a$ denotes the $a$th cell in (2). When all outputs of a cell in the different memory patterns $\alpha^i$ are $-1$, we classify the cell into the set $Q = \{O_c, O_d, \ldots\}$. When the outputs of a cell in the different $\alpha^i$ are not all the same, namely, the outputs can be both 1 and $-1$, we classify the cell into the set $R = \{O_e, O_f, \ldots\}$. Let $O_j = j$; $|P|$, $|Q|$, and $|R|$ denote the numbers of elements in the sets $P$, $Q$, and $R$, respectively. Let $P_1 = O_a, P_2 = O_b, \ldots, P_i = O_j, \ldots, P_{|P|} = O_c$, where $0 \le i \le |P|$, $1 \le j \le n$, $a < b < j < c$, and $O_j \in P$. The definitions of $Q_i$ and $R_i$ are similar to that of $P_i$, respectively.
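The classification into P, Q, and R can be sketched as follows, where `Gamma` stands for the n × m pattern matrix with entries ±1 (cells as rows, patterns as columns; names are illustrative):

```python
def classify_cells(Gamma):
    """Split cell indices into the sets P, Q, R from the memory-pattern
    matrix Gamma (n rows = cells, m columns = patterns, entries +1 / -1):
    P: output is +1 in every pattern; Q: -1 in every pattern; R: mixed."""
    P, Q, R = [], [], []
    for i, row in enumerate(Gamma):
        if all(a == 1 for a in row):
            P.append(i)
        elif all(a == -1 for a in row):
            Q.append(i)
        else:
            R.append(i)
    return P, Q, R
```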
Let
$$K = \begin{pmatrix} k_1 & 0 & \cdots & 0 \\ 0 & k_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & k_n \end{pmatrix}_{n \times n},$$
where $k_i > 0$. Let
$$\tilde{D} = (D_{R_1}, D_{R_2}, \ldots, D_{R_{|R|}})^T, \quad \tilde{A} = (A_{R_1}, A_{R_2}, \ldots, A_{R_{|R|}})^T,$$
$$\tilde{B} = (B_{R_1}, B_{R_2}, \ldots, B_{R_{|R|}})^T \quad\text{and}\quad \tilde{K} = (K_{R_1}, K_{R_2}, \ldots, K_{R_{|R|}})^T,$$
where $H_{R_i}$ denotes the $R_i$th row of a matrix $H$. Let
$$L_D = (d_{-1,-1}, d_{-1,0}, d_{-1,1}, d_{0,-1}, d_{0,0}, d_{0,1}, d_{1,-1}, d_{1,0}, d_{1,1})^T,$$
$$L_A = (a_{-1,-1}, a_{-1,0}, a_{-1,1}, a_{0,-1}, a_{0,0}, a_{0,1}, a_{1,-1}, a_{1,0}, a_{1,1})^T,$$
$$L_B = (b_{-1,-1}, b_{-1,0}, b_{-1,1}, b_{0,-1}, b_{0,0}, b_{0,1}, b_{1,-1}, b_{1,0}, b_{1,1})^T,$$
$$K' = \begin{pmatrix} K & 0 & \cdots & 0 \\ 0 & K & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & K \end{pmatrix}_{(NMm) \times (NMm)}, \qquad \Gamma' = ((\alpha^1)^T, (\alpha^2)^T, \ldots, (\alpha^m)^T)^T.$$
Choose $l \in \{1, 2, \ldots, m\}$, $q \in \{1, 2, \ldots, n\}$, and $k \in \{R_1, R_2, \ldots, R_{|R|}\}$. Next, we introduce the following symbols. Let
$$U' = \big(((A+B)\alpha^1)^T, ((A+B)\alpha^2)^T, \ldots, ((A+B)\alpha^m)^T\big)^T,$$
$$\tilde{U}' = \big(U'_{R_1}, U'_{R_2}, \ldots, U'_{R_{|R|}}, U'_{NM+R_1}, \ldots, U'_{(l-1)NM+R_k}, \ldots, U'_{(m-1)NM+R_{|R|}}\big)^T,$$
$$N^l_q = \begin{pmatrix}
0 & u^l_{(q-1)M+1} & u^l_{(q-1)M+2} \\
u^l_{(q-1)M+1} & u^l_{(q-1)M+2} & u^l_{(q-1)M+3} \\
u^l_{(q-1)M+2} & u^l_{(q-1)M+3} & u^l_{(q-1)M+4} \\
\vdots & \vdots & \vdots \\
u^l_{qM-2} & u^l_{qM-1} & u^l_{qM} \\
u^l_{qM-1} & u^l_{qM} & 0
\end{pmatrix}_{M \times 3},$$
$$N^l = \begin{pmatrix}
0 & N^l_1 & N^l_2 \\
N^l_1 & N^l_2 & N^l_3 \\
N^l_2 & N^l_3 & N^l_4 \\
\vdots & \vdots & \vdots \\
N^l_{N-2} & N^l_{N-1} & N^l_N \\
N^l_{N-1} & N^l_N & 0
\end{pmatrix}_{(NM) \times 9}, \qquad N = \begin{pmatrix} N^1 \\ N^2 \\ \vdots \\ N^m \end{pmatrix}_{(NMm) \times 9},$$
$$\tilde{N} = \big(N_{R_1}, N_{R_2}, \ldots, N_{R_{|R|}}, N_{NM+R_1}, \ldots, N_{(l-1)NM+R_k}, \ldots, N_{(m-1)NM+R_{|R|}}\big)^T,$$
where $N_p$ denotes the $p$th row of $N$. The matrix $X$ is defined in the same way as $N$, with the inputs $u^l_j$ replaced by the memory-pattern entries $\alpha^l_j$, and
$$\tilde{X} = \big(X_{R_1}, X_{R_2}, \ldots, X_{R_{|R|}}, X_{NM+R_1}, \ldots, X_{(l-1)NM+R_k}, \ldots, X_{(m-1)NM+R_{|R|}}\big)^T.$$
Let
$$\Delta = K'\Gamma', \qquad \tilde{\Delta} = \big(\Delta_{R_1}, \Delta_{R_2}, \ldots, \Delta_{R_{|R|}}, \Delta_{NM+R_1}, \ldots, \Delta_{(l-1)NM+R_k}, \ldots, \Delta_{(m-1)NM+R_{|R|}}\big)^T$$
and
$$\xi'_q(l) = c_q - \sum_{j=1}^{n} (a_{qj} + b_{qj})\alpha^l_j - \sum_{j=1}^{n} d_{qj} u^l_j.$$
3.3 Associative memories
In order to obtain the parameters $A$, $B$, $C$, $D$, and $V$ of a CNN with time delay, Corollary 1 is used to achieve associative memories.
Let $k_i > \max_{1 \le i \le n}\{c_i\}$. We choose
$$\tilde{U}' = 0.5\tilde{\Delta} \tag{7}$$
and
$$(\tilde{A} + \tilde{B})\Gamma + \tilde{D}U = \tilde{K}\Gamma. \tag{8}$$
Equation (7) ensures that the sign of $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j$ of a cell $O_i$ is equal to the sign of the output of the cell. Equation (8) ensures that the sign of $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j$ of a cell $O_i$ is equal to the sign of the output of the cell. By Corollary 1, the sign of $\sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha_j + \sum_{j=1}^{n} d_{ij} u_j$ of a cell of a CNN with time delay plays an important role in determining the sign of the output of the cell.

However, it is difficult to obtain the cloning templates directly from (7) and (8). Therefore, (7) and (8) need to be transformed. Equation (7) can be transformed into
$$\tilde{X}(L_A + L_B) = 0.5\tilde{\Delta}. \tag{9}$$
Therefore,
$$L_A + L_B = 0.5\,\mathrm{pinv}(\tilde{X})\tilde{\Delta}, \tag{10}$$
where $\mathrm{pinv}(\cdot)$ denotes the pseudo-inverse of a matrix. Equation (8) can be transformed into
$$\tilde{N} L_D = \tilde{\Delta} - \tilde{U}'. \tag{11}$$
Therefore,
$$L_D = \mathrm{pinv}(\tilde{N})\big(\tilde{\Delta} - \tilde{X}(L_A + L_B)\big). \tag{12}$$
Remark 2 When the matrices in (9) and (11) are not invertible, the values of $L_A + L_B$ and $L_D$ are approximate (least-squares) values.
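Equations (10) and (12) reduce to two pseudo-inverse solves; the following is a minimal NumPy sketch with hypothetical random stand-ins for the selected matrices (in practice $\tilde{X}$, $\tilde{N}$, and $\tilde{\Delta}$ come from the constructions of Sect. 3.2):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small instance: X built from the memory patterns and Nmat
# built from the inputs, each with 9 columns (one per template entry), and
# Delta the selected rows of K' Gamma'.
X = rng.standard_normal((12, 9))
Nmat = rng.standard_normal((12, 9))
Delta = rng.standard_normal(12)

LAB = 0.5 * np.linalg.pinv(X) @ Delta          # eq. (10): L_A + L_B
LD = np.linalg.pinv(Nmat) @ (Delta - X @ LAB)  # eq. (12): L_D
```

When the 12 × 9 systems are overdetermined, as here, `pinv` returns the least-squares solution, matching Remark 2.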
Next, we discuss how to choose the bias $v_i$ for the cells in the sets $P$ and $Q$. Let $v^+ \in \mathbb{R}$ be the bias of all cells in the set $P$, and $v^- \in \mathbb{R}$ the bias of all cells in the set $Q$. The ranges of $v^+$ and $v^-$ can be obtained from Theorem 1.

If $\alpha_i = 1$ in (5), we have $v_i(l) > c_i - \sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha^l_j - \sum_{j=1}^{n} d_{ij} u^l_j = \xi'_i(l)$. If $\alpha_i = -1$ in (5), we have $v_i(l) < c_i - \sum_{j=1}^{n} (a_{ij} + b_{ij})\alpha^l_j - \sum_{j=1}^{n} d_{ij} u^l_j = \xi'_i(l)$. Thus, we choose
$$v^+ \ge \max_{1 \le l \le m,\ i \in P} \big\{|\xi'_i(l)|\big\} \tag{13}$$
as the bias of all cells in the set $P$, and
$$v^- \le -\max_{1 \le l \le m,\ i \in Q} \big\{|\xi'_i(l)|\big\} \tag{14}$$
as the bias of all cells in the set $Q$.
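The bias choices (13) and (14) can be sketched as follows, where `xi[i][l]` stands for $\xi'_i(l)$ and the unit margin enforcing the strict inequalities is an arbitrary illustrative choice:

```python
def biases(xi, P, Q, margin=1.0):
    """Choose the common biases of (13) and (14): v+ at least the maximum of
    |xi'_i(l)| over cells i in P and all patterns l, and v- at most minus the
    analogous maximum over Q. `margin` gives the inequalities some slack."""
    m = len(xi[0])                    # number of memory patterns
    vp = max(abs(xi[i][l]) for i in P for l in range(m)) + margin
    vm = -(max(abs(xi[i][l]) for i in Q for l in range(m)) + margin)
    return vp, vm
```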
Remark 3 Note that the cloning templates $\bar{A}$, $\bar{B}$, and $\bar{D}$ are computed from the cells in the set $R$, which can reduce the effect of the cells in the sets $P$ and $Q$.
4 Design procedure of a CNN with time delay
In this section, we give a design procedure for the parameters of a CNN with time delay on the basis of the above results. Eleven steps are given as follows.

Step 1. Give a set of memory patterns $\alpha^1, \alpha^2, \ldots, \alpha^m$ for the CNN, where $\alpha^i$ is a set of outputs of all cells of the CNN and $m$ is the number of patterns. Let $\Gamma = (\alpha^1, \alpha^2, \ldots, \alpha^m)$. Denote the corresponding input matrix by $U = (U^1, U^2, \ldots, U^m)$.
Step 2. Determine the time delay $\tau$.
Step 3. Divide all cells of the CNN into the three sets $P$, $Q$, and $R$. If all outputs of a cell in all memory patterns are 1, the cell is classified into the set $P$. If all outputs of a cell in all memory patterns are $-1$, the cell is classified into the set $Q$. If the outputs of a cell across the memory patterns are both 1 and $-1$, the cell is classified into the set $R$.
Step 4. Let the biases $v_i$ ($i \in R$) of all cells in the set $R$ be zero.
Step 5. Determine all constants $\bar{c}_{ij}$ ($1 \le i \le N$, $1 \le j \le M$) and obtain the coefficient matrix $C$.
Step 6. Determine the matrix $K$ such that $k_i > \max_{1 \le i \le n}\{c_i\}$ and obtain the matrix $K'$.
Step 7. Using the cells in the set $R$, compute the cloning template $\bar{A} + \bar{B}$ from (10) and obtain the coefficient matrix $A + B$.
Step 8. Using the cells in the set $R$, compute the cloning template $\bar{D}$ from (12) and obtain the coefficient matrix $D$.
Step 9. Compute $\xi'_i(l)$ ($i \in P$, $1 \le l \le m$) for the set $P$. Choose $v^+ > \max_{1 \le l \le m,\ i \in P}\{|\xi'_i(l)|\}$ in terms of (13); the bias $v_i$ of every cell in the set $P$ equals $v^+$.
Step 10. Compute $\xi'_i(l)$ ($i \in Q$, $1 \le l \le m$) for the set $Q$. Choose $v^- < -\max_{1 \le l \le m,\ i \in Q}\{|\xi'_i(l)|\}$ in terms of (14); the bias $v_i$ of every cell in the set $Q$ equals $v^-$.
Step 11. Synthesize the CNN with the connection weight matrices $A$, $B$, $C$, $D$, the time delay $\tau$, and the bias vector $V$.
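As a rough numerical companion to Step 11 (a sketch, not the authors' implementation; all names are illustrative), the vector-form network (2) can be integrated by forward Euler with zero initial history, and the stored pattern read off from the saturated outputs:

```python
import numpy as np

def simulate(A, B, C, D, U, V, tau=1.0, dt=0.01, T=30.0):
    """Forward-Euler integration of the vector-form network (2) with a
    constant delay tau and zero initial history, as used in the examples.
    Returns the signs of the saturated outputs at the final time."""
    g = lambda y: (np.abs(y + 1.0) - np.abs(y - 1.0)) / 2.0
    n = A.shape[0]
    d = int(round(tau / dt))
    hist = np.zeros((d + 1, n))          # states on [t - tau, t], initially 0
    x = np.zeros(n)
    for _ in range(int(T / dt)):
        xd = hist[0]                     # delayed state x(t - tau)
        dx = -C @ x + A @ g(x) + B @ g(xd) + D @ U + V
        x = x + dt * dx
        hist = np.vstack([hist[1:], x])  # slide the delay buffer forward
    return np.sign(g(x))                 # recovered +/-1 memory pattern
```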
5 Numerical examples
In this section, we give some numerical simulations to verify the theoretical results of this paper.

Example 1 Consider the same example introduced in [16]. The inputs and the output patterns of a CNN are represented by two pairs of $(5 \times 5)$-pixel images shown in Fig. 1 (black pixel = 1, white pixel = $-1$), where the inputs of the CNN compose the word "MO" in Fig. 1a, and the patterns to be memorized constitute the word "LS" in Fig. 1b.

We design all parameters of a CNN with time delay for associative memories on the basis of the design procedure in Sect. 4.
Step 1. In terms of Fig. 1, we get the memory patterns $(-1, 1, -1, \ldots, -1)^T$, $(-1, 1, 1, \ldots, -1)^T$ and the input patterns $(1, -1, -1, \ldots, 1)^T$, $(-1, 1, 1, \ldots, 1)^T$. Let $\Gamma = ((-1, 1, -1, \ldots, -1)^T, (-1, 1, 1, \ldots, -1)^T)$ and $U = ((1, -1, -1, \ldots, 1)^T, (-1, 1, 1, \ldots, 1)^T)$.
Step 2. Choose $\tau = 1$.
Step 3. From the memory patterns, all cells of the CNN can be divided into the three sets $P = \{O_2, O_7, O_{12}, O_{22}, O_{23}, O_{24}\}$, $Q = \{O_1, O_5, O_6, O_8, O_9, O_{10}, O_{11}, O_{15}, O_{16}, O_{18}, O_{20}, O_{21}, O_{25}\}$, and $R = \{O_3, O_4, O_{13}, O_{14}, O_{17}, O_{19}\}$.
Step 4. Choose the bias of every cell in the set $R$ equal to zero, namely, $v_3 = v_4 = v_{13} = v_{14} = v_{17} = v_{19} = 0$.
Step 5. Let $\bar{c}_{ij} = 1$, $1 \le i \le N$, $1 \le j \le M$; then $C = \mathrm{diag}(1, 1, \ldots, 1)_{n \times n}$.
Step 6. Let $K = \mathrm{diag}(3, 3, \ldots, 3)_{n \times n}$; then $K' = \mathrm{diag}(3, 3, \ldots, 3)_{nm \times nm}$.
Step 7. From (10), we get $L_A + L_B = (0, 0, 0, 0, 1.5, 0, 0, 0, 0)^T$. Then, the matrix $A + B = \mathrm{diag}(1.5)_{n \times n}$.
Step 8. From (12), we get $L_D = (-1.1761, -2.2045, -0.4602, 1.2386, 0.4602, 0.0909, 0.2898, 0.2443, 0.1875)^T$. Then, we can obtain the matrix $D$.
Step 9. Let $v^+ = 8$ in the set $P$. Therefore, $v_2 = v_7 = v_{12} = v_{22} = v_{23} = v_{24} = 8$.
Step 10. Let $v^- = -8$ in the set $Q$. Therefore, $v_1 = v_5 = v_6 = v_8 = v_9 = v_{10} = v_{11} = v_{15} = v_{16} = v_{18} = v_{20} = v_{21} = v_{25} = -8$.
Step 11. Synthesize the CNN with $A$, $B$, $C$, $D$, and $V$.
Note that $\bar{a}_{ii} + \bar{b}_{ii} > \bar{c}_{ij}$ in this example; however, in the previous paper [15], $\sum_{j=1}^{n} |\bar{a}_{ij} + \bar{b}_{ij}| < \bar{c}_{ij}$ must be satisfied. Therefore, in this paper, although the initial states of a CNN have to be reset to zero, we remove many limitations on the relationships between the cloning templates.
Through the above eleven steps, we obtain a CNN that achieves associative memories from "MO" to "LS". When the inputs of the CNN are "M", the time response curves of the CNN are shown in Fig. 2, where we find that the states of all cells become stable after some time. When the states of all cells are stable, the equilibrium point is
$$x^* = (-8.6986, 10.5338, -3.0567, -2.9658, -9.7440, -10.6360, 12.8292, -5.0225, -7.0339, -8.8747, -11.7383, 7.3066, -2.5226, -2.4657, -13.7041, -10.8178, 2.6135, -12.5792, 2.8862, -11.3519, -10.8747, 11.6758, 11.5508, 10.8122, -11.3065)^T.$$
Therefore, all outputs of the CNN corresponding to the equilibrium point are
$$(-1, 1, -1, -1, -1, -1, 1, -1, -1, -1, -1, 1, -1, -1, -1, -1, 1, -1, -1, -1, -1, 1, 1, 1, -1)^T,$$
which are the same as "L".
When the inputs of the CNN are ‘‘O’’, we can get time
response curves of the CNN in Fig. 3. Then the value of
equilibrium point is
x� ¼ ð�9:9258; 8:5793; 3:1476; 2:9430;�7:8463;
� 8:1816; 6:9089;�12:7951;�15:2950;�13:7041;
� 28:1816; 7:8293; 3:5339; 2:4657;�11:3519;
� 8:1816;�2:7953;�6:6020; 3:5339;�10:7724;
� 8:1247; 8:2441; 12:7780; 14:2098;�8:8293ÞT:
Fig. 2 When the inputs of the CNN are ‘‘M’’, time response curves of
all cells of a CNN are shownFig. 1 a Inputs of a CNN, b outputs of the CNN or memory patterns
242 Neural Comput & Applic (2013) 23:237–244
123
Therefore, we know that all outputs of the CNN
corresponding to the equilibrium point are
� 1; 1; 1; 1;�1;�1; 1;�1;�1;�1;�1; 1; 1; 1;�1;�1;�1;
� 1; 1;�1;�1; 1; 1; 1;�1;
where the outputs of the CNN are same with ‘‘S’’.
Example 2 No matter what the inputs of a CNN are, if the set of outputs of the CNN is always "M", we can use only the bias of the CNN to achieve the associative memory. Choose
$$\bar{A} = \begin{bmatrix} -0.1 & -0.1 & -0.1 \\ 0.1 & 0.3 & 0.1 \\ -0.1 & -0.1 & -0.1 \end{bmatrix}, \quad \bar{B} = \begin{bmatrix} 0.1 & 0.1 & 0.1 \\ -0.1 & 0.2 & -0.1 \\ 0.1 & 0.1 & 0.1 \end{bmatrix},$$
$$\bar{D} = \begin{bmatrix} 0.1 & -0.15 & 0.1 \\ 0.1 & 0.1 & -0.1 \\ 0.1 & -0.15 & 0.1 \end{bmatrix}, \quad \tau = 1,$$
$$v_1 = v_5 = v_6 = v_7 = v_9 = v_{10} = v_{11} = v_{13} = v_{15} = v_{16} = v_{20} = v_{21} = v_{25} = 4 \quad\text{and}$$
$$v_2 = v_3 = v_4 = v_8 = v_{12} = v_{14} = v_{17} = v_{18} = v_{19} = v_{22} = v_{23} = v_{24} = -4.$$
Then, when the initial states of the CNN are 0, we obtain a CNN with the above parameters. No matter what the inputs of the CNN are, the set of outputs of the CNN is always "M".
6 Conclusions
In this paper, some new methods for associative memories based on CNNs with time delay are given. First, a theorem and a corollary are obtained, in which the initial states of the CNNs are zero; it is easy to reset a CNN to zero in applications. In addition, in order to achieve associative memories, we obtain some broad conditions for designing the cloning templates of CNNs. Then, some new methods for associative memories are given based on the theorem and corollary, and a design procedure is obtained. Finally, some examples are given to show that our methods are effective and useful.
Acknowledgments This work was supported in part by the Project of Graduate Innovation of Chongqing University under Grant 200909C1011, in part by the National Natural Science Foundation of China under Grants 60973114, 61170249, and 61003247, in part by the Natural Science Foundation Project of CQ CSTC under Grant 2009BA2024, in part by the State Key Laboratory of Power Transmission Equipment & System Security and New Technology, Chongqing University, under Grant 2007DA10512711206, and in part by the Teaching & Research Program of Chongqing Education Committee (KJ110401).
References
1. Chua LO, Yang L (1988) Cellular neural networks: theory. IEEE
Trans Circuits Syst 35:1257–1272
2. Chua LO, Yang L (1988) Cellular neural networks: applications.
IEEE Trans Circuits Syst 35:1273–1290
3. Crounse KR, Chua LO (1995) Methods for image processing and
pattern formation in cellular neural networks: a tutorial. IEEE
Trans Circuits Syst 42:583–601
4. Sziranyi T, Csicsvari J (1993) High-speed character recognition
using a dual cellular neural network architecture (CNND). IEEE
Trans Circuits Syst II 40:223–231
5. Tetzlaff R (2002) Cellular neural networks and their applications.
World Scientific, Singapore
6. Brucoli M, Carnimeo L, Grassi G (1995) Discrete-time cellular
neural networks for associative memories with learning and
forgetting capabilities. IEEE Trans Circuits Syst I 42:396–399
7. Liu DR, Michel AN (1993) Cellular neural networks for asso-
ciative memories. IEEE Trans Circuits Syst II 40:119–121
8. Liu DR, Michel AN (1994) Sparsely interconnected neural net-
works for associative memories with applications to cellular
neural networks. IEEE Trans Circuits Syst II 41:295–307
9. Liu DR (1997) Cloning template design of cellular neural net-
works for associative memories. IEEE Trans Circuits Syst I
Fundam Theory Appl 44:646–650
10. Brucoli M, Carnimeo L, Grassi G (1995) Discrete-time cellular
neural networks for associative memories with learning and
forgetting capabilities. IEEE Trans Circuits Syst I Fundam The-
ory Appl 42:396–399
11. Grassi G (1997) A new approach to design cellular neural net-
works for associative memories. IEEE Trans Circuits Syst I
Fundam Theory Appl 44:835–838
12. Grassi G (2001) On discrete-time cellular neural networks for
associative memories. IEEE Trans Circuits Syst I Fundam Theory
Appl 48:107–111
13. Zeng ZG, Huang DS, Wang ZF (2005) Memory pattern analysis
of cellular neural networks. Phys Lett A 342:114–128
14. Zeng ZG, Huang DS, Wang ZF (2008) Pattern memory analysis
based on stability theory of cellular neural networks. Appl Math
Model 32:112–121
15. Zeng ZG, Wang J (2007) Analysis and design of associative memories based on recurrent neural networks with linear saturation activation functions and time-varying delays. Neural Comput 19:2149–2182
16. Zeng ZG, Wang J (2008) Design and analysis of high-capacity
associative memories based on a class of discrete-time recurrent
neural networks. IEEE Trans Syst Man Cybern B Cybern
38:1525–1536
17. Zeng ZG, Wang J (2009) Associative memories based on con-
tinuous-time cellular neural networks designed using space-
invariant cloning templates. Neural Netw 22:651–657
18. Zheng PS, Tang WS, Zhang JX (2010) Efficient continuous-time
asymmetric Hopfield networks for memory retrieval. Neural
Comput 22:1597–1614
19. Zheng PS, Zhang JX, Tang WS (2011) Learning associative
memories by error backpropagation. IEEE Trans Neural Netw
22:347–355
20. Zeng ZG, Huang DS, Wang ZF (2005) Global stability of a
general class of discrete-time recurrent neural networks. Neural
Process Lett 22:33–47
21. Gilli M, Biey M, Checco P (2004) Equilibrium analysis of
cellular neural networks. IEEE Trans Circuits Syst I Regul Pap
51:903–912
22. Takahashi N (2000) A new sufficient condition for complete
stability of cellular neural networks with delay. IEEE Trans
Circuits Syst I Fundam Theory Appl 47:793–799
23. Li XM, Huang LH, Wu JH (2003) Further results on the stability
of delayed cellular neural networks. IEEE Trans Circuits Syst I
Fundam Theory Appl 50:1239–1242
24. Liao XX, Wang J, Zeng ZG (2005) Global asymptotic stability
and global exponential stability of delayed cellular neural net-
works. IEEE Trans Circuits Syst II Express Briefs 52:403–409
25. Liao XX, Luo Q, Zeng ZG, Guo YX (2008) Global exponential
stability in Lagrange sense for recurrent neural networks with time
delays. Nonlinear Anal Real World Appl 9:1535–1557
26. Zheng CD, Zhang HG, Wang ZS (2009) New delay-dependent
global exponential stability criterion for cellular-type neural
networks with time-varying delays. IEEE Trans Circuits Syst II
Express Briefs 56:250–254
27. Xiao SP, Zhang XM (2009) New globally asymptotic stability
criteria for delayed cellular neural networks. IEEE Trans Circuits
Syst II Express Briefs 56:659–663
28. Zheng CD, Zhang HG, Wang ZS (2010) Improved robust sta-
bility criteria for delayed cellular neural networks via the LMI
approach. IEEE Trans Circuits Syst II Express Briefs 57:41–45
29. Chen WH, Zheng WX (2010) A new method for complete sta-
bility analysis of cellular neural networks with time delay. IEEE
Trans Neural Netw 21:1126–1139
30. Zeng ZG, Wang J, Liao XX (2004) Stability analysis of delayed
cellular neural networks described using cloning templates. IEEE
Trans Circuits Syst I Regul Pap 51:2313–2324
31. Zeng ZG, Wang J (2006) Complete stability of cellular neural
networks with time-varying delays. IEEE Trans Circuits Syst I
Regul Pap 53:944–955
32. Jiang HJ, Teng ZD (2004) Global exponential stability of cellular
neural networks with time-varying coefficients and delays.
Neural Netw 17:1415–1425