Hopfield Network


Neural Networks and Fuzzy Systems

Hopfield Network
• A feedback neural network has feedback loops from
its outputs to its inputs. The presence of such loops
has a profound impact on the learning capability of
the network.
• After applying a new input, the network output is
calculated and fed back to adjust the input. This
process is repeated until the output becomes
constant.
• John Hopfield (1982)
– Associative Memory via artificial neural networks
– Optimisation

Hopfield Network
[Figure: a single-layer feedback network of n neurons. Each neuron i receives an external input x_i(0) and produces an output y_i; every output is fed back through weights w_ij to the inputs of the other neurons.]
 1 if x  0 It is a dynamic system:
y j (t )  sgn( x j (t )); sgn   x(0)→y(0) →x(1) →y(1)…. →y*
- 1 if x  0
n
x j (t )   wij yi (t  1)
i 1

Attractor
• If a state x(t) lies in a region S and x(t) → x* as t → ∞,
then S is the attractive region (basin of attraction) of x*.
– If x* = the desired state, x* is an attractor.
– If x* ≠ the desired state, x* is a spurious attractor.
[Figure: a trajectory x(t) inside the region S converging to x*.]

Associative memory
• Nature of associative memory
– part of the information is given
– the rest of the pattern is recalled
• Hopfield networks can be used as associative memory:
– design the weights W so that x* = the memorised pattern
– more than one pattern can be stored; the capacity grows
with the network size (see Storage Capacity later)

[Figure: two initial states x(0) converging to the stored pattern x*.]

Analogy with Optimisation


• The location of the bottom of the bowl (X0) represents the
stored pattern.
• The ball’s initial position represents the partial knowledge.
• On a corrugated surface, we can store {X1, X2, …, Xn} as
memories, and recall the one closest to the initial state.
• Hopfield networks can also be used for optimisation:
1) Define an energy E such that the attractors minimise E.
2) Difference from associative memory: we expect one (or few)
attractors, but with a large attractive region.
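As a hedged illustration of point 1 (this formula is the standard Hopfield energy, which the slide alludes to but does not state): for symmetric W with zero diagonal, asynchronous updates never increase E, so the network descends to a local minimum, i.e. an attractor.

```python
import numpy as np

# Standard Hopfield energy: E(y) = -1/2 * y^T W y.
# Asynchronous sign updates never increase it when W is symmetric, w_ii = 0.
def energy(W, y):
    return -0.5 * y @ W @ y
```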

Two Types of Associative Memory


• Autoassociative memory
Pattern Ii: Ii + Δ → Ii
• Heteroassociative memory
Pattern pairs Ii → yi: Ii + Δ → yi
– Hopfield networks are used as autoassociative memory



Hebbian Rule
• Original rule proposed by Hebb (1949) in
The Organization of Behavior:
“When an axon of cell A is near enough to excite a cell B
and repeatedly or persistently takes part in firing it,
some growth process or metabolic change takes place
in one or both cells such that A’s efficiency, as one of the
cells firing B, is increased.”
That is, correlated activity between two cells is
reinforced by increasing the synaptic strength between them.

Hebbian Rule
[Figure: two connected neurons, α and β.]

If a neuron α and a neuron β are “on” at the same time, their synaptic
connection is strengthened. The next time one of them is activated,
it will tend to activate the other.

Hebbian Rule
In other words:
1. If two neurons on either side of a synapse (connection) are activated
simultaneously (i.e. synchronously), then the strength of that synapse is
selectively increased. (Activation ↑)

This rule is often supplemented by:
2. If two neurons on either side of a synapse are activated
asynchronously, then that synapse is selectively weakened or
eliminated. (Activation ↓)

• $\Delta w_{ij} = y_i x_j$
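A one-line sketch of this update rule (the learning rate eta is an added assumption; the slide states the rule without one):

```python
import numpy as np

# Hebbian update: strengthen w_ij in proportion to co-activation y_i * x_j.
def hebbian_update(W, x, y, eta=1.0):
    return W + eta * np.outer(y, x)   # delta w_ij = eta * y_i * x_j
```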

Synaptic weights in Hopfield Networks


• m patterns:

$$V^1 = \begin{bmatrix} v_1^1 \\ \vdots \\ v_n^1 \end{bmatrix};\quad V^2 = \begin{bmatrix} v_1^2 \\ \vdots \\ v_n^2 \end{bmatrix};\quad \ldots;\quad V^m = \begin{bmatrix} v_1^m \\ \vdots \\ v_n^m \end{bmatrix}$$

• Add the individual weights together:

$$w_{ij} = \begin{cases} \sum_{k=1}^{m} v_i^k v_j^k & i \ne j \\ 0 & i = j \ \text{(to avoid self-feedback)} \end{cases}$$

• In matrix form, it is the outer product of the patterns:

$$W = \sum_{k=1}^{m} V^k V^{kT} - mI$$

where I is the n × n identity matrix and superscript T denotes the matrix transpose.

Associative Memory of Hopfield Networks
• If V^1, …, V^m are orthogonal, i.e. $V^{iT} V^j = 0$ for $i \ne j$
(and $V^{lT} V^l = n$ for bipolar patterns), then $V^l \to V^l$,
$l = 1, \ldots, m$, if $n > m$:

$$W V^l = \sum_{k=1}^{m} V^k V^{kT} V^l - m V^l = n V^l - m V^l = (n - m)\, V^l$$

$$\text{if } n > m, \quad V^l(t+1) = \mathrm{sgn}(W V^l) = V^l(t)$$

• If V^1, …, V^m are not orthogonal:

$$W V^l = \sum_{k=1}^{m} V^k V^{kT} V^l - m V^l = (n - m)\, V^l + \sum_{k \ne l} V^k V^{kT} V^l$$

The second term is interference from the other patterns;
it should be weaker than $(n - m)$.

Storage Capacity
• As the number of patterns (m) increases, the chance
of accurate storage must decrease.
• Hopfield’s empirical work in 1982:
– About half of the memories were stored accurately in a net
of N nodes if m = 0.15N.
• McEliece’s analysis in 1987:
– If we require almost all of the memories to be stored
accurately, then the maximum number of patterns is
m = N/(2 ln N).
– For N = 100, m ≈ 11.
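A quick check of that figure:

```python
import math

# m_max = N / (2 ln N), the capacity bound quoted above.
N = 100
print(N / (2 * math.log(N)))   # about 10.9, i.e. m = 11
```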

Limitations of Hopfield Net


• The number of patterns that can be stored and
accurately recalled is severely limited.
– If too many patterns are stored, the net may converge
to a novel, spurious pattern: an output matching none
of the stored memories.
• An exemplar pattern will be unstable if it shares
many bits in common with another exemplar pattern.

Example: Location Recall

[Figure: a robot in two locations.]

In front of a door: ultrasonic sensors read V1 = [1, 1, 1]
Inside a doorway: ultrasonic sensors read V2 = [−1, −1, −1]

An example of memorization
• Memorize the two states, (1, 1, 1) and (−1, −1, −1):

$$Y_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \qquad Y_2 = \begin{bmatrix} -1 \\ -1 \\ -1 \end{bmatrix}$$

• Transposed form of these vectors:

$$Y_1^T = \begin{bmatrix} 1 & 1 & 1 \end{bmatrix}, \qquad Y_2^T = \begin{bmatrix} -1 & -1 & -1 \end{bmatrix}$$

• The 3 × 3 identity matrix is:

$$I = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

Example Cont’d…

• The weight matrix is determined as follows:

$$W = Y_1 Y_1^T + Y_2 Y_2^T - 2I$$

• So,

$$W = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \begin{bmatrix} 1 & 1 & 1 \end{bmatrix} + \begin{bmatrix} -1 \\ -1 \\ -1 \end{bmatrix} \begin{bmatrix} -1 & -1 & -1 \end{bmatrix} - 2 \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 0 & 2 & 2 \\ 2 & 0 & 2 \\ 2 & 2 & 0 \end{bmatrix}$$

Next, the network is tested by the sequence of input vectors X1 and X2,
which are equal to the output (or target) vectors Y1 and Y2, respectively.
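The same arithmetic in a few lines (illustration only):

```python
import numpy as np

# W = Y1 Y1^T + Y2 Y2^T - 2I for the two stored states.
Y1 = np.array([1, 1, 1])
Y2 = np.array([-1, -1, -1])
W = np.outer(Y1, Y1) + np.outer(Y2, Y2) - 2 * np.eye(3, dtype=int)
print(W)   # [[0 2 2] [2 0 2] [2 2 0]]
```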

Example Cont’d…Network is tested.

• First, activate the network by applying the input vector X. Then calculate the actual
output vector Y, and finally, compare the result with the initial input vector X:

$$Y_m = \mathrm{sign}(W X_m), \quad m = 1, 2, \ldots, M$$

Assume all thresholds to be zero for this example. Thus,

$$Y_1 = \mathrm{sign}\left( \begin{bmatrix} 0 & 2 & 2 \\ 2 & 0 & 2 \\ 2 & 2 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \right) = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \quad \text{and} \quad Y_2 = \mathrm{sign}\left( \begin{bmatrix} 0 & 2 & 2 \\ 2 & 0 & 2 \\ 2 & 2 & 0 \end{bmatrix} \begin{bmatrix} -1 \\ -1 \\ -1 \end{bmatrix} \right) = \begin{bmatrix} -1 \\ -1 \\ -1 \end{bmatrix}$$

Y1 = X1 and Y2 = X2, so both states, (1, 1, 1) and (−1, −1, −1), are said to be stable.

Example Cont’d…Other Possible States


Possible state   Iteration   Inputs (x1 x2 x3)   Outputs (y1 y2 y3)   Fundamental memory
 1  1  1         0            1   1   1           1   1   1           (1, 1, 1)
-1  1  1         0           -1   1   1           1   1   1
                 1            1   1   1           1   1   1           (1, 1, 1)
 1 -1  1         0            1  -1   1           1   1   1
                 1            1   1   1           1   1   1           (1, 1, 1)
 1  1 -1         0            1   1  -1           1   1   1
                 1            1   1   1           1   1   1           (1, 1, 1)
-1 -1 -1         0           -1  -1  -1          -1  -1  -1           (-1, -1, -1)
-1 -1  1         0           -1  -1   1          -1  -1  -1
                 1           -1  -1  -1          -1  -1  -1           (-1, -1, -1)
-1  1 -1         0           -1   1  -1          -1  -1  -1
                 1           -1  -1  -1          -1  -1  -1           (-1, -1, -1)
 1 -1 -1         0            1  -1  -1          -1  -1  -1
                 1           -1  -1  -1          -1  -1  -1           (-1, -1, -1)

Example Cont’d…Errors compared to fundamental memories

• The fundamental memory (1, 1, 1) attracts the unstable states (−1, 1, 1), (1, −1, 1)
and (1, 1, −1).

• The fundamental memory (−1, −1, −1) attracts the unstable states (−1, −1, 1), (−1, 1, −1)
and (1, −1, −1).

Hopfield Network Training Algorithm

Step 1: Storage
The n-neuron Hopfield network is required to store a set of M
fundamental memories, Y1, Y2, …, YM. The synaptic weight from
neuron i to neuron j is calculated as

$$w_{ij} = \begin{cases} \sum_{m=1}^{M} y_{m,i}\, y_{m,j} & i \ne j \\ 0 & i = j \end{cases}$$

where $y_{m,i}$ and $y_{m,j}$ are the ith and jth elements of the fundamental
memory $Y_m$, respectively.

Hopfield Network Training Algorithm

In matrix form, the synaptic weights between neurons are represented as

$$W = \sum_{m=1}^{M} Y_m Y_m^T - M I$$

The Hopfield network can store a set of fundamental memories if the weight
matrix is symmetrical, with zeros on its main diagonal:

$$W = \begin{bmatrix} 0 & w_{12} & \cdots & w_{1i} & \cdots & w_{1n} \\ w_{21} & 0 & \cdots & w_{2i} & \cdots & w_{2n} \\ \vdots & \vdots & & \vdots & & \vdots \\ w_{i1} & w_{i2} & \cdots & 0 & \cdots & w_{in} \\ \vdots & \vdots & & \vdots & & \vdots \\ w_{n1} & w_{n2} & \cdots & w_{ni} & \cdots & 0 \end{bmatrix}$$

where $w_{ij} = w_{ji}$. Once the weights are calculated, they remain fixed.

Hopfield Network Training Algorithm

Step 2: Testing
The network must recall any fundamental memory Ym when presented with it
as an input:

$$x_{m,i} = y_{m,i}, \quad i = 1, 2, \ldots, n; \; m = 1, 2, \ldots, M$$

$$y_{m,i} = \mathrm{sign}\left( \sum_{j=1}^{n} w_{ij}\, x_{m,j} \right)$$

where $y_{m,i}$ is the ith element of the actual output vector $Y_m$, and $x_{m,j}$ is the jth element of
the input vector $X_m$. In matrix form,

$$X_m = Y_m, \quad m = 1, 2, \ldots, M; \qquad Y_m = \mathrm{sign}(W X_m)$$

Hopfield Network Training Algorithm

Step 3: Retrieval (if all fundamental memories are recalled perfectly, proceed to
this step.)
Present an unknown n-dimensional vector (a probe), X, to the network and
retrieve a stable state. That is,

$$X \ne Y_m, \quad m = 1, 2, \ldots, M$$

a) Initialise the retrieval algorithm of the Hopfield network by setting

$$x_j(0) = x_j, \quad j = 1, 2, \ldots, n$$

and calculate the initial state for each neuron:

$$y_i(0) = \mathrm{sign}\left( \sum_{j=1}^{n} w_{ij}\, x_j(0) \right), \quad i = 1, 2, \ldots, n$$

Hopfield Network Training Algorithm

Step 3: Retrieval (continued)

where $x_j(0)$ is the jth element of the probe vector X at iteration p = 0, and
$y_i(0)$ is the state of neuron i at iteration p = 0.
In matrix form, the state vector at iteration p = 0 is presented as

$$Y(0) = \mathrm{sign}(W X(0))$$

b) Update the elements of the state vector, Y(p), according to the
following rule:

$$x_i(p+1) = \sum_{j=1}^{n} w_{ij}\, y_j(p), \qquad y_i(p+1) = \mathrm{sign}(x_i(p+1))$$

Hopfield Network Training Algorithm

Step 3: Retrieval (continued)
Neurons for updating are selected asynchronously, that is, randomly and one
at a time.
Repeat the iteration until the state vector becomes unchanged; in other
words, until a stable state is reached.

It can be proved that the Hopfield network will always converge to a stable
state when the retrieval operation is performed asynchronously, provided
$w_{ij} = w_{ji}$ and $w_{ii} = 0$.

A stable state (fixed point) satisfies

$$y_i(p+1) = \mathrm{sign}\left( \sum_{j=1}^{n} w_{ij}\, y_j(p) \right) = y_i(p)$$

Little model of Hopfield Network

The Little model uses synchronous dynamics for retrieval (Little and Shaw, 1975):

$$Y(p+1) = \mathrm{sign}(W\, Y(p))$$

It can be proved that the Little model will always converge to a stable state
or to a limit cycle of length at most 2, provided $w_{ij} = w_{ji}$.

It is very easy to implement using matrix manipulation, for example in Matlab.
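A minimal sketch of the synchronous update (in Python rather than Matlab, to match the earlier sketches):

```python
import numpy as np

# Little model: update all neurons at once; stop at a fixed point or when
# a length-2 limit cycle (y(p+1) == y(p-1)) is detected.
def little_retrieve(W, y, max_iter=100):
    prev = None
    for _ in range(max_iter):
        y_new = np.where(W @ y >= 0, 1, -1)
        if np.array_equal(y_new, y):
            return y_new, "stable state"
        if prev is not None and np.array_equal(y_new, prev):
            return y_new, "limit cycle of length 2"
        prev, y = y, y_new
    return y, "not converged"
```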

Little model of Hopfield Network


0 1 1  1
1 0  1 1 
Example:
W  
 1  1 0  3
  1 1
  1  1  3 0   1  1
A stable state: Y (t )     Y (t  1)   
1 1
   

 1  1
1 1
1   1
Converge to the stable state: Y (t )     Y (t  1)   
1 1
   
 1   1

1 1 1


1 1 1
Limit Cycle 2: Y (t ) 
   Y (t  1)     Y (t  2)   
 1 1   1
    

 1 1
   1

Hopfield network as a model for associative memory
• Associative memory
– Associates different features with each other
• Karen → green
• George → red
• Paul → blue
– Recall with partial cues

Neural Network Model of associative memory
• Neurons are arranged like a grid:
[Figure: a grid of neurons.]

Setting the weights


• Each pattern can be denoted by a vector of −1s and
1s:

$$S^p = (s_1^p, s_2^p, s_3^p, \ldots, s_N^p), \quad s_i^p \in \{-1, +1\}$$

• If the number of patterns is m, then:

$$w_{ij} = \sum_{p=1}^{m} s_i^p s_j^p$$

• Hebbian learning:
– The neurons that fire together, wire together.

Learning in Hopfield net



Summary

• Associative memory
• Discrete Hopfield Neural Networks
• Hebbian Learning Rule
Readings
Picton’s book:
Haykin’s book: pp. 289–308
Blackboard readings
