
Hopfield Network

Notes
by
Dr. B. Anuradha

12/08/21 1
• Neural networks were designed in analogy with the
brain. The brain’s memory, however, works by
association. For example, we can recognise a familiar
face even in an unfamiliar environment within
100–200 ms. The brain routinely associates one thing
with another.
• To emulate the human memory’s associative
characteristics, we need a different type of network: a
recurrent neural network.

• A recurrent neural network has feedback loops from its
outputs to its inputs.
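The feedback loop can be sketched in a few lines of Python. This is a minimal illustration, not the deck's own example: the 3-unit network and its symmetric weight matrix `W` are hypothetical, chosen only to show outputs being fed back as the next step's inputs.

```python
# Hypothetical 3-unit recurrent net with symmetric weights.
W = [[0, 1, -1],
     [1, 0, 1],
     [-1, 1, 0]]

def sign(x):
    return 1 if x >= 0 else -1

def feedback_step(state):
    # New state computed from the previous outputs:
    # this is the feedback loop from outputs to inputs.
    return [sign(sum(W[i][j] * state[j] for j in range(3)))
            for i in range(3)]

state = [1, -1, 1]
for t in range(3):
    state = feedback_step(state)
    print(t, state)
```

Note that with this synchronous update every unit changes at once; Hopfield's stability result, discussed next, relies on updating units one at a time.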
12/08/21 4
• The stability of recurrent networks intrigued
several researchers in the 1960s and 1970s.
However, none was able to predict which
network would be stable, and some researchers
were pessimistic about finding a solution at all.

• The problem was solved only in 1982, when
John Hopfield formulated the principle of
storing information in a dynamically stable
network.

o Imagine a ball rolling down a valley
o The bottom of the valley represents a pattern stored in the Hopfield net
o Wherever the ball is initially placed, it will roll towards the
nearest local minimum – this represents the Hopfield net iteratively
computing its next state
o The ball eventually stops rolling at the bottom of the valley –
this represents a stable state of the Hopfield network
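The valley picture corresponds to an energy function E = −½ Σᵢⱼ wᵢⱼ sᵢ sⱼ that never increases when units are updated one at a time, so the state rolls "downhill" into a local minimum. A minimal sketch, assuming a hypothetical 4-unit net with symmetric weights and zero thresholds:

```python
# Hypothetical symmetric weight matrix with zero diagonal.
W = [[0, 1, -1, 1],
     [1, 0, 1, -1],
     [-1, 1, 0, 1],
     [1, -1, 1, 0]]

def energy(s):
    # E = -1/2 * sum_ij w_ij * s_i * s_j
    n = len(s)
    return -0.5 * sum(W[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(n))

def update_unit(s, i):
    # Asynchronous update of unit i from its current input field.
    h = sum(W[i][j] * s[j] for j in range(len(s)))
    s[i] = 1 if h >= 0 else -1

state = [-1, 1, -1, -1]
e = energy(state)
for i in [0, 1, 2, 3, 0, 1, 2, 3]:
    update_unit(state, i)
    assert energy(state) <= e   # energy never increases
    e = energy(state)
```

Here the first update flips unit 0, dropping the energy from 0 to −2, after which no unit changes: the net has reached the bottom of its valley.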
Basin of Attraction and Stable States

Given a partial or noisy pattern, the Hopfield network will eventually
stabilise at the closest matching stored pattern
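This pattern-completion behaviour can be sketched end to end. The example below is illustrative, not from the slides: a hypothetical 6-unit net stores a single pattern with the Hebbian rule wᵢⱼ = pᵢ pⱼ (zero diagonal), and asynchronous updates drive a corrupted copy back to the stored memory.

```python
pattern = [1, 1, -1, -1, 1, -1]          # the stored memory
n = len(pattern)

# Hebbian weights: w_ij = p_i * p_j, with zero self-connections.
W = [[pattern[i] * pattern[j] if i != j else 0
      for j in range(n)] for i in range(n)]

def recall(state, sweeps=3):
    s = list(state)
    for _ in range(sweeps):
        for i in range(n):               # asynchronous sweep
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

corrupted = [1, -1, -1, -1, -1, -1]      # two units flipped
print(recall(corrupted))                 # -> [1, 1, -1, -1, 1, -1]
```

Starting from the corrupted input, the flipped units are pulled back one by one until the state exactly matches the stored pattern, which is then a stable state.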

[Figure: worked example of a Hopfield net iterating from an initial state through steps 1–6, with connection weights −2, +3, +3, +3, −1 and −3 marked on the network diagram]
