
SOFT COMPUTING
Adaptive Resonance Theory Network (ART)
Unsupervised Learning

Amit Kumar Mishra


Assistant Professor
Department of Computer Science & Engineering
SISTec, Gandhi Nagar, Bhopal

Explain
• Introduction to ART
• ART network architecture
• ART algorithm
• ART types
• ART 1 training
• ART 2 training
• Applications

Introduction
• Grossberg, S. (1987). Competitive learning: From interactive activation to adaptive resonance. Cognitive Science, 11, 23–63.

• Adaptive Resonance Theory (ART) is an unsupervised clustering neural network architecture.

• ART networks tackle the stability-plasticity dilemma:

• Maintain the plasticity required to learn new patterns, while preventing the modification of patterns that have been learned previously.
• Stability is achieved because no stored pattern is modified if the input does not match any existing pattern.
• Plasticity: a new cluster with new weights is created if the input cannot be classified into an existing cluster.
• Stability: existing clusters are not deleted if a new input does not get classified into them.

ART Network architecture


• Fundamental Architecture

Three groups of neurons are used to build an ART network:

1. Input processing neurons (F1 layer)

2. Clustering units (F2 layer)

3. Control mechanism (controls the degree of similarity of patterns placed in the same cluster)

• Expectations
Each new input is matched against the prototype vector of the cluster it most closely resembles (the expectation).
If the input does not match any prototype vector, a new prototype is selected.

“In this way, new learning does not erode the memories of previous learning.”

[Figure: ART network architecture — input pattern → input layer (F1) → output layer (F2) → categorisation result, with gain control and reset units]

• Operating Principle:
• Initially, all units in F1 and F2 are set to zero.

• Once an input pattern is presented, input signals are sent until the learning trial is completed.

• Vigilance parameter: controls the degree of similarity of the patterns assigned to the same cluster unit.

• Reset: controls the state of each node in the F2 layer.

• F2 layer states: active, inactive and inhibited.


ART Algorithm
[Flowchart: new pattern → recognition → comparison → categorisation; the winning node is adapted, or an uncommitted node is initialised for an unmatched pattern]

• The new input pattern is matched against the stored cluster prototype vectors.
• If it matches, the pattern joins the best cluster and the weights are adapted.
• If not, a new cluster is initialised with the pattern as its prototype (a minimal sketch of this match-or-create loop follows below).
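A minimal Python sketch of this match-or-create loop, for binary patterns in the ART1 spirit; the function name art_present, the use of NumPy, and the overlap-based choice rule are simplifying assumptions, not taken from the slides:

```python
import numpy as np

def art_present(pattern, prototypes, vigilance):
    """One ART presentation (simplified): match the binary pattern against the
    stored prototypes, adapt the best match that passes the vigilance test,
    otherwise initialise a new cluster with the pattern as its prototype."""
    norm_s = pattern.sum()
    # Recognition: rank existing clusters by their overlap with the pattern.
    order = sorted(range(len(prototypes)),
                   key=lambda j: np.logical_and(pattern, prototypes[j]).sum(),
                   reverse=True)
    for j in order:
        x = np.logical_and(pattern, prototypes[j]).astype(pattern.dtype)  # comparison activation
        if x.sum() / max(norm_s, 1) >= vigilance:   # vigilance test
            prototypes[j] = x                        # adapt the winning prototype (fast learning)
            return j
    prototypes.append(pattern.copy())                # no match: commit a new cluster
    return len(prototypes) - 1

# Example: three binary patterns, vigilance 0.7
protos = []
for s in [np.array([1, 1, 0, 0]), np.array([1, 1, 1, 0]), np.array([0, 0, 1, 1])]:
    print(art_present(s, protos, vigilance=0.7))
```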

ART Types
• ART1
• Unsupervised clustering of binary input vectors
• Input patterns can arrive in any order
• Bottom-up and top-down weights are controlled by differential equations
• It can run stably with an unlimited stream of input patterns
• ART2
• Unsupervised clustering of real-valued input vectors
• Includes a combination of normalization and noise suppression
• ART2 network complexity is higher than ART1 because more processing is needed in the F1 layer

ART1
Step 0: Initialize the parameters:
        L > 1 and 0 < ρ ≤ 1

Initialize the weights:
        0 < b_ij(0) < L / (L - 1 + n)   and   t_ji(0) = 1

Step 1: While the stopping condition is false, do steps 2-13.

Step 2: For each training input, do steps 3-12.
Step 3: Set the activations of all F2 units to zero.
        Set the activations of the F1(a) units to the input vector s.
Step 4: Compute the norm of s:   ||s|| = Σ_i s_i

Step 5: Send the input signal from F1(a) to the F1(b) layer:   x_i = s_i

Step 6: For each F2 node that is not inhibited:
        if y_j ≠ -1, then y_j = Σ_i b_ij x_i

Step 7: While reset is true, do steps 8 to 11.

Step 8: Find J such that y_J ≥ y_j for all nodes j.
        If y_J = -1, then all nodes are inhibited and this pattern cannot be clustered.
Step 9: Recompute the activation x of F1(b):   x_i = s_i t_Ji

Step 10: Compute the norm of x:   ||x|| = Σ_i x_i

Step 11: Test for reset:
         if ||x|| / ||s|| < ρ, then set y_J = -1 (inhibit node J) and return to step 8;
         if ||x|| / ||s|| ≥ ρ, reset is false and the search ends.

Step 12: Update the weights for node J (fast learning):
         b_iJ(new) = L x_i / (L - 1 + ||x||),   t_Ji(new) = x_i

Step 13: Test for the stopping condition.
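A compact sketch of the fast-learning procedure above, assuming the training patterns are the rows of a binary NumPy array and the number of cluster units is fixed in advance; the function name art1 and the default parameter values are illustrative:

```python
import numpy as np

def art1(patterns, n_clusters, rho=0.7, L=2.0, max_epochs=10):
    """Fast-learning ART1 clustering of binary input vectors (steps 0-13 above)."""
    n = patterns.shape[1]
    b = np.full((n, n_clusters), 1.0 / (1.0 + n))   # bottom-up weights, 0 < b_ij(0) < L/(L-1+n)
    t = np.ones((n_clusters, n))                    # top-down weights, t_ji(0) = 1
    assignment = np.full(len(patterns), -1)

    for _ in range(max_epochs):                     # step 1: stopping condition (fixed epochs here)
        for idx, s in enumerate(patterns):          # step 2: present each training input
            norm_s = s.sum()                        # step 4: ||s||
            y = s @ b                               # steps 5-6: net input to every F2 node
            inhibited = np.zeros(n_clusters, dtype=bool)
            while True:                             # step 7: search while reset is true
                scores = np.where(inhibited, -1.0, y)
                J = int(np.argmax(scores))          # step 8: candidate winning node
                if scores[J] == -1.0:
                    J = -1                          # all nodes inhibited: pattern not clustered
                    break
                x = s * t[J]                        # step 9: comparison activation
                if x.sum() / max(norm_s, 1) >= rho: # steps 10-11: vigilance test
                    b[:, J] = L * x / (L - 1.0 + x.sum())   # step 12: fast-learning update
                    t[J] = x
                    break
                inhibited[J] = True                 # reset: inhibit J and search again
            assignment[idx] = J
    return assignment, b, t
```

With fast learning, repeated epochs leave the weights unchanged once every pattern resonates with its cluster, which is the usual stopping condition.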



Recognition phase
1. Forward transmission via bottom-up weights

2. The input pattern is matched against the bottom-up weights of the output nodes

3. The best-matching node fires (winner-take-all layer)

4. The pattern is associated with the closest-matching prototype (see the sketch below)
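A minimal sketch of the recognition step, assuming the bottom-up weights are held in a NumPy matrix B of shape (number of inputs × number of output nodes); the helper name recognise is illustrative:

```python
import numpy as np

def recognise(B, x):
    """Winner-take-all recognition: return the index of the output (F2) node
    whose bottom-up weights respond most strongly to the input pattern x."""
    y = B.T @ x               # net input y_j = sum_i b_ij * x_i for every output node
    return int(np.argmax(y))  # only the best-matching node fires
```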



Comparison phase
1. Backward transmission via top-down weights

2. Vigilance test: the class prototype is compared with the input pattern (see the worked example below)

3. If the pattern matches, categorisation is successfully completed and resonance is achieved

4. If not, the winning node is inhibited and the next-best match is tried

5. Repeat until:

• the vigilance test passes, or

• all candidate nodes are exhausted
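A small worked example with illustrative values: suppose the binary input is s = (1, 1, 1, 0) and the winning cluster's top-down prototype is t_J = (1, 1, 0, 0). The comparison activation is x = s AND t_J = (1, 1, 0, 0), so the match ratio is ||x|| / ||s|| = 2/3 ≈ 0.67. With vigilance ρ = 0.5 the test passes and resonance is achieved; with ρ = 0.8 it fails, node J is inhibited, and the search moves on to the next-best node (or to a new cluster if none remain).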

Vigilance Threshold
• The vigilance threshold sets the quality of clustering
• It specifies the amount of similarity required between an input and a cluster prototype
• Low threshold:
• Large mismatches are accepted
• Few, large clusters
• Coarse categories and more misclassification
• High threshold:
• Only small mismatches are accepted
• Many, small clusters
• High precision

ART2
Step 0: Initialise the parameters: a, b, c, d, e, θ (noise suppression), α (learning rate) and ρ (vigilance), with cd/(1 - d) ≤ 1.
        Initialise the weights:   t_ji(0) = 0,   b_ij(0) ≤ 1 / ((1 - d)√n)

Step 1: Do steps 2-12 for the specified number of epochs.
Step 2: For each input vector s, do steps 3-11.
Step 3: Update the F1 unit activations:
        u_i = 0,   w_i = s_i,   p_i = 0,   q_i = 0,   x_i = s_i / (e + ||s||),   v_i = f(x_i)

        Update the F1 unit activations again:
        u_i = v_i / (e + ||v||),   w_i = s_i + a u_i,   p_i = u_i,
        x_i = w_i / (e + ||w||),   q_i = p_i / (e + ||p||),   v_i = f(x_i) + b f(q_i)

        where f is the noise-suppression function: f(x) = x if x ≥ θ, and 0 otherwise.

Step 4: Compute the signals to the F2 units:   y_j = Σ_i b_ij p_i

Step 5: While reset is true, do steps 6-7.

Step 6: Find the F2 unit with the largest signal (J is defined such that y_J ≥ y_j for j = 1…m).
Step 7: Check for reset:
        u_i = v_i / (e + ||v||),   p_i = u_i + d t_Ji,   r_i = (u_i + c p_i) / (e + ||u|| + c ||p||)

        If ||r|| < ρ - e: set y_J = -1 (inhibit unit J); reset is true; repeat step 5.

        If ||r|| ≥ ρ - e: update w_i = s_i + a u_i,   x_i = w_i / (e + ||w||),
        q_i = p_i / (e + ||p||),   v_i = f(x_i) + b f(q_i); reset is false; go to step 8.



Step 8: Do steps 9-11 for the specified number of learning iterations.
Step 9: Update the weights for the winning unit J:
        t_Ji = α d u_i + (1 + α d (d - 1)) t_Ji
        b_iJ = α d u_i + (1 + α d (d - 1)) b_iJ

Step 10: Update the F1 activations:
         u_i = v_i / (e + ||v||),   w_i = s_i + a u_i,   p_i = u_i + d t_Ji,
         x_i = w_i / (e + ||w||),   q_i = p_i / (e + ||p||),   v_i = f(x_i) + b f(q_i)

Step 11: Check the stopping condition for weight updating.
Step 12: Check the stopping condition for the number of epochs.
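A small Python sketch of the F1-layer normalisation and noise suppression used in steps 3 and 10; the parameter names follow the listing above, but the default values, the tiny e used to avoid division by zero, and the helper name f1_update are illustrative assumptions:

```python
import numpy as np

def f(x, theta):
    """Noise suppression: keep activations at or above theta, zero out the rest."""
    return np.where(x >= theta, x, 0.0)

def f1_update(s, u, t_J=None, a=10.0, b=10.0, d=0.9, e=1e-7, theta=0.2):
    """One pass of the ART2 F1-layer update (as in steps 3 and 10) for input s
    and current u; t_J is the winning unit's top-down weight vector, if any."""
    w = s + a * u                              # mix the input with internal feedback
    p = u if t_J is None else u + d * t_J      # add the top-down expectation once a winner exists
    x = w / (e + np.linalg.norm(w))            # normalisation
    q = p / (e + np.linalg.norm(p))
    v = f(x, theta) + b * f(q, theta)          # noise-suppressed combination
    u_next = v / (e + np.linalg.norm(v))       # the u used in the next pass
    return u_next, p
```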

Applications
• Natural Language processing

• Document clustering

• Document retrieval

• Image Segmentation

• Character recognition

• Data Mining

• Fuzzy partitioning

References:
• Sivanandam, S. N., & Deepa, S. N., Principles of Soft Computing, 2nd edition, Wiley.

• https://slideplayer.com/slide/5270012/
