Soft Computing - Adaptive Resonance Theory - Amit Mishra - SISTec Gandhi Nagar
SOFT COMPUTING
Adaptive Resonance Theory (ART) Network
Unsupervised Learning
Outline
•Introduction to ART
•ART Network architecture
•ART algorithm
•ART Types
•ART 1 training
•ART 2 training
•Applications
SISTec, Gandhi Nagar, Bhopal
Introduction
• Grossberg, S. (1987). Competitive learning: From interactive activation to adaptive resonance. Cognitive Science, 11, 23-63.
• Expectations
Each new input is matched against the prototype vector of the cluster it most closely resembles (the expectation).
If the input does not match any existing prototype closely enough, a new prototype is created for it.
"In this way, new learning does not erode the memories of previous learning."
[Architecture diagram: INPUT PATTERN → INPUT LAYER (F1) → OUTPUT LAYER (F2) → CATEGORISATION RESULT, with GAIN and RESET control units]
• Operating Principle:
• Initially, all units in F1 and F2 are set to zero.
• Once an input pattern is presented, input signals are sent until the learning trial is completed.
• Vigilance parameter: the degree of similarity required of patterns assigned to the same cluster unit.
ART Algorithm
[Flowchart: Initialise → present NEW PATTERN → Recognition → Comparison (vigilance test) → Categorisation; on a match the winner node is adapted, otherwise a reset occurs and, if all nodes fail, an uncommitted node is recruited]
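The algorithm flow above can be sketched as a generic control loop. The callbacks `score`, `match_ok`, `adapt`, and `new_cluster` are illustrative placeholders for the model-specific rules (not a standard API):

```python
# Hypothetical sketch of the ART control loop: recognition, comparison,
# adaptation of the winner, and recruitment of an uncommitted node.
def art_step(pattern, clusters, score, match_ok, adapt, new_cluster):
    disabled = set()                     # nodes knocked out by reset
    while True:
        active = [j for j in range(len(clusters)) if j not in disabled]
        if not active:                   # all nodes reset: use uncommitted node
            clusters.append(new_cluster(pattern))
            return len(clusters) - 1
        winner = max(active, key=lambda j: score(pattern, clusters[j]))
        if match_ok(pattern, clusters[winner]):  # comparison (vigilance test)
            adapt(clusters[winner], pattern)     # resonance: adapt winner node
            return winner
        disabled.add(winner)             # reset: suppress winner, search again
```

With binary patterns, `score` could be the overlap of input and prototype, `match_ok` the vigilance test, and `adapt` the componentwise AND of winner and pattern.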
ART Types
• ART1
• Unsupervised clustering of binary input vectors
• Input patterns can arrive in any order
• Bottom-up and top-down weights are controlled by differential equations
• It runs stably on an unlimited stream of input patterns
• ART2
• Unsupervised clustering of real-valued input vectors
• Includes a combination of normalization and noise suppression
• ART2's network complexity is higher than ART1's because more processing is needed in the F1 layer
ART1
Step 0: Initialize the parameters.
Step 5: Send the input signal from the F1(a) layer to the F1(b) layer.
Recognition phase
1. Forward transmission via the bottom-up weights selects a winning F2 unit.
Comparison phase
1. Backward transmission via the top-down weights tests the winner against the vigilance criterion.
Repeat until a unit passes the vigilance test, or all cluster units are exhausted.
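A minimal fast-learning ART1 sketch along these lines, assuming binary inputs, L = 2, and the usual fast-learning weight updates; the function and parameter names are illustrative, not from the slides:

```python
import numpy as np

# Hedged fast-learning ART1 sketch (binary, nonzero input rows assumed).
def art1_train(patterns, rho=0.7, L=2.0, n_epochs=5):
    top_down = []    # t_J: binary prototype stored at each committed F2 node
    bottom_up = []   # b_J: normalized bottom-up weights for each node
    labels = np.zeros(len(patterns), dtype=int)
    for _ in range(n_epochs):
        for p, s in enumerate(patterns):
            scores = [b @ s for b in bottom_up]  # recognition: bottom-up pass
            disabled = set()
            while True:
                active = [j for j in range(len(scores)) if j not in disabled]
                if not active:                   # exhausted: commit a new node
                    top_down.append(s.astype(float))
                    bottom_up.append(L * s / (L - 1 + s.sum()))
                    labels[p] = len(top_down) - 1
                    break
                J = max(active, key=lambda j: scores[j])
                match = np.minimum(s, top_down[J])     # comparison: s AND t_J
                if match.sum() / s.sum() >= rho:       # vigilance test passed
                    top_down[J] = match                # fast-learning update
                    bottom_up[J] = L * match / (L - 1 + match.sum())
                    labels[p] = J
                    break
                disabled.add(J)                        # reset: try next node
    return labels, top_down
```

Two identical patterns land in one cluster, while a disjoint pattern fails the vigilance test against that prototype and commits a new node.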
Vigilance Threshold
• The vigilance threshold sets the quality of
clustering
• It determines the amount of similarity required
with each prototype
• Low threshold:
• Large mismatches are tolerated
• Few, large clusters
• More misclassification
• High threshold:
• Only small mismatches are tolerated
• Many small clusters
• High precision
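A tiny numeric illustration of this tradeoff (the vectors and vigilance values are chosen for the example):

```python
import numpy as np

# One stored prototype and a new input that shares half its active bits.
prototype = np.array([1, 1, 1, 1, 0, 0, 0, 0])
pattern   = np.array([1, 1, 0, 0, 0, 0, 1, 1])

# Binary match ratio |pattern AND prototype| / |pattern| = 2/4 = 0.5
match_ratio = np.minimum(pattern, prototype).sum() / pattern.sum()

low_rho, high_rho = 0.3, 0.8
# Low vigilance: 0.5 >= 0.3, the pattern joins the existing (large) cluster.
joins_existing_low = match_ratio >= low_rho
# High vigilance: 0.5 < 0.8, reset fires and a new (small) cluster is created.
joins_existing_high = match_ratio >= high_rho
```

The same input is thus absorbed into a coarse cluster at low vigilance but spawns its own cluster at high vigilance.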
ART2
Step 0: Initialise the parameters.
Step 1: Perform Steps 2-12 for the specified number of epochs.
Step 2: For each input vector s, perform Steps 3-11.
Step 3: Update the F1 unit activations.
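The combination of normalization and noise suppression performed in the ART2 F1 layer can be illustrated with a simplified sketch; the threshold `theta` and the single-pass sequencing are assumptions, compressed relative to the full F1 equations:

```python
import numpy as np

def f1_preprocess(s, theta=0.1):
    """Simplified sketch of ART2-style F1 preprocessing (assumed form):
    normalize, suppress small (noise) components, then renormalize."""
    x = s / (np.linalg.norm(s) + 1e-12)      # normalize to unit length
    v = np.where(x >= theta, x, 0.0)         # noise suppression threshold
    return v / (np.linalg.norm(v) + 1e-12)   # renormalize the survivors
```

A small component such as 0.01 next to components of order 1 is zeroed out, while the result still has unit length, so cluster comparison is driven by pattern shape rather than magnitude.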
Applications
• Natural Language processing
• Document clustering
• Document retrieval
• Image Segmentation
• Character recognition
• Data Mining
• Fuzzy partitioning
References:
• Principles of Soft Computing, Dr. S.N. Sivanandam, Dr. S.N
• https://slideplayer.com/slide/5270012/