
Instance-based Learning


 Instance-based learning methods simply store the training examples
 Generalizing beyond these examples is postponed until a new instance must be classified
 Each time a new query instance is encountered, its relationship to the previously stored examples is examined in order to assign a target function value for the new instance
Instance-based Learning
Different Learning Methods
 Eager Learning
 An explicit description of the target function is built on the whole training set
 Instance-based Learning
 Learning = storing all training instances
 Classification = assigning a target function value to a new instance
 Referred to as "Lazy" learning
Instance-based Learning

It's very similar to a desktop!!
Instance-based Learning
 K-Nearest Neighbor Algorithm
 Locally Weighted Regression
 Radial Basis Functions
 Case-Based Reasoning
K-Nearest Neighbor
 All instances correspond to points in an n-dimensional Euclidean space
 Classification is delayed until a new instance arrives
 Classification is done by comparing feature vectors of the different points
 Target function may be discrete or real-valued


[Figure: 1-Nearest Neighbor vs. 3-Nearest Neighbor classification of a query point]
K-Nearest Neighbor
The k-Nearest Neighbor algorithm for approximating a discrete-valued target function
The k-Nearest Neighbor algorithm for approximating a real-valued target function
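
The formulas on these two slides were images in the original deck; the standard formulation (after Mitchell's Machine Learning) is a majority vote among the k nearest neighbors in the discrete-valued case, and their mean in the real-valued case:

\hat{f}(x_q) \leftarrow \underset{v \in V}{\arg\max} \sum_{i=1}^{k} \delta(v, f(x_i)), \quad \text{where } \delta(a, b) = 1 \text{ if } a = b \text{, else } 0

\hat{f}(x_q) \leftarrow \frac{\sum_{i=1}^{k} f(x_i)}{k}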
Example
Exercise

Customer   Age   Income (K)   No. of cards   Response
John        35       35            3          Yes
Rachel      22       50            2          No
Ruth        63      200            1          No
Tom         59      170            1          No
Neil        25       40            4          Yes
David       37       50            2          ?

Using the 3-Nearest Neighbor algorithm, predict David's response.
Answer

Distances from David (x1 = 37, x2 = 50, x3 = 2), using unnormalized Euclidean distance:

Customer   x1   x2   x3      d      Rank   Response
John       35   35    3    15.17     2      Yes
Rachel     22   50    2    15.00     1      No
Ruth       63  200    1   152.24     5      No
Tom        59  170    1   122.02     4      No
Neil       25   40    4    15.75     3      Yes

The three nearest neighbors are Rachel (No), John (Yes), and Neil (Yes), so the 3-NN majority vote predicts Yes for David.
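
A minimal Python sketch of this computation (unnormalized Euclidean distance, as in the exercise):

import math

# Training data: (name, [age, income_k, n_cards], response)
train = [
    ("John",   [35,  35, 3], "Yes"),
    ("Rachel", [22,  50, 2], "No"),
    ("Ruth",   [63, 200, 1], "No"),
    ("Tom",    [59, 170, 1], "No"),
    ("Neil",   [25,  40, 4], "Yes"),
]
query = [37, 50, 2]  # David

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Sort neighbors by distance to the query and take the 3 nearest.
ranked = sorted(train, key=lambda row: euclidean(row[1], query))
k_nearest = ranked[:3]
for name, x, label in k_nearest:
    print(f"{name}: d = {euclidean(x, query):.2f}, response = {label}")

# Majority vote among the k nearest labels.
votes = [label for _, _, label in k_nearest]
prediction = max(set(votes), key=votes.count)
print("Predicted response for David:", prediction)  # -> Yes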
Leave-One-Out Cross-Validation for KNN
Watch: https://www.youtube.com/watch?v=Lvs6G_gdgCM
Distance-Weighted Nearest Neighbor Algorithm
 One refinement to the k-Nearest Neighbor algorithm is to weight the contribution of each of the k neighbors according to their distance to the query point xq, giving greater weight to closer neighbors.
 For example, in the k-Nearest Neighbor algorithm, which approximates discrete-valued target functions, we might weight the vote of each neighbor according to the inverse square of its distance from xq.
Distance-Weighted Nearest Neighbor Algorithm for approximating a discrete-valued target function
Distance-Weighted Nearest Neighbor Algorithm for approximating a real-valued target function
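
As above, the formulas were images in the original slides; the standard distance-weighted rules (after Mitchell) weight each neighbor by the inverse square of its distance:

w_i = \frac{1}{d(x_q, x_i)^2}

\hat{f}(x_q) \leftarrow \underset{v \in V}{\arg\max} \sum_{i=1}^{k} w_i \, \delta(v, f(x_i)) \quad \text{(discrete-valued)}

\hat{f}(x_q) \leftarrow \frac{\sum_{i=1}^{k} w_i f(x_i)}{\sum_{i=1}^{k} w_i} \quad \text{(real-valued)}

If x_q exactly matches a training instance x_i, assign \hat{f}(x_q) = f(x_i) to avoid division by zero.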
Locally Weighted Regression
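The content of this slide was a figure in the original deck. In brief, locally weighted regression generalizes nearest-neighbor methods by fitting an explicit (e.g., linear) approximation to the target function in the neighborhood of the query point xq. One common error criterion (after Mitchell) minimizes the squared error over the k nearest neighbors, weighted by a kernel K that decreases with distance:

E(x_q) = \frac{1}{2} \sum_{x \in k\text{-NN}(x_q)} \big( f(x) - \hat{f}(x) \big)^2 \, K\big(d(x_q, x)\big)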
Case-Based Reasoning (CBR)
 In CBR, instances are typically represented using richer symbolic descriptions, and the methods used to retrieve similar instances are correspondingly more elaborate
 CBR has been applied to problems such as the conceptual design of mechanical devices based on a stored library of previous designs
Case-Based Reasoning in CADET
 Instances represented by rich structural descriptions
 Multiple cases retrieved (and combined) to form a solution to the new problem
 Tight coupling between case retrieval and problem solving
Bottom line:
 Simple matching of cases is useful for tasks such as answering help-desk queries
 Area of ongoing research
Case-Based Reasoning in CADET
CADET: 75 stored examples of mechanical devices
 Each training example: <qualitative function, mechanical structure>
 New query: desired function
 Target value: mechanical structure for this function
Distance metric: match qualitative function descriptions
A stored case: T-junction pipe
[Figure: the structure shows a T-junction with inflows Q1,T1 and Q2,T2 joining into outflow Q3,T3; the function graph shows Q1 and Q2 each influencing Q3 positively (+), and T1 and T2 each influencing T3 positively (+). Q = waterflow, T = temperature]
A problem specification: Water faucet
[Figure: the desired function graph relates the control signals Cc, Ch and inputs Qc, Qh, Tc, Th to the mixed outputs Qm and Tm through qualitative + and - influences; the mechanical structure is unknown (?)]
Neural Network
 Based on nature, neural networks are the usual representation we make of the brain: neurons interconnected with other neurons, forming a network
 A simple piece of information transits through many of them before becoming an actual action, like "move the hand to pick up this pencil"
Artificial Neural Network (ANN)
The operation of a complete neural network is straightforward: one enters variables as inputs (for example an image, if the neural network is supposed to tell what is on an image), and after some calculations, an output is returned (following the first example, a label describing what is on the image).

Neural networks can usually be read from left to right. Here, the first layer is the layer in which inputs are entered. There are two internal layers (called hidden layers) that do some math, and one last layer contains all the possible outputs.
What does a neuron do?
 It takes all the values from connected neurons, multiplies each by its respective weight, adds them up, and applies an activation function. Then the neuron is ready to send its new value to other neurons.
Activation Function
 The activation function usually serves to turn the total value calculated before into a number between 0 and 1
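
A minimal sketch of a single neuron in Python, using the sigmoid as the activation function (the inputs, weights, and bias below are made-up illustration values):

import math

def sigmoid(z):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    # Weighted sum of the incoming values, then the activation function.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Example: three incoming values with illustrative weights.
print(neuron([0.5, 0.9, 0.1], [0.4, -0.2, 0.7], bias=0.1))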
Radial Basis Function (RBF)
 Radial basis function (RBF) networks have a fundamentally different architecture from most neural network architectures.
 Most neural network architectures consist of many layers and introduce nonlinearity by repetitively applying nonlinear activation functions.
 An RBF network, on the other hand, consists only of an input layer, a single hidden layer, and an output layer.
Radial Basis Function (RBF)
 The input layer is not a computation layer; it just receives the input data and feeds it into the special hidden layer of the RBF network.
 The computation that happens inside the hidden layer is very different from most neural networks, and this is where the power of the RBF network comes from.
 The output layer performs the prediction task, such as classification or regression.
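
A minimal sketch of an RBF network forward pass, assuming Gaussian hidden units (the centers, widths, and output weights here are illustrative placeholders that would normally be learned from data):

import math

def gaussian_rbf(x, center, width):
    # Hidden-unit activation depends only on the distance to the center.
    dist_sq = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-dist_sq / (2 * width ** 2))

def rbf_forward(x, centers, widths, out_weights, bias):
    # Hidden layer: one Gaussian per center; output layer: linear combination.
    hidden = [gaussian_rbf(x, c, w) for c, w in zip(centers, widths)]
    return bias + sum(h * w for h, w in zip(hidden, out_weights))

# Illustrative 2-D example with two hidden units.
centers = [[0.0, 0.0], [1.0, 1.0]]
widths = [0.5, 0.5]
out_weights = [1.0, -1.0]
print(rbf_forward([0.2, 0.1], centers, widths, out_weights, bias=0.0))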
Ref
 https://www.youtube.com/watch?v=pAJSisdFpeI
