
Qiangfu Zhao NN-I: Lecture 8 1

Lecture 8:
Function Approximation and
RBF Neural Networks
What is function approximation?

Consider two sets F ⊂ R^m and L ⊂ R^l, and a mapping from F to L. Suppose that for the set F, a mapping f is given such that f(x) ∈ L for x ∈ F. The so-called function approximation problem is to find a mapping g satisfying

  ‖f(x) − g(x)‖ < ε, for x ∈ F,

where ε > 0 is the tolerance, and ‖·‖ can be any norm.
Some practical considerations

Since the original mapping is unknown, it cannot be used to evaluate the quality of a solution. In practice, we should find g such that

  ‖f(x) − g(x)‖ < ε, for x in the training set.

In general, function approximation is an ill-posed problem. A solution can be good for the training set, but bad for the rest of F. To get solutions that generalize well, regularization constraints are usually used.
Polynomial based approximation

One model for function approximation is the polynomial. The Taylor expansion is one example. If the function is smooth enough, the Taylor expansion is given as follows:

  f(x) = f(a) + f'(a)(x − a)/1! + ... + f^(n)(a)(x − a)^n/n! + R_{n+1},

where R_{n+1} is the remainder, or the approximation error. That is, we can get an approximated value of f(x) using information at only one point.
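As an illustration (not part of the original slides), the truncated Taylor polynomial can be evaluated directly; here the approximated function is assumed to be exp, whose derivatives at any point a all equal exp(a):

```python
import math

def taylor_exp(x, a=0.0, n=5):
    """Degree-n Taylor polynomial of exp around the point a.

    For f = exp, every derivative f^(k)(a) equals exp(a), so each
    term is exp(a) * (x - a)^k / k!.
    """
    return sum(math.exp(a) * (x - a) ** k / math.factorial(k)
               for k in range(n + 1))

# The truncation error (the remainder R_{n+1}) shrinks quickly
# for x close to the expansion point a.
error = abs(taylor_exp(0.5, n=5) - math.exp(0.5))
```

Increasing n drives the error toward zero near a, but a single expansion point still gives poor approximations far from a, which motivates the multi-point methods on the next slide.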
Approximation based on multi-points

In practice, one point is not enough for function approximation. We can use many points.

Nearest neighbor approximation:
Replace x with the nearest example a, and approximate with
  f(x) ≈ f(a).

Linear approximation:
If the first order derivative exists, we can approximate with
  f(x) ≈ f(a) + f'(a)(x − a).
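A minimal sketch of the two schemes, assuming f = sin as the function being approximated and a small set of sample points (both choices are illustrative, not from the lecture):

```python
import math

# Points where the function value (and derivative) are assumed known.
samples = [0.0, 0.5, 1.0, 1.5, 2.0]

def f(x):
    return math.sin(x)

def nearest_neighbor(x):
    a = min(samples, key=lambda p: abs(p - x))  # nearest known point
    return f(a)                                  # f(x) ~ f(a)

def linear_approx(x):
    a = min(samples, key=lambda p: abs(p - x))
    return f(a) + math.cos(a) * (x - a)          # f(x) ~ f(a) + f'(a)(x - a)
```

The linear approximation uses the derivative at the nearest point, so its error decays like (x − a)² instead of |x − a|.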
General polynomial approximation

In general, a function can be approximated by

  f(x) = Σ_{i=0}^{n−1} a_i x^i.

The coefficients can be found using the training examples. Methods include:
1) Solving a simultaneous linear equation;
2) Solving a quadratic optimization problem when the data are noisy.
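Both methods can be sketched with NumPy; least squares covers the noisy case, and reduces to solving the simultaneous linear equations when the data are exact and the number of points matches the number of coefficients (the target y = x² and the noise level below are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 20)
y = x ** 2 + 0.01 * rng.standard_normal(x.size)   # noisy samples of x^2

n = 3                                   # number of coefficients a_0 .. a_{n-1}
A = np.vander(x, n, increasing=True)    # design matrix: columns x^0, x^1, x^2

# Least squares minimizes ||A a - y||^2, a quadratic optimization problem.
a, *_ = np.linalg.lstsq(A, y, rcond=None)
```

The recovered coefficients should be close to (0, 0, 1), i.e. the polynomial x².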
Transformation based approximation

The Fourier transform is one example:

  Analysis:  F(jω) = ∫ f(x) e^{−jωx} dx
  Synthesis: f(x) = (1/(2π)) ∫ F(jω) e^{jωx} dω

Instead of the sinusoidal functions, we can use any orthogonal functions.
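A discrete analogue of the analysis/synthesis pair (a sketch using NumPy's FFT; the slide's formulas are for the continuous transform):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 64, endpoint=False)
signal = np.sin(2 * np.pi * 3 * x) + 0.5 * np.cos(2 * np.pi * 5 * x)

spectrum = np.fft.fft(signal)               # analysis: project onto sinusoids
reconstructed = np.fft.ifft(spectrum).real  # synthesis: weighted sum of sinusoids
```

Analysis followed by synthesis recovers the original samples up to rounding error, which is the discrete expression of the transform pair above.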
Basis function approximation

In fact, both polynomial and transformation based approximations can be formulated in a more general form as follows:

  f(x) = Σ_{k=0}^{N} a_k φ_k(x),

where φ_k(x) is called a basis function.
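The general form can be written once and reused with different basis families; the polynomial and cosine bases below are illustrative choices of φ_k:

```python
import math

def approximate(x, coeffs, basis):
    """f(x) ~ sum_k a_k * phi_k(x), for any list of basis functions."""
    return sum(a * phi(x) for a, phi in zip(coeffs, basis))

# Polynomial basis: phi_k(x) = x^k  (the k=k default captures k by value).
poly_basis = [lambda x, k=k: x ** k for k in range(4)]

# Sinusoidal basis: phi_k(x) = cos(k x).
cos_basis = [lambda x, k=k: math.cos(k * x) for k in range(4)]
```

Swapping the basis list changes the model while the expansion itself stays the same.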
Questions

What is the basis function in polynomial approximation?
What is the basis function in the Fourier transform?
What is the basis function in a multi-layer perceptron?

All basis functions given above are global functions.
Function approximation with local basis functions

One example of a local basis function is the Gaussian function, given as follows:

  φ_k(x) = exp(−‖x − x_k‖² / (2σ_k²)),

where x_k and σ_k² are, respectively, the center and the variance of the k-th basis function. This kind of basis function is called a radial basis function (RBF).
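The Gaussian basis function translates directly into code (a sketch; vectors are plain tuples here):

```python
import math

def gaussian_rbf(x, center, sigma):
    """phi_k(x) = exp(-||x - x_k||^2 / (2 * sigma_k^2))."""
    sq_dist = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

# The response is 1 at the center and decays with distance,
# which is what makes this basis function "local".
```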
What kind of basis function shall we use?

Local basis functions are good for interpolation, and global basis functions are good for extrapolation.
Generally speaking, a good approximator should contain both local and global basis functions.
Evolutionary algorithms can be used to select different basis functions in one system.
RBF neural networks

An RBF neural network is a three-layer neural network:
Input neurons: same as in the MLP
Output neurons: linear combinations
Hidden neurons: basis functions

For a one-output network, the output is given by

  o = Σ_{k=1}^{N} w_k φ(‖x − x_k‖).
Training of RBF neural networks

Provide the training patterns with known function values.
Use the training patterns as the centers of the radial basis functions.
Find the weights of the output neuron by solving simultaneous linear equations.
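The three steps above can be sketched as follows, with Gaussian basis functions and a sine target chosen for illustration:

```python
import numpy as np

def train_rbf(X, d, sigma=1.0):
    """Use every training pattern as a center, then solve Phi w = d."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise ||x_i - x_k||^2
    Phi = np.exp(-sq / (2.0 * sigma ** 2))               # interpolation matrix
    return np.linalg.solve(Phi, d)                        # simultaneous linear equations

def rbf_output(x, X, w, sigma=1.0):
    """o = sum_k w_k * phi(||x - x_k||)."""
    phi = np.exp(-((X - x) ** 2).sum(-1) / (2.0 * sigma ** 2))
    return phi @ w

X = np.array([[0.0], [1.0], [2.0], [3.0]])   # training patterns (= centers)
d = np.sin(X[:, 0])                           # known function values
w = train_rbf(X, d)
```

With one center per pattern the network interpolates the training data exactly, which is also why it scales poorly when the training set is large (the first problem on the next slide).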
Problems to be solved

Too many hidden neurons if the number of training patterns is large:
- Selection of important training data
- Re-location of the centers to improve the performance of the network
- Optimizing the variance of each RBF

For noisy data, the approximation is ill-posed:
- Regularization is needed
How to regularize?

The basic idea of regularization is to add a penalty to the error function. Minimizing the error along with the penalty can find a solution that is as smooth as possible. The cost function is given by

  E(f) = (1/2) Σ_i (d_i − f(x_i))² + (λ/2) ‖Pf‖²,

where λ > 0 is the regularization parameter, and P is a linear differential operator.
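For Gaussian RBF networks, minimizing this cost leads (for suitable choices of P) to a damped linear system; the sketch below uses the classical (Φ + λI)w = d form as a ridge-style stand-in for the smoothness penalty:

```python
import numpy as np

def train_rbf_regularized(X, d, sigma=1.0, lam=0.1):
    """Solve (Phi + lam * I) w = d instead of Phi w = d.

    The lam * I term plays the role of the smoothness penalty
    lambda * ||Pf||^2 in the cost function (ridge-style shorthand).
    """
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-sq / (2.0 * sigma ** 2))
    return np.linalg.solve(Phi + lam * np.eye(len(X)), d)

X = np.array([[0.0], [0.5], [1.0], [1.5]])   # training patterns (= centers)
d = np.array([0.0, 0.5, 0.8, 1.0])           # noisy target values
```

Larger lam gives smoother, smaller-norm weight vectors at the cost of no longer fitting the training data exactly.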
A generalized network

[figure omitted]