AIML Module 5

An artificial neuron, simply called a node in an ANN, is a computing unit modelled on the biological neuron, and an Artificial Neural Network (ANN) is a system that consists of many such computing units working together. ANNs can be trained to solve non-linear and complex problems. The most popular applications of ANNs in the field of computer science are natural language processing, image recognition, speech recognition, character recognition, text processing, computer vision, etc. ANNs have also been used considerably in other fields such as the chemical industry, medicine, robotics, communications and banking. This chapter aims to introduce the concepts necessary to understand the working of Artificial Neural Networks.

Learning Objectives:

* Understand the basics of the human nervous system and biological neurons, which are the processing units of the brain
* Learn about the McCulloch & Pitts mathematical model of an artificial neuron
* Know about the structure of an Artificial Neural Network
* Introduce different types of activation functions
* Explore the first neural network model, called the 'Perceptron', a binary classifier used for supervised learning
* Introduce different types of Artificial Neural Networks
* Understand the concepts of learning in a Multi-Layer Perceptron trained with Back-Propagation
* Study the Radial Basis Function Neural Network (RBFNN), a multi-layer perceptron particularly useful for interpolation, function approximation, time series prediction, classification and system control
* Explore the Self-Organizing Feature Map (SOFM), which is used for unsupervised learning

10.1 INTRODUCTION

The human nervous system enables us to sense where we are and to recognize and correlate the things around us. It consists of nerve cells, typically called neurons. The nervous system has two divisions: the Central Nervous System (CNS), which consists of the brain and the spinal cord, and the Peripheral Nervous System (PNS). The neurons are of three types: sensory neurons, motor neurons and interneurons. Sensory neurons collect information from different parts of the body and bring it into the CNS, motor neurons carry information from the CNS and transmit commands to the body, and interneurons connect one neuron to another. The basic functionality of a neuron is thus to receive information, process it and then transmit it to another neuron or to a body part.

10.2 BIOLOGICAL NEURONS

A typical biological neuron has four parts called dendrites, soma, axon and synapse. The body of the neuron is called the soma. Dendrites accept the input information, which is processed in the cell body, the soma. A neuron is connected by axons to around 10,000 other neurons, and through these axons the processed information is passed from one neuron to another. A neuron gets fired if the input information crosses a threshold value, and it transmits signals to another neuron through a synapse. A synapse fires with electrical impulses called spikes, which are transmitted to another neuron. A single neuron can receive synaptic inputs from one or multiple neurons. These neurons form a network structure which processes input information and gives out a response. The simple structure of a biological neuron is shown in Figure 10.1.

Figure 10.1: A Biological Neuron

10.3 ARTIFICIAL NEURONS

An artificial neuron is a mathematical model of a biological neuron, as shown in Figure 10.2. It receives a set of inputs and computes their sum, which is given to an activation function; if the sum crosses a threshold value, the neuron gets fired.

Figure 10.2: An Artificial Neuron

10.3.1 McCulloch & Pitts Neuron Mathematical Model

The McCulloch & Pitts model treats the neuron as a computing unit that receives a set of binary inputs x_1, x_2, ..., x_n. The summation function 'Net-sum' computes the sum of the inputs as shown in Eq. (10.1):

\text{Net-sum} = \sum_{i=1}^{n} x_i        (10.1)

The activation function is a binary step function which outputs a value 1 if the Net-sum is above the threshold value θ, and a 0 if the Net-sum is below the threshold value θ. The activation function is applied to the Net-sum as shown in Eq. (10.2) and Eq. (10.3).

f(x) = \text{Activation function}(\text{Net-sum})        (10.2)

f(x) = \begin{cases} 1, & x \ge \theta \\ 0, & x < \theta \end{cases}        (10.3)

Figure 10.3: McCulloch & Pitts Neuron Mathematical Model

The McCulloch & Pitts neuron can represent only a few Boolean functions with a binary output. For example, an AND neuron fires only when all of its inputs are 1, whereas an OR neuron fires when even one input is 1. In this model the inputs are binary and the weights and threshold values are fixed.
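To see the model in action, the short Python sketch below (an illustration added to this module, not the book's code; the function name mp_neuron and the two-input demo are assumptions) implements Eqs. (10.1)-(10.3) and realises the AND and OR functions simply by choosing the threshold θ.

```python
def mp_neuron(inputs, theta):
    """McCulloch & Pitts neuron: fires (outputs 1) when the plain sum
    of the binary inputs reaches the fixed threshold theta."""
    net_sum = sum(inputs)                  # Eq. (10.1): Net-sum = sum of inputs
    return 1 if net_sum >= theta else 0    # Eqs. (10.2)-(10.3): binary step

# theta = 2 gives AND (fires only when all inputs are 1);
# theta = 1 gives OR (fires when even one input is 1).
for x1 in (0, 1):
    for x2 in (0, 1):
        print((x1, x2),
              "AND:", mp_neuron((x1, x2), theta=2),
              "OR:",  mp_neuron((x1, x2), theta=1))
```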
10.3.2 Artificial Neural Network Structure

An Artificial Neural Network (ANN) imitates the human brain. It is represented as a directed graph with a set of neuron nodes and the edges connecting them, as shown in Figure 10.4. The nodes in the graph process information in parallel. The network consists of an input layer, a hidden layer and an output layer. The input layer receives the input information and passes it to the nodes in the hidden layer. The edges connecting the nodes in the hidden layer are associated with synaptic weights. The nodes, or neurons, perform some computation on the input information (x_1, x_2, ..., x_n) received, and if the weighted sum of the inputs to a neuron exceeds the threshold, or the activation level, of the neuron, then the neuron fires. Each neuron has an activation function that determines its output. The neuron transforms the input signals by computing the sum of the products of the input signals and their weights and adding the bias to it. Then, the activation function maps the weighted input sum to an output value. The node in the output layer gives the output as a single value.

Figure 10.4: Artificial Neural Network Structure

10.3.3 Activation Functions

Activation functions are mathematical functions associated with each neuron in the network that map input signals to output signals. The activation function decides whether a neuron should fire or not. Typical activation functions range from the simple step function commonly used in binary perceptrons to non-linear functions such as the sigmoid, which maps the input into the range (0, 1). Some of the activation functions used in ANNs are given below.

Identity Function or Linear Function

f(x) = x, \ \forall x

The value of f(x) increases linearly, or proportionally, with the value of x.

Ramp Function

f(x) = \begin{cases} 1, & x > 1 \\ x, & 0 \le x \le 1 \\ 0, & x < 0 \end{cases}

It is a linear function whose upper and lower limits are fixed.

Tanh - Hyperbolic Tangent Function

The Tanh function is a scaled version of the sigmoid function which is zero-centered, with output values ranging between -1 and 1. It also suffers from the vanishing gradient problem.

\tanh(x) = \frac{2}{1 + e^{-2x}} - 1

ReLU - Rectified Linear Unit Function

ReLU is a typical function generally used in deep learning; it reduces the vanishing gradient problem. It avoids negative values and works like a linear function for positive values:

f(x) = \max(0, x) = \begin{cases} x, & x \ge 0 \\ 0, & x < 0 \end{cases}

Softmax Function

Softmax is an activation function used in the output layer that can handle multiple classes. It outputs the probability of the target class, which ranges between 0 and 1. The probability of an output value belonging to a particular class is computed by dividing the exponential of that output value by the sum of the exponential values of all the output values:

\text{softmax}(x_i) = \frac{e^{x_i}}{\sum_{j=0}^{k} e^{x_j}}, \quad i = 0, \dots, k
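As a companion to these definitions, the following sketch (illustrative only; the use of NumPy and the max-subtraction trick in softmax for numerical stability are additions, not from the text) expresses the functions above in Python.

```python
import numpy as np

def identity(x):
    return x                                      # f(x) = x for all x

def ramp(x):
    return np.clip(x, 0.0, 1.0)                   # linear with fixed limits 0 and 1

def tanh(x):
    return 2.0 / (1.0 + np.exp(-2.0 * x)) - 1.0   # scaled sigmoid, range (-1, 1)

def relu(x):
    return np.maximum(0.0, x)                     # max(0, x)

def softmax(x):
    e = np.exp(x - np.max(x))                     # subtract max for numerical stability
    return e / e.sum()                            # probabilities that sum to 1

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))
print(softmax(x))                                 # output sums to 1.0
```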
10.4 PERCEPTRON AND LEARNING THEORY

The first neural network model, called the 'Perceptron', was designed by Frank Rosenblatt in 1958. It is a binary classifier used for supervised learning. Rosenblatt combined two concepts: the McCulloch & Pitts model of an artificial neuron and the Hebbian learning rule of adjusting weights. He introduced variable weights and an extra input that represents bias to this model. He proposed that artificial neurons could actually learn weights and thresholds from data, and came up with a supervised learning algorithm that enables an artificial neuron to learn the correct weights from training data by itself. The perceptron model, shown in Figure 10.5, consists of 4 parts:

1. Inputs from other neurons
2. Weights and bias
3. Net sum
4. Activation function

Figure 10.5: Perceptron Model

Thus, the modified neuron model receives a set of inputs x_1, x_2, ..., x_n, their weights w_1, w_2, ..., w_n, and a bias b. The summation function 'Net-sum' (Eq. (10.13)) computes the weighted sum of the inputs received by the neuron:

\text{Net-sum} = \sum_{i=1}^{n} x_i w_i        (10.13)

After computing the 'Net-sum', the bias value is added to it:

f(x) = \text{Activation function}(\text{Net-sum} + \text{bias})

The activation function is a binary step function which outputs a value 1 if f(x) is above the threshold value θ, and a 0 if f(x) is below the threshold value θ. The perceptron learns in the following steps:

1. Compute the weighted sum by multiplying the inputs with the weights and add the bias to it.
2. Apply the activation function on the weighted sum:

   Y = \text{Step}\left(\sum_{i=1}^{n} x_i w_i + b - \theta\right)

3. If the weighted sum is above the threshold value, output the value as positive, else output the value as negative.
4. Calculate the error by subtracting the estimated output Y_estimated from the desired output Y_desired:

   e(t) = Y_{\text{desired}} - Y_{\text{estimated}}

   [If the error e(t) is positive, increase the perceptron output Y, and if it is negative, decrease the perceptron output Y.]
5. Update the weights if there is an error:

   \Delta w_i = \alpha \cdot e(t) \cdot x_i(t), \qquad w_i(t+1) = w_i(t) + \Delta w_i

   where α is the learning rate.
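These steps translate almost line for line into code. The sketch below is an illustration rather than the book's implementation: the AND training data, the learning rate α = 0.1, the epoch count and the folding of the threshold θ into the bias term are all assumed choices.

```python
import numpy as np

def train_perceptron(X, y, alpha=0.1, epochs=10):
    """Perceptron learning, following steps 1-5 above.
    The threshold theta is folded into the bias term, so the
    step function fires when the net input reaches 0."""
    w = np.zeros(X.shape[1])                  # weights, initially zero
    b = 0.0                                   # bias
    for _ in range(epochs):
        for x_t, y_desired in zip(X, y):
            net = np.dot(x_t, w) + b          # step 1: weighted sum plus bias
            y_est = 1 if net >= 0 else 0      # steps 2-3: binary step activation
            e = y_desired - y_est             # step 4: e(t) = Y_desired - Y_estimated
            w += alpha * e * x_t              # step 5: delta w_i = alpha * e(t) * x_i(t)
            b += alpha * e                    # bias updated with the same rule
    return w, b

# Learn the AND function from its truth table.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print("weights:", w, "bias:", b)
for x_t in X:
    print(x_t, "->", 1 if np.dot(x_t, w) + b >= 0 else 0)
```

Because the AND data is linearly separable, the weights settle within a few epochs, after which the error e(t) is zero for every training example and the weights stop changing.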
