Unit 2: Deep Learning - Convolutional Neural Networks (Applications)

Topics:
* Basic architecture of CNN
* Variants of the basic convolution function
* Pooling layers
* Use of padding and strides
* Types of pooling
* Parameter sharing

Basic architecture of CNN (Convolutional Neural Network):

1) CNN is an extended version of the artificial neural network (ANN).
2) It is used to extract features from grid-like matrix datasets.
3) For example, visual datasets like images, where data patterns play an extensive role.
4) Convolutional neural networks consist of multiple layers: the input layer, convolutional layers, pooling layers, and fully connected layers.
5) Diagram: Input image -> Convolutional layer -> Pooling layer -> Flattening -> Dense (fully connected) layer -> Output.

6) Input layer
a) This is the layer in which we give the input to our model.
b) The input will be an image or a sequence of images.

7) Convolutional layer (Conv)
a) This is the layer used to extract features from the input datasets.
b) It applies a set of learnable filters, known as 'kernels', to the input images.
c) The kernels are small matrices, usually of shape 2x2, 3x3 or 5x5.
d) Each kernel slides over the input image data and computes the dot product between the kernel weights and the corresponding patch of the input image.
e) The output of this layer is referred to as a 'feature map'.

8) Pooling layer
a) This layer is in charge of reducing dimensionality.
b) It aids in reducing the amount of computing power required to process the data.
c) Pooling can be divided into 2 types: max pooling and average pooling.
d) Max pooling: the maximum value from the area of the image covered by the kernel is taken.
e) Average pooling: the average of all values in the part of the image covered by the kernel is taken.

9) Fully connected layers
a) The fully connected layer works with a flattened input, which means each input is connected with every neuron.
b) The flattened vector is then sent to additional fully connected layers, where the mathematical operations are performed.
c) The classification procedure starts at this point.

10) Activation function
a) Adding an activation function to the output of the preceding layer introduces non-linearity into the network.
b) Some common activation functions are ReLU: f(x) = max(0, x), Tanh, and Leaky ReLU: f(x) = max(0, x) + negative_slope * min(0, x).

CNN architecture: AlexNet

1) AlexNet was the first convolutional network that used the GPU to boost performance.
2) The architecture consists of 5 convolutional layers, 3 max-pooling layers, 2 normalization layers, 2 fully connected layers and 1 softmax layer.
3) Each convolutional layer consists of convolutional filters and a non-linear activation function (ReLU).
4) The pooling layers are used to perform max pooling.
5) The input size is fixed due to the presence of the fully connected layers.

Function of: a) Convolutional layer, b) Pooling layer, c) Fully connected layer, d) Activation function (see points 7-10 above; a small code sketch of these layers follows).
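The four components just listed can be wired together in a few lines of code. This is a minimal illustrative sketch, not part of the original notes; the single-channel 28x28 input, the 8 output channels and the 10-class output are assumptions chosen only to make the example runnable.

```python
import torch
import torch.nn as nn

class SimpleCNN(nn.Module):
    """Input -> convolution -> activation -> pooling -> flattening -> fully connected output."""
    def __init__(self, num_classes=10):
        super().__init__()
        # Convolutional layer: 8 learnable 3x3 kernels produce 8 feature maps
        self.conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3, padding=1)
        self.relu = nn.ReLU()                    # activation: adds non-linearity
        self.pool = nn.MaxPool2d(kernel_size=2)  # max pooling: halves each spatial dimension
        # Fully connected layer: works on the flattened feature maps
        self.fc = nn.Linear(8 * 14 * 14, num_classes)

    def forward(self, x):
        x = self.pool(self.relu(self.conv(x)))   # conv -> ReLU -> max pool
        x = torch.flatten(x, start_dim=1)        # flattening
        return self.fc(x)                        # dense / output layer

# Usage: a batch of four 28x28 grayscale images
model = SimpleCNN()
scores = model(torch.randn(4, 1, 28, 28))
print(scores.shape)  # torch.Size([4, 10]) - one score per class
```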
Explain the variants of the basic convolution function:

1) Valid
a) The output is computed only at places where the entire kernel lies inside the input.
b) No zero padding is performed.
c) For a kernel of size k in any dimension and an input of width m in that dimension, the output shrinks to m - k + 1 in that dimension (these output sizes are checked in the sketch after the parameter-sharing notes below).

2) Same
a) The input is zero-padded with just enough zeros (k - 1 in total for a kernel of width k) to keep the output the same size as the input in that dimension.
b) Essentially, since the size stays the same, the input does not shrink at every layer, so shrinkage does not restrict the network architecture.
c) However, the number of output units connected to the border pixels is smaller than for the center pixels.

3) Full
a) Enough zeros are added for every pixel to be visited k times in each direction, resulting in an output image of width m + k - 1.
b) Each input pixel is connected to the same number of output units.
c) Usually, in terms of test-set accuracy, the optimal amount of zero padding lies somewhere between 'same' and 'valid'.

4) Tiled
a) It offers a compromise between a convolutional layer and a locally connected layer.
b) Rather than learning a separate set of weights at every spatial location, we learn a set of kernels that we rotate through as we move through space.

Parameter sharing:

1) It refers to using the same parameter for more than one function in a model.
2) It reduces the number of parameters needed to train the network, making it more efficient.
3) Sharing of parameters allows CNNs to effectively capture local patterns and features in images.
4) It is suitable for tasks like image classification, object detection, etc.
5) Diagram: a single kernel is reused at every spatial position of the input.
6) This helps at run time, because you use fewer resources to train the model.
7) To implement parameter sharing in CNNs, you first denote a single 2-dimensional slice of depth as a "depth slice".
8) You then constrain the neurons in each depth slice to use the same weights and bias.
9) Drawback: you will not be able to take advantage of parameter sharing if the images you are training on have a specific centered structure, where different features should be learned at different locations.
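Two quick numerical checks, offered as illustrative sketches rather than material from the notes: NumPy's 1-D convolution exposes exactly the 'valid', 'same' and 'full' variants described above, and a simple count shows how much parameter sharing saves compared with a fully connected mapping. The sizes m = 10, k = 3 and the 32x32 image are arbitrary assumptions.

```python
import numpy as np

# Output sizes of the three convolution variants
m, k = 10, 3                      # input width m, kernel width k
x = np.arange(m, dtype=float)     # arbitrary input signal
w = np.ones(k)                    # arbitrary kernel

valid = np.convolve(x, w, mode='valid')  # kernel stays entirely inside the input
same  = np.convolve(x, w, mode='same')   # zero-padded so output width == input width
full  = np.convolve(x, w, mode='full')   # every input element visited k times

print(len(valid), len(same), len(full))  # 8 10 12  ->  m-k+1, m, m+k-1

# Saving from parameter sharing
H = W = 32                               # assumed image size
fc_params   = (H * W) * (H * W)          # dense layer mapping the image to an equally
                                         # sized output: one weight per (input, output) pair
conv_params = k * k                      # one shared k x k kernel (bias ignored)
print(fc_params, conv_params)            # 1048576 vs 9
```

The second print makes points 1) and 2) of the parameter-sharing notes concrete: because the same kernel is reused at every position, the number of weights does not grow with the image size.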
