Principal Component Analysis in Human Face Recognition
Advisor: Assoc. Prof. Dr. Nguyễn Hữu Phương
Student: Võ Tấn Tài
Vietnam National University Ho Chi Minh City, University of Science, Faculty of Electronics and Telecommunications
Master's Program Cohort 23, Electronics, Telecommunications and Computer Engineering
Ho Chi Minh City, June 2014
(Course: Statistical Analysis Methods, Master's Cohort 23 | University of Science)

Slide 2: Agenda
1. EIGENFACES overview
2. PCA
3. Results
4. Recent Advances in Face Recognition
5. References

Slides 3–6: Recognition pipeline (each slide highlights one stage)
Image → Face Detection → Face Feature Extraction → Feature Matching → Decision Maker → Output
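The pipeline on slides 3–6 can be sketched as a chain of stage functions. This is only an illustrative skeleton: the function names, the center-crop "detector", and the nearest-neighbour matcher are placeholder assumptions, not methods from the slides.

```python
import numpy as np

def detect_face(image: np.ndarray) -> np.ndarray:
    """Placeholder detector: crop the central region as the 'face'."""
    h, w = image.shape
    return image[h // 4: 3 * h // 4, w // 4: 3 * w // 4]

def extract_features(face: np.ndarray) -> np.ndarray:
    """Placeholder extractor: flatten the face crop into a vector."""
    return face.flatten().astype(float)

def match_features(features: np.ndarray, database: dict) -> tuple:
    """Nearest-neighbour matching against the enrolled identities."""
    return min(((name, np.linalg.norm(features - f)) for name, f in database.items()),
               key=lambda t: t[1])

def decide(name: str, dist: float, threshold: float) -> str:
    """Decision maker: accept the match only if it is close enough."""
    return name if dist < threshold else "unknown"

# Toy run: enroll two identities, then recognize a noisy copy of one.
rng = np.random.default_rng(0)
gallery = {n: rng.random((100, 100)) for n in ["Ana", "LiLy"]}
database = {n: extract_features(detect_face(img)) for n, img in gallery.items()}
probe = gallery["LiLy"] + rng.normal(0, 0.05, (100, 100))
name, dist = match_features(extract_features(detect_face(probe)), database)
print(decide(name, dist, threshold=10.0))
```

The threshold turns a nearest-neighbour match into an accept/reject decision, which is the role of the Decision Maker stage.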
Slide 7: Feature Matching: the extracted features are compared against a database of enrolled identities (Ana, Bob, John, Puppy, LiLy).
Slide 8: Recognition pipeline: Image → Face Detection → Face Feature Extraction → Feature Matching → Decision Maker → Output.
Slide 9: The Decision Maker outputs the matched identity, here "LiLy".
Slides 10–11: Problems
Each image is an n × m matrix of pixels. It is converted into an nm-element vector by stacking its columns. Even a small 100 × 100 image becomes a 10,000-element vector, i.e. a point in a 10,000-dimensional space!
Slide 12: PCA
PCA is performed in five stages:
Stage 1: Subtract the mean of the data from each variable.
Stage 2: Calculate the covariance matrix of the mean-subtracted data.
Stage 3: Calculate the eigenvectors and eigenvalues of the covariance matrix.
Stage 4: Choose a feature vector (a fancy name for a matrix of eigenvectors).
Stage 5: Multiply the transposed feature vectors by the transposed mean-adjusted data.
Slide 13: STAGE 1: Mean Subtraction
Compute the average vector M and subtract M from each vector, giving a zero-centered distribution.
Slide 14: STAGE 2: Covariance Matrix
Compute the covariance matrix C
= (1/N) Σ_{n=1..N} Φn Φn^T = (1/N) A A^T, where Φn = xn − M are the N mean-subtracted image vectors and A = [Φ1 Φ2 … ΦN].
Slide 15: STAGE 3: Eigenvectors and Eigenvalues
Slide 16: STAGE 4: Feature Vectors
Slide 17: STAGE 5: Transposition
The final stage in PCA is to take the transpose of the feature vector matrix and multiply it on the left of the transposed adjusted data set (the adjusted data set is from Stage 1, where the mean was subtracted from the data).
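The five stages can be sketched with NumPy. One detail worth noting: for nm-pixel images with nm much larger than the number of images N, eigenfaces implementations eigendecompose the small N × N matrix Φ Φ^T instead of the huge nm × nm covariance, then map the eigenvectors back to image space. All array names and sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Training set: N face images of size n x m, each stacked into an nm-vector.
N, n, m = 8, 20, 20
faces = rng.random((N, n * m))        # one image vector per row

# Stage 1: subtract the mean vector M from every image.
M = faces.mean(axis=0)
Phi = faces - M                       # zero-centered data, rows Phi_n

# Stage 2: covariance C = (1/N) A A^T with A = Phi^T (nm x N).
# Since nm >> N, work with the small N x N surrogate Phi Phi^T instead.
L = Phi @ Phi.T / N

# Stage 3: eigenvalues and eigenvectors (eigh handles symmetric matrices).
eigvals, eigvecs = np.linalg.eigh(L)
order = np.argsort(eigvals)[::-1]     # sort descending by eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Stage 4: feature vector: map back to image space (u_k = A v_k),
# keep the top-k eigenfaces, and normalize each column.
k = 4
W = Phi.T @ eigvecs[:, :k]            # nm x k feature-vector matrix
W /= np.linalg.norm(W, axis=0)

# Stage 5: project the adjusted data: weights = W^T * Phi^T.
weights = W.T @ Phi.T                 # k x N, one weight column per image

print(weights.shape)                  # (4, 8)
```

Each column of `weights` is the low-dimensional representation of one training face; recognizing a new face means projecting it the same way and comparing weight vectors.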
The EigenObjectRecognizer class performs all of this and then feeds the transposed data as a training set into a neural network. When it is passed an image to recognize, it performs PCA and compares the generated eigenvalues and eigenvectors to those from the training set; the neural network then produces a match if one has been found, or a negative match if none is found. There is a little more to it than this, but the use of neural networks is a complex subject and is not the focus of this presentation.
Slides 18–23: Results (the result figures are not reproduced in this transcript)
Slide 24: Recent Advances in Face Recognition
A Google Scholar search for papers with the words "face recognition" yielded 1,100 papers in 2000 and 9,190 papers in 2012. Link: Recent Advances in Face Recognition
1. Image Compression in Face Recognition - a Literature Survey.
2. New Parallel Models for Face Recognition.
3. Robust Face Recognition System Based on a Multi-Views Face Database.
4. Face Recognition by Discriminative Orthogonal Rank-one Tensor Decomposition.
5. Intelligent Local Face Recognition.
6. Generating Optimal Face Image in Face Recognition System.
7. Multi-resolution Methods in Face Recognition.
8. Illumination Normalization using Quotient Image-based Techniques.
9. Liveness Detection for Face Recognition.
10. 2D-3D Mixed Face Recognition Schemes.
11. Recognizing Face Images with Disguise Variations.
12. Discriminant Subspace Analysis for Uncertain Situation.
13. Blood Perfusion Models for Infrared Face Recognition.
14. Discriminating Color Faces For Recognition.
15. A Novel Approach to Using Color Information in Improving Face Recognition Systems Based on Multi-Layer Neural Networks.
Slide 25: References
[1] An Efficient Method for Face Recognition Using Principal Component Analysis (PCA), Dr. Tamilnadu, B., Dr. V. Cyril Raj.
[2] Performance Comparison of Principal Component Analysis-Based Face Recognition in Color Space, Seunghwan Yoo, Dong-Gyu Sim, Young-Gon Kim and Rae-Hong Park.
[3] Face Recognition: Where We Are and Where To Go From Here, Michael Jones.
[4] Recent Advances in Face Recognition, Kresimir Delac, Mislav Grgic and Marian Stewart Bartlett.
[5] Face Recognition using Principle Component Analysis, Kyungnam Kim.
[6] Recognizing Faces with PCA and ICA, Bruce A. Draper, Kyungim Baek, Marian Stewart Bartlett and J. Ross Beveridge.