14. VGG
Deep Learning
300 (3)
Dr. Lekshmi R. R.
Asst. Prof.
Department of Electrical & Electronics Engineering
Amrita School of Engineering
VGG 16
Neural Network
• VGG – by Simonyan and Zisserman
• Includes 16 weight layers (13 convolutional + 3 fully connected)
• More complex than LeNet
• Preferred because of its uniform architecture
• Involves a huge number of parameters
  – 138 million parameters
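The 138-million figure can be checked with a short calculation; a minimal sketch in plain Python, assuming the standard VGG 16 configuration (13 convolutional layers with 3×3 filters, 3 fully connected layers, 224×224×3 input, 1000 output classes):

```python
# Parameter count for VGG 16 (weights + biases per layer).
# Conv layer: 3*3*in_ch*out_ch + out_ch; FC layer: in_dim*out_dim + out_dim.

conv_channels = [(3, 64), (64, 64),                    # block 1
                 (64, 128), (128, 128),                # block 2
                 (128, 256), (256, 256), (256, 256),   # block 3
                 (256, 512), (512, 512), (512, 512),   # block 4
                 (512, 512), (512, 512), (512, 512)]   # block 5

conv = sum(3 * 3 * i * o + o for i, o in conv_channels)

# After 5 poolings the 224x224 map is 7x7 with 512 channels.
fc_dims = [(7 * 7 * 512, 4096), (4096, 4096), (4096, 1000)]
fc = sum(i * o + o for i, o in fc_dims)

total = conv + fc
print(total)    # 138,357,544 -> about 138 million
```

Note that almost 90% of the parameters sit in the fully connected layers, not the convolutions.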
VGG 16 layer stack (each convolution is 3×3; each pool halves the spatial size):
• Input: 224×224×3
• Convolution 1-1, 1-2: 3×3 Conv, 64 → Pool /2
• Convolution 2-1, 2-2: 3×3 Conv, 128 → Pool /2
• Convolution 3-1, 3-2, 3-3: 3×3 Conv, 256 → Pool /2
• Convolution 4-1, 4-2, 4-3: 3×3 Conv, 512 → Pool /2
• Convolution 5-1, 5-2, 5-3: 3×3 Conv, 512 → Pool /2
• Fully connected: 4096 → 4096 → 1000 (softmax)
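Because the architecture is so uniform, the whole stack can be written as a short configuration list and expanded programmatically; a minimal sketch in plain Python (the layer names such as conv1_1 follow the slide's numbering, and "M" marks a 2×2 max pool):

```python
# VGG 16 configuration: channel counts per conv layer, "M" = 2x2 max pool.
VGG16_CFG = [64, 64, "M", 128, 128, "M", 256, 256, 256, "M",
             512, 512, 512, "M", 512, 512, 512, "M"]

def expand(cfg, size=224):
    """Turn the config list into (name, output_size, channels) tuples."""
    layers, block, idx = [], 1, 1
    for item in cfg:
        if item == "M":
            size //= 2                      # 2x2 pool, stride 2 halves the map
            layers.append((f"pool{block}", size, layers[-1][2]))
            block, idx = block + 1, 1
        else:
            # 3x3 conv, padding 1, stride 1 keeps the spatial size
            layers.append((f"conv{block}_{idx}", size, item))
            idx += 1
    return layers

layers = expand(VGG16_CFG)
print(layers[0])     # ('conv1_1', 224, 64)
print(layers[-1])    # ('pool5', 7, 512)
```

The final 7×7×512 feature map is what feeds the first 4096-neuron fully connected layer.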
• Input: 224×224×3
• Filters: 64
  – Size: 3×3
• Need output: 224×224 (so the input is padded to 226×226)
• So, Padding: 1
• Stride: 1
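The padding choice follows from the usual output-size formula, out = (in − filter + 2·padding)/stride + 1; a quick check in plain Python:

```python
def conv_out(size, kernel, padding, stride):
    """Spatial output size of a convolution layer."""
    return (size - kernel + 2 * padding) // stride + 1

# A 3x3 filter with padding 1 and stride 1 preserves the 224x224 input.
print(conv_out(224, kernel=3, padding=1, stride=1))   # 224

# Without padding the map would shrink by 2 pixels per layer.
print(conv_out(224, kernel=3, padding=0, stride=1))   # 222
```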
Pooling (after Layers 1, 2)
• Input: 224×224×64
• Size: 2×2
• Stride: 2
• Output: 112×112×64
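The same output-size reasoning applies to pooling, with no padding: out = (in − window)/stride + 1. A quick check in plain Python, also chaining all five pools to show how 224 shrinks to 7:

```python
def pool_out(size, window, stride):
    """Spatial output size of a pooling layer (no padding)."""
    return (size - window) // stride + 1

# A 2x2 window with stride 2 halves each spatial dimension.
print(pool_out(224, window=2, stride=2))   # 112

# Five such pools: 224 -> 112 -> 56 -> 28 -> 14 -> 7
size = 224
for _ in range(5):
    size = pool_out(size, window=2, stride=2)
print(size)                                # 7
```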
Fully connected layers
• Input: 4096
• Neurons: 4096
16 layers require learning:
• Convolutional: 2 + 2 + 3 + 3 + 3 = 13 layers
• Fully connected: 1 + 1 + 1 = 3 layers
Thank you