
BIRLA INSTITUTE OF TECHNOLOGY & SCIENCE, PILANI

WORK INTEGRATED LEARNING PROGRAMMES


COURSE HANDOUT

Part A: Content Design

Course Title Deep Learning


Course No(s)
Credit Units #
Course Author Joy Mustafi and Lakshya Kumar
Version No Draft
Date 1 May, 2019

Course Objectives

No. Objective

CO1 Learn the basics of deep learning and its application to various AI tasks.

CO2 Gain significant familiarity with deep learning and apply deep learning to a variety of tasks.

CO3 Understand much of the current literature on deep learning and extend their knowledge with further study.

Text Book(s)

T1 Ian Goodfellow, Yoshua Bengio, Aaron Courville; Deep Learning, MIT Press
T2 Aston Zhang, Zachary C. Lipton, Mu Li, Alexander J. Smola; Dive into Deep Learning

Reference Book(s) & Other Resources

R1 [CONFERENCE] NIPS, IJCNN, ICML, ECML, ACML, ICONIP, etc.

Content Structure

1. Introduction
1.1. Historical Background
1.2. Examples of Deep Learning Problems
1.3. Basic Building Block of Learning Machines
1.4. Challenges in Modeling Learning Problems
1.5. Neuron Architecture

2. Deep Feedforward Network
2.1. Perceptron
2.2. Loss functions
2.3. Forward and Back propagation
2.4. Gradient-Based Learning
2.5. Architecture Design
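
The building blocks listed in this module (perceptron, loss function, forward and back propagation, gradient-based learning) can be combined in a short illustrative sketch. This NumPy example is not part of the prescribed course material; the learning rate, iteration count, and the AND task are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch (topics 2.1-2.4): a single-unit perceptron with a
# sigmoid output, trained by gradient descent on the cross-entropy loss
# to learn the linearly separable AND function.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 0.0, 0.0, 1.0])  # AND

w = rng.normal(scale=0.1, size=2)
b = 0.0
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    p = sigmoid(X @ w + b)       # forward pass
    grad_z = (p - y) / len(y)    # dL/dz for sigmoid + cross-entropy
    w -= lr * (X.T @ grad_z)     # backward pass: gradient step on w
    b -= lr * grad_z.sum()       # ... and on the bias

preds = (sigmoid(X @ w + b) > 0.5).astype(int)
print(preds.tolist())
```

Because cross-entropy with a sigmoid output yields the simple gradient p - y at the pre-activation, the backward pass reduces to two matrix-vector products.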
3. Regularization Techniques
3.1. Constrained and Unconstrained Problems
3.2. Early Stopping
3.3. Parameter Tying and Sharing
3.4. Bagging and related Techniques
3.5. Dropout
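
Of the regularizers above, dropout (topic 3.5) is compact enough to sketch directly. This is an illustrative sketch of the "inverted dropout" formulation, not prescribed course code; the drop probability and vector size are arbitrary choices.

```python
import numpy as np

# Illustrative sketch of inverted dropout: during training, each unit
# is zeroed with probability p and survivors are rescaled by 1/(1-p)
# so the expected activation is unchanged; at test time the layer is
# the identity, requiring no rescaling.
def dropout(x, p, training, rng):
    if not training or p == 0.0:
        return x
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

rng = np.random.default_rng(0)
h = np.ones(100_000)
h_train = dropout(h, p=0.5, training=True, rng=rng)
h_test = dropout(h, p=0.5, training=False, rng=rng)
print(h_train.mean(), h_test.mean())  # both close to 1.0
```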

4. Optimization for Deep Learning
4.1. Challenges
4.2. Basic Algorithms
4.3. Parameter Initialization
4.4. Adaptive Learning rates and second-order methods
4.5. Optimization Strategies
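
A minimal illustration of the basic algorithms in this module: plain gradient descent versus gradient descent with momentum on an ill-conditioned quadratic. This sketch is not prescribed course material; the condition number, learning rate, and momentum coefficient are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch (topic 4.2): minimizing f(x) = 0.5 * x.T @ A @ x,
# whose minimum is the origin, with and without momentum. The poor
# conditioning of A is one of the challenges momentum helps with.
A = np.diag([1.0, 100.0])   # condition number 100

def grad(x):
    return A @ x

def gd(x, lr=0.009, steps=300):
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def momentum(x, lr=0.009, beta=0.9, steps=300):
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v - lr * grad(x)   # velocity accumulates gradients
        x = x + v
    return x

x0 = np.array([1.0, 1.0])
err_gd = np.linalg.norm(gd(x0))
err_mom = np.linalg.norm(momentum(x0))
print(err_gd, err_mom)   # momentum converges far faster here
```

The learning rate is limited by the largest curvature (100), which stalls plain gradient descent along the flat direction; momentum accumulates velocity there and converges orders of magnitude closer to the minimum in the same number of steps.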

5. Convolutional Neural Networks (CNN)
5.1. Learning and Visualizing CNN
5.2. Operation: Convolution, Pooling
5.3. Convolutional Backpropagation
5.4. Variants of CNN
5.4.1. ResNet
5.4.2. AlexNet
5.4.3. ImageNet
5.4.4. VGG
5.4.5. Inception
5.5. Motivations: Neuroscientific, Efficiency, Equivariance, Invariance, Parameter tying
5.6. Capsule Networks
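
The two core operations of topic 5.2 can be sketched directly with loops. This is an illustrative sketch, not prescribed course code; the image, kernel, and shapes are arbitrary choices, and "convolution" here is the cross-correlation used by deep learning frameworks.

```python
import numpy as np

# Illustrative sketch: valid 2-D cross-correlation and 2x2 max pooling.
def conv2d(x, k):
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def max_pool2x2(x):
    H, W = x.shape
    # Crop to even dimensions, then take the max over 2x2 blocks.
    return x[:H - H % 2, :W - W % 2].reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

img = np.zeros((6, 6))
img[:, 3:] = 1.0                   # vertical edge at column 3
kernel = np.array([[-1.0, 1.0]])   # horizontal difference filter
edges = conv2d(img, kernel)        # responds only at the edge
pooled = max_pool2x2(edges)
print(edges.shape, pooled.shape)
```

The difference filter fires only where the intensity changes, and pooling keeps the strongest response in each block, which is one source of the (approximate) translation invariance discussed under topic 5.5.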

6. Sequence Learning
6.1. Recurrent Neural Networks (RNN)
6.2. Design Patterns for RNNs
6.3. Long short-term memory (LSTM) and gated architectures (GRU)
6.4. Attention Models: Applications in Machine Translation and Caption Generation
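
A single LSTM step (topic 6.3) is short enough to sketch in NumPy. This is an illustrative sketch, not prescribed course code; the gate ordering, weight shapes, and initialization are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch: one LSTM step. Forget (f), input (i) and output
# (o) gates control how the cell state c is updated and how much of it
# is exposed as the hidden state h.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    # W: (4H, X), U: (4H, H), b: (4H,); gates stacked as [f, i, o, g]
    n = h.shape[0]
    z = W @ x + U @ h + b
    f = sigmoid(z[0:n])            # forget gate
    i = sigmoid(z[n:2 * n])        # input gate
    o = sigmoid(z[2 * n:3 * n])    # output gate
    g = np.tanh(z[3 * n:4 * n])    # candidate cell update
    c_new = f * c + i * g          # additive state update
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
x_dim, h_dim = 3, 4
W = rng.normal(scale=0.1, size=(4 * h_dim, x_dim))
U = rng.normal(scale=0.1, size=(4 * h_dim, h_dim))
b = np.zeros(4 * h_dim)

h = np.zeros(h_dim)
c = np.zeros(h_dim)
for _ in range(5):                 # unroll over a short input sequence
    x = rng.normal(size=x_dim)
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

The additive update of c (rather than repeated matrix multiplication, as in a vanilla RNN) is what lets gradients flow over long time spans.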

7. Autoencoders
7.1. Linear Projections and Linear Autoencoders
7.2. PCA, ICA
7.3. Over- and under-complete autoencoders
7.4. De-noising autoencoders
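
Topics 7.1-7.2 are connected by a classical result: the optimal linear autoencoder with a k-dimensional bottleneck spans the same subspace as the top-k principal components, so PCA via the SVD gives its encoder/decoder in closed form. This is an illustrative sketch, not prescribed course code; the data dimensions and noise level are arbitrary choices.

```python
import numpy as np

# Illustrative sketch: PCA as a closed-form linear autoencoder.
rng = np.random.default_rng(0)

# Data lying (up to small noise) in a 2-D subspace of R^5.
Z = rng.normal(size=(200, 2))
B = rng.normal(size=(2, 5))
X = Z @ B + 0.01 * rng.normal(size=(200, 5))
X = X - X.mean(axis=0)            # center the data, as PCA assumes

U, s, Vt = np.linalg.svd(X, full_matrices=False)
V2 = Vt[:2]                       # (2, 5): encoder weights
codes = X @ V2.T                  # encode: R^5 -> R^2 bottleneck
X_rec = codes @ V2                # decode: R^2 -> R^5
rel_err = np.linalg.norm(X - X_rec) / np.linalg.norm(X)
print(rel_err)                    # small: rank-2 reconstruction is nearly lossless
```

An under-complete bottleneck (k smaller than the input dimension, as here) forces compression; over-complete and de-noising autoencoders instead rely on regularization or input corruption to learn useful codes.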

8. Generative Models
8.1. Boltzmann Machine and variants
8.2. Deep Belief Network
8.3. Generative Adversarial Networks (GAN)

9. Overview of Deep Learning Framework
9.1. Different kinds of Deep Learning Frameworks
9.2. Trends related to DL Frameworks
9.3. Basics of TensorFlow
9.4. TensorFlow Tutorial

10. Applications of Deep Learning
10.1. Computer Vision and Graphics
10.1.1. Object Detection/Recognition
10.1.2. Scene Classification
10.2. Speech Processing
10.2.1. Speaker recognition
10.2.2. Speech recognition
10.3. Natural Language Processing
10.3.1. Machine Translation
10.3.2. Sentiment Analysis
10.3.3. Question-Answering
10.3.4. Summarization

Learning Outcomes:

No Learning Outcomes

LO1 Study and analysis of Deep Learning algorithms

LO2 Study of mathematical techniques and software tools for Deep Learning

LO3 Study and analysis of feedforward networks, CNN, RNN, autoencoders, deep generative models, adversarial machine learning, etc.

LO4 Study and analysis of some applications of Deep Learning

Part B: Contact Session Plan

Academic Term
Course Title Deep Learning
Course No
Lead Instructor Joy Mustafi and Lakshya Kumar

Course Contents

Contact Sessions (#)   List of Topic Title (from content structure in Course Handout)   Text/Ref Book / external resource

1 Introduction T1 - Ch1
Historical Background, Examples of Deep Learning Problems, Basic Building Block of Learning Machines, Challenges in Modeling Learning Problems, Neuron Architecture

2 Deep Feedforward Network T1 - Ch6
Perceptron, Loss functions, Forward and Back propagation, Gradient-Based Learning, Architecture Design

3 Regularization Techniques T1 - Ch7
Constrained and Unconstrained Problems, Early Stopping, Parameter Tying and Sharing, Bagging and related Techniques, Dropout

4 Optimization for Deep Learning T1 - Ch8
Challenges, Basic Algorithms, Parameter Initialization, Adaptive Learning rates and second-order methods, Optimization Strategies

5 Convolutional Neural Networks (CNN) T1 - Ch9
Learning and Visualizing CNN, Operation: Convolution, Pooling, Convolutional Backpropagation, Variants of CNN, ResNet, AlexNet, ImageNet, VGG, Inception, Motivations: Neuroscientific, Efficiency, Equivariance, Invariance, Parameter tying, Capsule Networks

6 Sequence Learning T1 - Ch10
Recurrent Neural Networks (RNN), Design Patterns for RNNs, Long short-term memory (LSTM) and gated architectures (GRU), Attention Models: Applications in Machine Translation and Caption Generation

7 Autoencoders T1 - Ch14
Linear Projections and Linear Autoencoders, PCA, ICA, Over-
and under-complete autoencoders, De-noising autoencoders

8 Generative Models T1 - Ch20
Boltzmann Machine and variants, Deep Belief Network, Generative Adversarial Networks (GAN)

9 Overview of Deep Learning Framework Web References
Different kinds of Deep Learning Frameworks, Trends related to DL Frameworks, Basics of TensorFlow, TensorFlow Tutorial

10 Applications of Deep Learning T1 - Ch12
Computer Vision and Graphics, Object Detection/Recognition, Scene Classification, Speech Processing, Speaker recognition, Speech recognition, Natural Language Processing, Machine Translation, Sentiment Analysis, Question-Answering, Summarization

Evaluation Scheme: To be Discussed

Legend: EC = Evaluation Component; AN = Afternoon Session; FN = Forenoon Session

No    Name                Type         Duration  Weight  Day, Date, Session, Time

EC-1  Quiz-I              Online       -         5%      To be announced
      Quiz-II             Online       -         5%      To be announced
      Assignment-I        Online       -         7%      To be announced
      Assignment-II       Online       -         8%      To be announced
EC-2  Mid-Semester Test   Closed Book  2 hours   30%     To be announced
EC-3  Comprehensive Exam  Open Book    3 hours   45%     To be announced

Note - Evaluation components can be tailored depending on the proposed model.

Important Information: To be Discussed

Syllabus for Mid-Semester Test (Closed Book): Topics in Weeks 1-7

Syllabus for Comprehensive Exam (Open Book): All topics given in plan of study

Evaluation Guidelines:
1. EC-1 consists of two Assignments and two Quizzes. Announcements regarding the same will be made
in a timely manner.
2. For Closed Book tests: No books or reference material of any kind will be permitted. Laptops/Mobiles
of any kind are not allowed. Exchange of any material is not allowed.
3. For Open Book exams: Use of prescribed and reference textbooks, in original (not photocopies), is
permitted. Class notes/slides used as reference material in filed or bound form are permitted; however,
loose sheets of paper will not be allowed. Use of calculators is permitted in all exams. Laptops/Mobiles
of any kind are not allowed. Exchange of any material is not allowed.
4. If a student is unable to appear for the Regular Test/Exam due to genuine exigencies, the student should
follow the procedure to apply for the Make-Up Test/Exam. The genuineness of the reason for absence
in the Regular Exam shall be assessed prior to giving permission to appear for the Make-up Exam.
Make-Up Test/Exam will be conducted only at selected exam centres on the dates to be announced
later.

It shall be the responsibility of the individual student to be regular in maintaining the self-study schedule as
given in the course handout, attend the lectures, and take all the prescribed evaluation components such as
Assignment/Quiz, Mid-Semester Test and Comprehensive Exam according to the evaluation scheme provided in
the handout.
