
NPTEL

Video Course on Machine Learning

Professor Carl Gustaf Jansson, KTH

Week 6 Machine Learning based on Artificial Neural Networks

Video 6.9 Deep Learning and further developments


Deep Learning
A Climax or an Anticlimax?

Result-wise a Climax
What is normally referred to as Deep Learning has so far produced more success stories than other parts of Machine Learning.

Rhetorically an Anticlimax
This week we have already talked about almost everything that is of relevance for the concept of Deep Learning. Not much more to add!
Deep Learning - important points on the time-line
1959 Arthur Samuel coined the term Machine Learning. Since the beginning, Machine Learning has developed hand in hand with other subareas of Artificial Intelligence.

1957 Pioneering work on ANN by Frank Rosenblatt: the Perceptron

1986 The revival of ANN by Rumelhart and Hinton

1986 Rina Dechter introduced the term Deep Learning for multilayered symbolic machine learning,
NOT in the context of ANNs.

1986-1993 Introduction of RNN

1987-1989 Introduction of CNN

2000 The term Deep Learning was first used in the ANN context by Igor Aizenberg and colleagues.

2001 The systematic use of specialized hardware (GPUs) for ANN, RNN and CNN.

2012 The use of the term took off after the AlexNet breakthrough in the ImageNet challenge by Hinton, Krizhevsky and Sutskever.

2018 Dominates as a term referring to work on ANN in Machine Learning.


Deep Learning

Yann LeCun, Yoshua Bengio and Geoffrey Hinton

Review in NATURE | VOL 521 | 28 MAY 2015


https://creativecoding.soe.ucsc.edu/courses/cs523/slides/week3/DeepLearning_LeCun.pdf
[Diagram: Feed-Forward Multiple-Layer Artificial Neural Networks (ANN) with Backpropagation; Convolutional Neural Networks (CNN); Recurrent Neural Networks (RNN)]

"Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction.
These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics.
Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer.
Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech."
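The quoted passage describes the core mechanism: stacked layers each compute a representation from the previous layer's representation, and backpropagation indicates how the internal parameters should change. As a minimal sketch of this idea (not taken from the lecture), the following Python/NumPy code trains a feed-forward network with one hidden layer by backpropagation on the toy XOR task; the layer sizes, learning rate and iteration count are illustrative assumptions.

# Minimal sketch: a feed-forward network with one hidden layer, trained by
# backpropagation on the XOR toy task. All hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: each layer computes its representation from the previous one.
    h = sigmoid(X @ W1 + b1)       # hidden representation
    out = sigmoid(h @ W2 + b2)     # network output

    # Backward pass: propagate the error to obtain gradients of the squared loss.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update of the internal parameters.
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]

The same forward/backward pattern carries over to the convolutional and recurrent architectures mentioned in the review; only the layer types and the data change.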
Preferred Terminology

[Diagram relating Deep Learning (the combination of ANN, RNN and CNN) to Computer Science, Artificial Intelligence, Machine Learning, Data Science and Big Data]
Deep Learning – Recent Developments
- Learning of Structures and Features in Representations
- Re-engineering of Symbolic descriptions from learned ANN/RNN/CNN – Disentangled Representations
- Integration of Associative Memory approaches with ANN/RNN/CNN, e.g. Deep Belief Networks
- Integration of Bayesian network approaches with ANN/RNN/CNN
- Further use of ANN/RNN/CNN for the purpose of Reinforcement Learning
- Scaling up of applications in time-critical and safety-critical settings such as self-driving vehicles
- Handling of really Big Data
- Enhanced utilization of specialized hardware
- Consolidation and open access to toolboxes and software support systems (see the sketch after this list)
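As a small illustration of the last point, the sketch below shows how a modern open toolbox lets a convolutional network be declared in a few lines. PyTorch is used here only as an example, and the architecture (two convolutional blocks followed by a linear classifier for 28x28 grey-scale images) is an illustrative assumption, not something prescribed by the lecture.

# Minimal sketch: declaring a small CNN with an open toolbox (PyTorch as an example).
# The architecture is illustrative; it maps 28x28 grey-scale images to 10 class scores.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # first convolutional block
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 28x28 -> 14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # second convolutional block
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                    # linear classifier head
)

x = torch.randn(4, 1, 28, 28)   # a dummy batch of four grey-scale images
print(model(x).shape)           # torch.Size([4, 10])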


NPTEL

Video Course on Machine Learning

Professor Carl Gustaf Jansson, KTH

Thanks for your attention!

The next lecture 6.10 will be on the topic:

Tutorial on assignments
