00 PyTorch and Deep Learning Fundamentals
Machine Learning vs Deep Learning

[Diagram: deep learning is a subset of machine learning]

A cooking analogy:

• Traditional programming: you supply the inputs (ingredients) and hand-write
  the rules (the recipe), and the program produces the output (a finished dish):
  1. Cut vegetables
  2. Season chicken
  3. Preheat oven
  4. Cook chicken for 30 minutes
  5. Add vegetables

• Machine learning algorithm: you supply the inputs and the desired output,
  and the algorithm (typically) figures out the rules (the recipe) itself.
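A toy sketch (my own illustration, not from the slides) of the "traditional programming" side, where the recipe's rules are written out by hand:

```python
# Traditional programming: we hand-write the rules (the recipe) that map
# inputs (ingredients) to an output (a finished dish).
def cook(ingredients):
    steps_done = []
    steps_done.append("cut " + ingredients["vegetables"])  # 1. Cut vegetables
    steps_done.append("season " + ingredients["protein"])  # 2. Season chicken
    steps_done.append("preheat oven")                      # 3. Preheat oven
    steps_done.append("cook for 30 minutes")               # 4. Cook chicken for 30 minutes
    steps_done.append("add vegetables")                    # 5. Add vegetables
    return steps_done

print(cook({"vegetables": "carrots", "protein": "chicken"}))
```

A machine learning algorithm flips this around: it starts from inputs and desired outputs and infers the steps itself.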
What deep learning is not good for 🤖🚫
• When you need explainability: the patterns learned by a deep learning model are typically uninterpretable by a human.
Deep Learning

• Machine learning algorithm: gradient boosted machine
• Deep learning algorithm: neural network
  (choose the appropriate neural network for your problem)

Example: “Hey Siri, what’s the weather today?”

[Diagram: Inputs → Numerical encoding → Learns representation
(patterns/features/weights) → Representation outputs → Outputs]
Note: “patterns” is an arbitrary term; you’ll often hear “embedding”, “weights”, “feature representation” or “feature vectors”, all referring to similar things.
Types of Learning
(some) Deep Learning Use Cases
What is PyTorch?

• The most popular research deep learning framework*
• Write fast deep learning code in Python (able to run on a GPU/many GPUs)
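As an illustrative sketch (not code from the slides), running PyTorch code on a GPU when one is available typically looks like this:

```python
import torch

# Pick the best available device: a CUDA GPU if present, otherwise the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Tensors (and models) are moved to the chosen device with .to(device).
x = torch.rand(3, 3)
x = x.to(device)
print(x.device)
```

Written this way, the same code runs unchanged on machines with or without a GPU.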
Why PyTorch?

• Research favourite

*Source: paperswithcode.com/trends, February 2022
PyTorch and tensors

(before data gets used with an algorithm, it needs to be turned into numbers)

These are tensors!

[Diagram: input tensor (e.g. image pixel values)
[[116, 78, 15],
 [117, 43, 96],
 [125, 87, 23],
 …]
→ output tensor (e.g. prediction probabilities)
[[0.983, 0.004, 0.013],
 [0.110, 0.889, 0.001],
 [0.023, 0.027, 0.985],
 …]
→ human-readable outputs, e.g. “Ramen, Spaghetti”, “Not spam”]
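A minimal sketch of turning such numbers into PyTorch tensors (the values are the illustrative ones from the slide):

```python
import torch

# Image-like input data (e.g. RGB pixel values) represented as a tensor.
image = torch.tensor([[116, 78, 15],
                      [117, 43, 96],
                      [125, 87, 23]])

# Model outputs are also tensors, e.g. per-class probabilities per row.
probs = torch.tensor([[0.983, 0.004, 0.013],
                      [0.110, 0.889, 0.001],
                      [0.023, 0.027, 0.985]])

print(image.shape)          # torch.Size([3, 3])
print(probs.argmax(dim=1))  # index of the most likely class in each row
```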
• PyTorch basics & fundamentals (dealing with tensors and tensor operations)
• Later:
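As a hedged preview of what "tensors and tensor operations" look like (my own example, not code from the course materials):

```python
import torch

# Create tensors.
a = torch.tensor([1, 2, 3])
b = torch.tensor([4, 5, 6])

# Element-wise operations.
print(a + b)  # tensor([5, 7, 9])
print(a * b)  # tensor([ 4, 10, 18])

# Aggregation and the dot product.
print((a * b).sum())    # tensor(32)
print(torch.dot(a, b))  # tensor(32), same result
```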
👩‍🍳 👩‍🔬 (we’ll be cooking up lots of code!)
How:

Motto #1: if in doubt, run the code!

1. Code along
2. Explore and experiment
3. Visualize what you don’t understand
4. Ask questions (including the “dumb” ones)
5. Do the exercises 🛠
6. Share your work 🤗

Avoid: 🔥🧠 “I can’t learn ______”
This course: Resources

• Course materials: https://www.github.com/mrdbourke/pytorch-deep-learning/
• Course Q&A: https://www.github.com/mrdbourke/pytorch-deep-learning/discussions
• Course online book: https://learnpytorch.io
tensor([[[1, 2, 3],
         [3, 6, 9],
         [2, 4, 5]]])

torch.Size([1, 3, 3])

(dim=1 indexes the rows 0, 1, 2; dim=2 indexes the columns 0, 1, 2)
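A runnable sketch of the shape and dimension idea above:

```python
import torch

x = torch.tensor([[[1, 2, 3],
                   [3, 6, 9],
                   [2, 4, 5]]])

print(x.shape)  # torch.Size([1, 3, 3])
print(x.ndim)   # 3

# dim=1 runs over the rows, dim=2 over the columns.
print(x[0, 1])     # tensor([3, 6, 9]), the second row
print(x[0, :, 2])  # tensor([3, 9, 5]), the third column
```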
Dot product (matrix multiplication)

Multiplying a 3x3 matrix by a 3x2 matrix gives a 3x2 matrix. Each output
element is the dot product of a row of the first matrix with a column of
the second; for the first row:

[[A, B, C], …] @ [[J, K], [L, M], [N, O]] = [[A*J + B*L + C*N, A*K + B*M + C*O], …]

torch.matmul(  [[5, 0, 3],     [[4, 7],        [[ 44,  38],
                [3, 7, 9],  ,   [6, 8]  )  =    [126,  86],
                [3, 5, 2]]      [8, 1]]         [ 58,  63]]
                  3x3             3x2              3x2

e.g. the top-left element: 5*4 + 0*6 + 3*8 = 20 + 0 + 24 = 44
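The worked example above can be checked directly with torch.matmul:

```python
import torch

A = torch.tensor([[5, 0, 3],
                  [3, 7, 9],
                  [3, 5, 2]])
B = torch.tensor([[4, 7],
                  [6, 8],
                  [8, 1]])

C = torch.matmul(A, B)  # equivalently: A @ B
print(C)
# tensor([[ 44,  38],
#         [126,  86],
#         [ 58,  63]])
print(C.shape)  # torch.Size([3, 2])
```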
Tensor attributes
Attribute Meaning Code
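The table is cut off here; as a hedged sketch, the attributes most often listed in this context are shape, dtype and device, which can be read straight off any tensor:

```python
import torch

t = torch.rand(3, 4)

# Common tensor attributes:
print(t.shape)   # shape of the tensor, e.g. torch.Size([3, 4])
print(t.dtype)   # data type of the elements, e.g. torch.float32
print(t.device)  # device the tensor lives on, e.g. cpu
```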