Linear regression with one variable
Model representation

Housing prices (Portland, OR)

Supervised learning: the "right answer" is given for each example in the data.
Regression problem: predict a real-valued output.

How do we represent h?
    h_θ(x) = θ_0 + θ_1 x
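A minimal sketch of this hypothesis in Python (the parameter values below are arbitrary, chosen only for illustration):

```python
def h(theta0, theta1, x):
    """Hypothesis for linear regression with one variable:
    h_theta(x) = theta0 + theta1 * x."""
    return theta0 + theta1 * x

# With theta0 = 50 and theta1 = 0.1, a 2104 ft^2 house would be
# predicted at 50 + 0.1 * 2104 = 260.4 (price in $1000's).
print(h(50, 0.1, 2104))
```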
Notation:
    m              = number of training examples
    x              = "input" variable / features
    y              = "output" variable / "target" variable
    (x, y)         = one training example
    (x^(i), y^(i)) = the i-th training example

Training set of housing prices (Portland, OR):

    Size in feet^2 (x) | Price ($) in 1000's (y)
    2104               | 460
    1416               | 232
    1534               | 315
    852                | 178
    ...                | ...
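This training set can be held in a pair of NumPy arrays; x^(1) is then `x[0]` (a sketch using only the first rows of the table above):

```python
import numpy as np

# Training set: size in feet^2 (x) and price in $1000's (y).
x = np.array([2104, 1416, 1534, 852], dtype=float)
y = np.array([460, 232, 315, 178], dtype=float)

m = len(x)          # m = number of training examples
print(m)            # 4 here (the full data set has more rows)
print(x[0], y[0])   # the first training example (x^(1), y^(1))
```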
Linear regression with one variable
Cost function

Hypothesis:   h_θ(x) = θ_0 + θ_1 x
Parameters:   θ_0, θ_1

How do we choose θ_0, θ_1?

Idea: choose θ_0, θ_1 so that h_θ(x) is close to y for our
training examples (x, y).

Cost function (squared error):
    J(θ_0, θ_1) = (1/2m) Σ_{i=1..m} (h_θ(x^(i)) − y^(i))²
Goal:  minimize J(θ_0, θ_1) over θ_0, θ_1
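The squared-error cost can be computed directly from this definition. A sketch with made-up toy data (points lying exactly on y = 2x, so the cost at (θ_0, θ_1) = (0, 2) is zero):

```python
import numpy as np

def J(theta0, theta1, x, y):
    """Squared-error cost: J = (1/2m) * sum_i (h(x^(i)) - y^(i))^2."""
    m = len(x)
    h = theta0 + theta1 * x          # predictions for all m examples at once
    return np.sum((h - y) ** 2) / (2 * m)

# Toy data lying exactly on y = 2x.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
print(J(0.0, 2.0, x, y))   # 0.0 -- perfect fit
print(J(0.0, 0.0, x, y))   # (1/6)*(4 + 16 + 36) ≈ 9.333
```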
Simplified cost function

Hypothesis:  set θ_0 = 0, so h_θ(x) = θ_1 x
    J(θ_1) = (1/2m) Σ_{i=1..m} (θ_1 x^(i) − y^(i))²
Goal:  minimize J(θ_1) over θ_1
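With θ_0 fixed at 0, J becomes a bowl-shaped function of the single parameter θ_1. A sketch that sweeps θ_1 over toy data on the line y = x (so the minimum sits at θ_1 = 1):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])   # data lying exactly on y = x

def J(theta1):
    """Simplified cost with theta0 fixed at 0: h(x) = theta1 * x."""
    return np.sum((theta1 * x - y) ** 2) / (2 * len(x))

# J(theta1) is symmetric around its minimum at theta1 = 1.
for t in [0.0, 0.5, 1.0, 1.5, 2.0]:
    print(t, J(t))
```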
Gradient descent: the learning rate α

    θ_1 := θ_1 − α (d/dθ_1) J(θ_1)

If α is too small, gradient descent can be slow.
If α is too large, gradient descent can overshoot the minimum. It may
fail to converge, or even diverge.
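Both failure modes are easy to see on a toy cost I've picked for illustration, J(θ) = θ² (so dJ/dθ = 2θ): a small α shrinks θ toward the minimum at 0, while a too-large α makes each step overshoot and |θ| grow without bound.

```python
def step(theta, alpha):
    """One gradient-descent step on J(theta) = theta^2 (dJ/dtheta = 2*theta)."""
    return theta - alpha * 2 * theta

def run(alpha, steps=20, theta=1.0):
    for _ in range(steps):
        theta = step(theta, alpha)
    return theta

print(abs(run(0.1)))   # small alpha: |theta| shrinks toward the minimum
print(abs(run(1.1)))   # too-large alpha: each step overshoots; |theta| grows
```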
Gradient descent can converge to a local minimum, even with the
learning rate α held fixed. As we approach a local minimum, the
derivative gets smaller, so gradient descent will automatically take
smaller steps; there is no need to decrease α over time.
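This shrinking-step behavior can be seen numerically on the same illustrative cost J(θ) = θ²: with α fixed, the step taken is α · dJ/dθ = α · 2θ, which decreases on its own as θ nears the minimum.

```python
alpha = 0.1   # fixed learning rate
theta = 4.0

# Each printed step is smaller than the last, even though alpha never changes.
for _ in range(5):
    step_size = alpha * 2 * theta   # alpha * dJ/dtheta for J(theta) = theta^2
    theta = theta - step_size
    print(step_size)
```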
Linear regression with one variable
Gradient descent for linear regression

Gradient descent algorithm:
repeat until convergence {
    θ_j := θ_j − α (∂/∂θ_j) J(θ_0, θ_1)      (for j = 0 and j = 1)
}
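For the squared-error cost, the two partial derivatives work out to (1/m) Σ (h_θ(x^(i)) − y^(i)) and (1/m) Σ (h_θ(x^(i)) − y^(i)) x^(i). A minimal sketch of the resulting algorithm, run on synthetic data I generated from y = 3 + 2x (the data, α, and iteration count are illustrative choices, not part of the original):

```python
import numpy as np

def gradient_descent(x, y, alpha=0.1, iters=5000):
    """Batch gradient descent for h(x) = theta0 + theta1 * x.

    Each iteration updates theta0 and theta1 simultaneously using the
    partial derivatives of J(theta0, theta1) over all m examples.
    """
    m = len(x)
    theta0, theta1 = 0.0, 0.0
    for _ in range(iters):
        err = theta0 + theta1 * x - y              # h(x^(i)) - y^(i), all i
        temp0 = theta0 - alpha * np.sum(err) / m
        temp1 = theta1 - alpha * np.sum(err * x) / m
        theta0, theta1 = temp0, temp1              # simultaneous update
    return theta0, theta1

# Synthetic data from y = 3 + 2x; the fit should recover roughly (3, 2).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 3.0 + 2.0 * x
t0, t1 = gradient_descent(x, y)
print(round(t0, 3), round(t1, 3))
```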
Gradient descent algorithm for J(θ_0, θ_1): update θ_0 and θ_1
simultaneously:
    temp0 := θ_0 − α (∂/∂θ_0) J(θ_0, θ_1)
    temp1 := θ_1 − α (∂/∂θ_1) J(θ_0, θ_1)
    θ_0 := temp0
    θ_1 := temp1
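The temp variables matter: both updates must be computed from the old parameter values before either is assigned. A sketch on a toy cost of my own choosing, J(θ_0, θ_1) = θ_0² + θ_0·θ_1, so ∂J/∂θ_0 = 2θ_0 + θ_1 and ∂J/∂θ_1 = θ_0:

```python
alpha = 0.1
theta0, theta1 = 1.0, 1.0

# Correct (simultaneous): both gradients use the OLD values, then assign.
temp0 = theta0 - alpha * (2 * theta0 + theta1)   # 1 - 0.1 * 3 = 0.7
temp1 = theta1 - alpha * theta0                  # 1 - 0.1 * 1 = 0.9
theta0, theta1 = temp0, temp1
print(theta0, theta1)   # 0.7 0.9

# Incorrect (sequential) would update theta0 first, so theta1's gradient
# would see the new theta0 = 0.7, giving 1 - 0.1 * 0.7 = 0.93 instead.
```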
[Figure: successive iterations of gradient descent. Left panels: the
hypothesis h_θ(x) fit to the training data; right panels: the contour
plot of J(θ_0, θ_1), with the current (θ_0, θ_1) stepping toward the
minimum.]
"Batch" gradient descent

"Batch": each step of gradient descent uses all m training examples,
i.e. each update sums Σ_{i=1..m} (h_θ(x^(i)) − y^(i)) over the entire
training set.