Machine Learning
Definitions:
- Arthur Samuel: machine learning is the field of study that gives computers the ability to learn without being explicitly programmed.
- Tom Mitchell: a computer learns from experience E with respect to some task T and performance measure P if its performance at T, as measured by P, improves with experience E.

Course tools: Octave / MATLAB.

Types of learning:
- Supervised learning: regression and classification.
- Unsupervised learning: the computer learns structure by itself.
- Others: reinforcement learning, recommender systems.
Linear regression:
- Hypothesis: a straight line, h_θ(x) = θ0 + θ1·x; the cost is quadratic in the errors.
- Notation: m = number of training examples, x = input variable, y = output variable.
Gradient descent algorithm:
- Start with some θ, e.g. θ = (0, 0).
- Keep changing θ to reduce J(θ), using the training examples, until convergence:

    repeat until convergence {
        θ_j := θ_j − α · ∂/∂θ_j J(θ)    (simultaneous update for all j)
    }
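The update rule above can be sketched in pure Python (the course itself uses Octave; the data, α, and iteration count here are made up for illustration):

```python
# Gradient descent for univariate linear regression: h(x) = t0 + t1*x.
# Illustrative data lying exactly on the line y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
m = len(xs)

t0, t1 = 0.0, 0.0       # start with theta = (0, 0)
alpha = 0.1             # learning rate (chosen by hand)

for _ in range(2000):   # "repeat until convergence"
    # Partial derivatives of J(t0, t1) = (1/2m) * sum (h(x) - y)^2
    d0 = sum((t0 + t1 * x - y) for x, y in zip(xs, ys)) / m
    d1 = sum((t0 + t1 * x - y) * x for x, y in zip(xs, ys)) / m
    # Simultaneous update of both parameters
    t0, t1 = t0 - alpha * d0, t1 - alpha * d1

print(round(t0, 3), round(t1, 3))   # → 1.0 2.0
```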
As we approach a local minimum, gradient descent automatically takes smaller steps, because the slope (and hence the derivative term) decreases. So even with a fixed α, the step size decreases.
Cost function for linear regression:

    J(θ0, θ1) = (1/2m) · Σ_{i=1..m} (h_θ(x^(i)) − y^(i))²,   with h_θ(x) = θ0 + θ1·x

J is a convex function, so gradient descent finds the global minimum. An alternative method: the normal equations.
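A minimal sketch of evaluating this cost function in Python (toy data for illustration):

```python
# Cost function J(t0, t1) = (1/2m) * sum_i (h(x_i) - y_i)^2
def cost(t0, t1, xs, ys):
    m = len(xs)
    return sum((t0 + t1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

xs = [1.0, 2.0, 3.0]
ys = [1.0, 2.0, 3.0]
print(cost(0.0, 1.0, xs, ys))   # perfect fit on y = x → 0.0
print(cost(0.0, 0.5, xs, ys))   # worse fit → larger cost
```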
Regression with multiple variables (multivariate):
- n = number of features; x^(i) = the features of the i-th training example.

Feature scaling:
- Get every feature into approximately the range −1 ≤ x_j ≤ 1.
- Mean normalization: replace x_j with (x_j − μ_j)/s_j, where μ_j is the mean and s_j is the range (max − min) or the standard deviation.
- (The bias feature x_0 = 1 is not scaled.)

The cost function and gradient descent are the same as before, just more general: h_θ(x) = θᵀx.
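Mean normalization as described above, sketched in Python (the house sizes are made up):

```python
# Mean normalization: x_j := (x_j - mu_j) / s_j, with s_j = max - min here.
def scale(values):
    mu = sum(values) / len(values)
    s = max(values) - min(values)        # could also use the std deviation
    return [(v - mu) / s for v in values]

sizes = [2104.0, 1416.0, 1534.0, 852.0]  # made-up house sizes
scaled = scale(sizes)
print([round(v, 3) for v in scaled])     # all values end up in [-0.5, 0.5]
```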
Debugging gradient descent:
- Plot J(θ) against the number of iterations; if gradient descent is working correctly, J(θ) should decrease on every iteration.
- Automatic convergence test: declare convergence if J(θ) decreases by less than 10^-3 in one iteration.
- If J(θ) increases or oscillates, α is probably too large.
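The automatic convergence test can be sketched like this (the 10^-3 threshold is from the notes; the data and α are made up):

```python
# Gradient descent that stops when J decreases by less than tol
# in one iteration (the "automatic convergence test").
def descend(xs, ys, alpha=0.1, tol=1e-3):
    m = len(xs)
    t0 = t1 = 0.0
    prev = float("inf")
    iters = 0
    while True:
        iters += 1
        d0 = sum(t0 + t1 * x - y for x, y in zip(xs, ys)) / m
        d1 = sum((t0 + t1 * x - y) * x for x, y in zip(xs, ys)) / m
        t0, t1 = t0 - alpha * d0, t1 - alpha * d1
        j = sum((t0 + t1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)
        if prev - j < tol:       # J barely decreased: declare convergence
            return t0, t1, iters
        prev = j

t0, t1, iters = descend([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
print(iters, round(t0, 2), round(t1, 2))
```

With a loose tolerance like 10^-3 the loop stops before the parameters fully settle, so a smaller `tol` gives a more accurate fit at the cost of more iterations.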
Polynomial regression:
- Create new features from powers of x, e.g. h_θ(x) = θ0 + θ1·x + θ2·x² + θ3·x³ by defining x1 = x, x2 = x², x3 = x³.
- These new features have very different ranges, so feature scaling becomes important.
- Features can also be combined: instead of x1 = frontage and x2 = depth, use a single feature x = frontage × depth (the area).
- Feature scaling isn't necessary when using the normal equation.
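Building the polynomial features described above, as a small Python sketch:

```python
# Turn a single input x into the polynomial features x, x^2, x^3.
# Their ranges differ wildly, which is why scaling matters here.
def poly_features(x, degree=3):
    return [x ** d for d in range(1, degree + 1)]

print(poly_features(10.0))   # → [10.0, 100.0, 1000.0]
```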
Gradient descent vs. normal equation:
- Gradient descent: need to choose α; needs many iterations; works well even when n (the number of features) is large.
- Normal equation: θ = (XᵀX)^(-1) Xᵀ y; no need to choose α; no iterations; but slow if n is very large, since computing (XᵀX)^(-1) is expensive.
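For the univariate case the normal equation reduces to a 2×2 linear system, which can be solved directly; a pure-Python sketch with made-up data (the course would use Octave's `pinv`):

```python
# Normal equation theta = (X^T X)^(-1) X^T y for h(x) = t0 + t1*x,
# solved with Cramer's rule on the 2x2 system (X^T X) theta = X^T y.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]        # exactly y = 2x + 1
m = len(xs)

# Entries of X^T X (the first column of X is all ones) and of X^T y
a, b = float(m), sum(xs)
c, d = sum(xs), sum(x * x for x in xs)
e, f = sum(ys), sum(x * y for x, y in zip(xs, ys))

det = a * d - b * c
t0 = (e * d - b * f) / det
t1 = (a * f - e * c) / det
print(t0, t1)   # → 1.0 2.0
```

No α and no iterations were needed, matching the comparison above.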
Logistic regression (classification):
- Hypothesis: h_θ(x) = g(θᵀx), where g(z) = 1/(1 + e^(−z)) is the sigmoid (logistic) function, so 0 ≤ h_θ(x) ≤ 1.
- Decision boundary: predict y = 1 when θᵀx ≥ 0.
- The squared-error cost is non-convex here; instead use

    Cost(h_θ(x), y) = −y·log(h_θ(x)) − (1−y)·log(1 − h_θ(x))

  which makes J(θ) convex.
- Gradient descent update (same form as for linear regression):

    θ_j := θ_j − α · (1/m) Σ_{i=1..m} (h_θ(x^(i)) − y^(i)) · x_j^(i)
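The sigmoid and the per-example logistic cost can be sketched as follows (z = 2.0 is an arbitrary illustrative value):

```python
import math

# Sigmoid hypothesis and the convex logistic cost for one example.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_one(h, y):
    return -y * math.log(h) - (1 - y) * math.log(1 - h)

h = sigmoid(2.0)                 # a fairly confident prediction for y = 1
print(round(h, 3))               # → 0.881
print(round(cost_one(h, 1), 3))  # low cost: prediction agrees with label
print(round(cost_one(h, 0), 3))  # high cost: prediction disagrees
```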
Multiclass classification: one-vs-all — train a separate logistic regression classifier h_θ^(k)(x) for each class k, then predict the class with the highest h.

Advanced optimization algorithms: conjugate gradient, BFGS, L-BFGS. They are faster and there is no need to manually pick α, but they are more complex. In the Octave library: fminunc (function minimization, unconstrained).
Regularization:
- Overfitting: the hypothesis fits the training examples but not a generalized relation, so it does poorly on new examples.
- Options to address it:
  1. Reduce the number of features (manually, or with a model selection algorithm).
  2. Regularization: keep all the features, but reduce the magnitude of the parameters θ_j.
- Regularized cost function for linear regression:

    J(θ) = (1/2m) [ Σ_{i=1..m} (h_θ(x^(i)) − y^(i))² + λ Σ_{j=1..n} θ_j² ]

- λ is the regularization parameter. If λ is too large, all θ_j (j ≥ 1) are driven towards 0 and the hypothesis underfits (approximately a horizontal line).
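The regularized cost for the univariate case can be sketched like this (data and λ values are made up; θ0 is deliberately excluded from the penalty, per the formula above):

```python
# Regularized linear-regression cost:
# J = (1/2m) * [ sum (h - y)^2 + lam * sum_{j>=1} theta_j^2 ]
def reg_cost(theta, xs, ys, lam):
    m = len(xs)
    sq = sum((theta[0] + theta[1] * x - y) ** 2 for x, y in zip(xs, ys))
    penalty = lam * theta[1] ** 2        # theta_0 is not regularized
    return (sq + penalty) / (2 * m)

xs, ys = [1.0, 2.0], [2.0, 4.0]
print(reg_cost([0.0, 2.0], xs, ys, lam=0.0))   # perfect fit, no penalty → 0.0
print(reg_cost([0.0, 2.0], xs, ys, lam=1.0))   # same fit, nonzero penalty → 1.0
```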
Regularized normal equation:

    θ = (XᵀX + λ·L)^(-1) Xᵀ y,   where L is the identity matrix with its top-left entry set to 0

Regularized gradient descent for linear regression:

    θ_0 := θ_0 − α · (1/m) Σ (h_θ(x^(i)) − y^(i)) · x_0^(i)
    θ_j := θ_j (1 − α·λ/m) − α · (1/m) Σ (h_θ(x^(i)) − y^(i)) · x_j^(i)    (j ≥ 1)
Regularized logistic regression:

    J(θ) = −(1/m) Σ_{i=1..m} [ y^(i) log(h_θ(x^(i))) + (1−y^(i)) log(1 − h_θ(x^(i))) ] + (λ/2m) Σ_{j=1..n} θ_j²

The gradient descent update has the same form as for regularized linear regression, with the logistic h_θ.
Neural networks:
- Θ = weights; each unit applies an activation function to its weighted inputs; the activation function is the sigmoid (logistic) function.
- Θ^(j) is the matrix of weights controlling the mapping from layer j to layer j+1. If there are s_j units in layer j and s_(j+1) units in layer j+1, then Θ^(j) has dimension s_(j+1) × (s_j + 1).
- Layers: input layer, hidden layer(s), output layer.
Vectorized implementation (forward propagation):

    z^(2) = Θ^(1) · x
    a^(2) = g(z^(2)),  then add the bias unit a_0^(2) = 1
    z^(3) = Θ^(2) · a^(2)
    h_Θ(x) = a^(3) = g(z^(3))

Notation: L = number of layers; s_l = number of units in layer l (without the bias unit); K = number of output units.
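Forward propagation for a tiny 2-input, 2-hidden-unit, 1-output network, as a Python sketch; the weight values are made up for illustration:

```python
import math

# Forward propagation for a 3-layer network (steps as above).
def g(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(theta1, theta2, x):
    a1 = [1.0] + x                                   # input layer + bias
    z2 = [sum(w * a for w, a in zip(row, a1)) for row in theta1]
    a2 = [1.0] + [g(z) for z in z2]                  # add bias unit a0 = 1
    z3 = [sum(w * a for w, a in zip(row, a2)) for row in theta2]
    return [g(z) for z in z3]                        # h(x) = a3

theta1 = [[-1.0, 2.0, 2.0],   # 2x3: maps layer 1 (+bias) to layer 2
          [ 3.0, -2.0, -2.0]]
theta2 = [[-2.0, 3.0, 1.0]]   # 1x3: maps layer 2 (+bias) to the output
out = forward(theta1, theta2, [1.0, 0.0])[0]
print(round(out, 3))
```

Note the matrix dimensions match the rule above: Θ^(1) is s_2 × (s_1 + 1) = 2×3 and Θ^(2) is s_3 × (s_2 + 1) = 1×3.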
Regularized cost function for neural-network classification (a generalization of logistic regression):

    J(Θ) = −(1/m) Σ_{i=1..m} Σ_{k=1..K} [ y_k^(i) log((h_Θ(x^(i)))_k) + (1−y_k^(i)) log(1 − (h_Θ(x^(i)))_k) ]
           + (λ/2m) Σ_{l=1..L−1} Σ_{i=1..s_l} Σ_{j=1..s_(l+1)} (Θ_ji^(l))²
Backpropagation algorithm:
- δ_j^(l) = "error" of node j in layer l.
- Output layer: δ^(L) = a^(L) − y.
- Hidden layers: δ^(l) = (Θ^(l))ᵀ δ^(l+1) .* g'(z^(l)), where .* is the element-wise product and g'(z^(l)) = a^(l) .* (1 − a^(l)).
- Partial derivatives (ignoring regularization): ∂J/∂Θ_ij^(l) = a_j^(l) · δ_i^(l+1).
- Algorithm: set Δ^(l) = 0 for all l; for each training example run forward propagation, then backpropagation, accumulating Δ^(l) := Δ^(l) + δ^(l+1) (a^(l))ᵀ; finally D^(l) = (1/m) Δ^(l), plus (λ/m) Θ^(l) for j ≠ 0.
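The derivative identity g'(z) = a·(1−a) used in the hidden-layer step can be verified directly; a small sketch (z = 0.7 is an arbitrary test point):

```python
import math

# The derivative used in backprop: g'(z) = g(z) * (1 - g(z)),
# checked against a numerical derivative.
def g(z):
    return 1.0 / (1.0 + math.exp(-z))

def g_prime(z):
    return g(z) * (1.0 - g(z))

z, eps = 0.7, 1e-5
numeric = (g(z + eps) - g(z - eps)) / (2 * eps)
print(abs(numeric - g_prime(z)) < 1e-8)   # → True
```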
Unrolling parameters: the optimization algorithms expect the parameters and gradients as a single vector, so the weight matrices must be unrolled into a vector and reshaped back when needed.

Gradient checking: approximate each partial derivative numerically,

    ∂J/∂θ_j ≈ (J(θ + ε·e_j) − J(θ − ε·e_j)) / (2ε),

and compare with the gradient D computed by backpropagation.
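The same two-sided check, sketched on a simple one-parameter cost whose analytic gradient is known (the cost function here is made up for illustration):

```python
# Numerical gradient check on the toy cost J(t) = (t - 3)^2,
# whose analytic gradient is 2 * (t - 3).
def J(t):
    return (t - 3.0) ** 2

def analytic_grad(t):
    return 2.0 * (t - 3.0)

t, eps = 1.5, 1e-4
numeric = (J(t + eps) - J(t - eps)) / (2 * eps)
print(round(numeric, 4), analytic_grad(t))   # the two should agree closely
```

In a neural network the same comparison is done entry-by-entry against the backprop gradient D, then turned off, since the numerical version is very slow.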
Network architecture: a reasonable default is one hidden layer; if there is more than one hidden layer, having the same number of units in each is preferable.
Training a neural network: randomly initialize the weights, implement forward propagation to get h_Θ(x), compute the cost function, implement backpropagation to get the partial derivatives, verify them with gradient checking, then minimize J(Θ).

Things to try when the algorithm performs poorly:
- Get more training examples.
- Try a smaller set of features, or get additional features.
- Try adding polynomial features.
- Try changing λ (decreasing or increasing it) to address overfitting.
Test-set error — bias vs. variance:
- Split the data into a training set, a cross-validation set, and a test set.
- Training error J_train(θ) and cross-validation error J_cv(θ). For classification, also the misclassification error: err(h_θ(x), y) = 1 if the prediction is wrong, 0 otherwise; the test error is the average of err over the test set.
- High bias (underfitting): J_train(θ) is high and J_cv(θ) ≈ J_train(θ).
- High variance (overfitting): J_train(θ) is low and J_cv(θ) ≫ J_train(θ).
- Model selection: pick the model with the best cross-validation error, then estimate generalization with J_test(θ).
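The misclassification error defined above is straightforward to sketch (the predictions and labels are made up):

```python
# Misclassification error: err = 1 if the prediction is wrong, 0 otherwise;
# the reported error is the average of err over the evaluation set.
def misclassification_error(preds, labels):
    errs = [0 if p == y else 1 for p, y in zip(preds, labels)]
    return sum(errs) / len(errs)

preds  = [1, 0, 1, 1, 0]
labels = [1, 0, 0, 1, 0]       # one mistake out of five
print(misclassification_error(preds, labels))   # → 0.2
```

Computed on the training, cross-validation, and test sets respectively, this gives the J_train, J_cv, and J_test values used for the bias/variance diagnosis.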