
Formal Model and Empirical Risk Minimization
Dr. Shahid Hussain (In-charge SE Program)
G42, Ground floor, CS dept., CIIT

Email: shahidhussain2003@yahoo.com
Contact#: 03339124427
Formal Model: The Statistical Learning Model
• Domain Set: X, the set of objects we wish to label
• Label Set: Y, e.g. {0, 1}
• Training Data: S = ((x1, y1), …, (xm, ym)), a finite sequence of labelled domain points
• Learner's Output: a prediction rule h : X → Y
• Aka predictor/hypothesis/classifier
• A(S) represents the hypothesis that a learning algorithm A returns upon receiving a training sequence S
• Simple Data-Generation Model
• Probability distribution D over X
Formal Model: The Statistical Learning Model (cont.)
• Labelling Function: represented as f : X → Y
• Conclusion: each pair (xi, yi) in the training data is generated by first sampling a point xi according to D and then labelling it by yi = f(xi).
• Measure of Success
• Error of a classifier h: the probability that it does not predict the correct label on a random data point
• The error of h is the probability of drawing a random instance x, according to the distribution D, such that h(x) does not equal f(x)
• Domain Subset: A ⊆ X
• Probability Distribution: D
• Assigning a Number: D(A) determines how likely it is to observe a point x ∈ A
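The generation model above (sample a point according to D, then label it with f) can be sketched in Python. The uniform distribution on [0, 1] and the threshold labelling function below are illustrative assumptions, not part of the formal model:

```python
import random

def f(x):
    """Hypothetical labelling function: label 1 iff x >= 0.5."""
    return 1 if x >= 0.5 else 0

def sample_training_data(m, seed=0):
    """Generate S = ((x1, y1), ..., (xm, ym)): each xi is drawn
    according to D (here, uniform on [0, 1]) and labelled by f."""
    rng = random.Random(seed)
    return [(x, f(x)) for x in (rng.random() for _ in range(m))]

S = sample_training_data(5)
```

Note that the learner sees only the pairs in S, never D or f themselves.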
Formal Model: The Statistical Learning Model (cont.)
• Aka: the event A can be expressed as a function π : X → {0, 1}, where π(x) = 1 iff x ∈ A
• Notation used to express the probability: D(A) = P_(x~D)[π(x)]
• So, the error of a prediction rule h : X → Y can be expressed as
L_(D,f)(h) := P_(x~D)[h(x) ≠ f(x)] := D({x : h(x) ≠ f(x)})
• L_(D,f)(h) has several names, such as the generalization error, the loss of the learner, the risk, and the true error of h
• Information Available to the Learner
• The learner is blind to the underlying distribution D over the world and to the labelling function f.
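Although the learner itself cannot compute L_(D,f)(h), the definition can be illustrated by estimating it with fresh samples when D and f are known to us. The particular D, f, and h below are assumptions for illustration (h disagrees with f exactly on [0.4, 0.5), so its true error under the uniform D is 0.1):

```python
import random

def f(x):
    """Hypothetical labelling function: label 1 iff x >= 0.5."""
    return 1 if x >= 0.5 else 0

def h(x):
    """A hypothetical prediction rule; it disagrees with f on [0.4, 0.5)."""
    return 1 if x >= 0.4 else 0

def estimate_true_error(h, f, n=100_000, seed=0):
    """Monte Carlo estimate of L_(D,f)(h) = P_(x~D)[h(x) != f(x)],
    with D uniform on [0, 1]."""
    rng = random.Random(seed)
    wrong = sum(h(x) != f(x) for x in (rng.random() for _ in range(n)))
    return wrong / n
```

The estimate should come out close to the true risk 0.1.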
Empirical Risk Minimization
• The learning algorithm receives as input a training set S, sampled from the unknown distribution D and labelled by the target function f, and should output a predictor h_S : X → Y.
• The goal of the algorithm is to find h_S that minimizes the error with respect to the unknown D and f.
• Since D and f are unknown, the true error cannot be computed.
• The error of the learner therefore appears as the training error:
L_S(h) := |{i ∈ [m] : h(xi) ≠ yi}| / m
• The terms empirical error and empirical risk are used interchangeably for the training error.
• A learning paradigm that comes up with a predictor h minimizing L_S(h) is called Empirical Risk Minimization (ERM).
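The ERM paradigm can be sketched over a small finite hypothesis class: compute L_S(h) for every candidate h and return a minimizer. The class of threshold classifiers and the tiny training set below are illustrative assumptions:

```python
def training_error(h, S):
    """L_S(h) = |{i : h(x_i) != y_i}| / m."""
    return sum(h(x) != y for x, y in S) / len(S)

def erm(hypotheses, S):
    """Return a hypothesis in the class that minimizes the empirical risk L_S."""
    return min(hypotheses, key=lambda h: training_error(h, S))

# Illustrative finite class: threshold classifiers at 0.0, 0.1, ..., 1.0.
def make_threshold(t):
    return lambda x: 1 if x >= t else 0

H = [make_threshold(t / 10) for t in range(11)]
S = [(0.1, 0), (0.3, 0), (0.6, 1), (0.9, 1)]
h_S = erm(H, S)
```

Here any threshold between 0.3 and 0.6 achieves zero training error, so the returned h_S fits S perfectly; whether it also has low true error is exactly the question ERM analysis addresses.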
