ML Unit 3 MID1
P(A|B) – Posterior
P(B|A) – Likelihood
P(B) – Marginal
P(A) – Prior
P(A|B)·P(B) = P(A∩B) – Eq 1
P(B|A)·P(A) = P(B∩A) – Eq 2
Since P(A∩B) = P(B∩A), equating Eq 1 and Eq 2:
P(A|B)·P(B) = P(B|A)·P(A)
P(A|B) = P(B|A)·P(A) / P(B)
Bayes Theorem
A – Hypothesis
B – Given data (training examples)
P(A|B) = Probability of the hypothesis given the training examples (posterior).
P(B|A) = Probability of observing the given data when the hypothesis is true (likelihood).
P(A) = Probability of the hypothesis before observing the given data (prior).
P(B) = Probability of the given data (marginal).
Bayes' theorem calculates the probability of each possible hypothesis and outputs the most probable one.
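The step above can be sketched in code. This is a minimal illustration, not from the notes: the hypothesis names, priors, and likelihoods below are made-up numbers chosen only to show the computation.

```python
# Illustrative priors P(A) and likelihoods P(B|A) for two hypotheses.
# All numbers are made up for demonstration.
priors = {"h1": 0.6, "h2": 0.4}
likelihoods = {"h1": 0.3, "h2": 0.9}

# P(B): marginal probability of the data, summed over hypotheses.
marginal = sum(priors[h] * likelihoods[h] for h in priors)

# P(A|B): posterior for each hypothesis via Bayes' theorem.
posteriors = {h: priors[h] * likelihoods[h] / marginal for h in priors}

# Output the most probable hypothesis.
best = max(posteriors, key=posteriors.get)
print(best, posteriors[best])
```

Here h2 wins even though its prior is lower, because the data are much more likely under it.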
h_MAP (maximum a posteriori) and h_ML (maximum likelihood) hypotheses:
h_MAP = argmax over h in H of P(D|h)·P(h)
h_ML = argmax over h in H of P(D|h)
Symbols:
p – probability density function
Π – product (taken over the training examples)
µ – mean
σ – standard deviation
X – variable or input
For a continuous-valued X, the density is the Gaussian p(X) = (1/√(2πσ²))·e^(−(X−µ)²/(2σ²)).
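A short sketch of the density behind the symbols above (p, µ, σ, X). The function name and test values are illustrative assumptions, not from the notes.

```python
import math

def normal_pdf(x, mu, sigma):
    # Gaussian probability density: p(x) = (1/sqrt(2*pi*sigma^2)) * exp(-(x-mu)^2 / (2*sigma^2))
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# Density of a standard normal (mu=0, sigma=1) at its mean.
print(normal_pdf(0.0, 0.0, 1.0))  # ≈ 0.3989
```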
Due to the negative symbol, argmax is changed to argmin:
h_MAP = argmin over h in H of [−log₂ P(D|h) − log₂ P(h)]
• A minimum-length/short hypothesis is required (−log₂ of a probability can be read as a description length in bits).
• To convert max to min, the negative symbol is added.
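The max-to-min conversion can be checked numerically. A minimal sketch, reusing the same made-up priors and likelihoods as before (illustrative numbers, not from the notes):

```python
import math

# Made-up P(h) and P(D|h) for two candidate hypotheses.
priors = {"h1": 0.6, "h2": 0.4}
likelihoods = {"h1": 0.3, "h2": 0.9}

# h_MAP as an argmax over P(D|h)*P(h) ...
h_map_max = max(priors, key=lambda h: likelihoods[h] * priors[h])

# ... and as an argmin over the negative log form (description lengths in bits).
h_map_min = min(priors, key=lambda h: -math.log2(likelihoods[h]) - math.log2(priors[h]))

# Both formulations select the same hypothesis.
print(h_map_max, h_map_min)
```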
Let us consider the problem of designing a code C to transmit messages drawn at random from a set D, where the probability of drawing the i-th message is p_i. While transmitting, we want a code that minimises the expected number of bits. To do this we should assign shorter codes to the more probable messages. We represent the description length of message i with respect to C as L_C(i).