machine learning - Bayes Classifier
Hello, I'm totally confused about deriving the Bayes classifier. Normally, given a problem like the one above with a number of red dots and green dots, you calculate the feature distribution f(x). How can I do the same when the distribution is binary, as in the graph below?
Here the class variable is y ∈ {red, blue} and the feature variable is x ∈ (-4, 4). The joint distribution p(x, y) is shown in the plot (as p(x, y=blue) and p(x, y=red)). Now, how can I derive and plot the feature distribution p(x)?
Intuitively, it's not hard to see that p(x)
depends on p(y)
. For example, if p(y=blue) = 1
, then
p(x) = p(x | y=blue)
(in words: if you know y is blue, the probability density plot of x
is exactly the blue plot you posted). Similarly, if p(y=red) = 1
, then
p(x) = p(x | y=red)
Since y
is a binary class variable, its distribution can be specified by a single probability, p(y=blue) = p
, since that implies p(y=red) = q = 1 - p
.
Given the result above, it's not hard to believe that if p(y=blue)
is anything other than 1, p(x)
should be a mixture of p(x | y=blue)
and p(x | y=red)
. In fact, it makes sense that it should be a linear mixture:
p(x) = p * p(x | y=blue) + q * p(x | y=red)
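As a quick sketch of this mixture formula: the Gaussian class-conditionals and their means below are my own assumptions (the question's plot isn't available), but the mixture logic is exactly the formula above.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and std sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_x(x, p_blue, mu_blue=-1.0, mu_red=1.0, sigma=1.0):
    """Marginal p(x) = p * p(x|y=blue) + q * p(x|y=red), with q = 1 - p.

    The class-conditionals are assumed Gaussian here purely for illustration.
    """
    q_red = 1.0 - p_blue
    return (p_blue * gaussian_pdf(x, mu_blue, sigma)
            + q_red * gaussian_pdf(x, mu_red, sigma))
```

Note that with p_blue = 1 this reduces to p(x | y=blue), matching the special case discussed above, and the mixture still integrates to 1 for any p between 0 and 1.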
You can prove this using Bayes' theorem:
p(x) * p(y=blue | x) = p(y=blue) * p(x | y=blue)
p(x) * p(y=red | x) = p(y=red) * p(x | y=red)
Adding the two lines together:
p(x) * [p(y=blue | x) + p(y=red | x)] = p(y=blue) * p(x | y=blue) + p(y=red) * p(x | y=red)
Since y
must be either red or blue, p(y=blue | x) + p(y=red | x)
must equal 1, so the bracketed expression drops out and you get:
p(x) = p(y=blue) * p(x | y=blue) + p(y=red) * p(x | y=red)
p(x) = p * p(x | y=blue) + q * p(x | y=red)
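The derivation above can be checked numerically. The Gaussian likelihoods and the prior values below are illustrative assumptions; the assertions verify the two Bayes identities sum to the law of total probability and that the posteriors sum to 1.

```python
import math

def norm_pdf(x, mu, sigma=1.0):
    """Normal density; stands in for the class-conditional p(x | y)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# assumed priors and evaluation point (not from the original plot)
p_blue, p_red = 0.3, 0.7
x = 0.5
lik_blue = norm_pdf(x, -1.0)   # p(x | y=blue)
lik_red = norm_pdf(x, 1.0)     # p(x | y=red)

# law of total probability: p(x) = p * p(x|blue) + q * p(x|red)
px = p_blue * lik_blue + p_red * lik_red

# Bayes' theorem: p(y | x) = p(y) * p(x | y) / p(x)
post_blue = p_blue * lik_blue / px
post_red = p_red * lik_red / px

# posteriors over the two classes sum to 1
assert abs(post_blue + post_red - 1.0) < 1e-12
# adding the two Bayes identities recovers p(x)
assert abs(px * post_blue + px * post_red - px) < 1e-12
```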