machine learning - Bayes Classifier


[figure omitted]

Hello, I'm totally confused about deriving the Bayes classifier. Normally, as in problem 1 above, given the number of red dots and green dots you can calculate the feature distribution f(x). How can I do the same for the binary distribution in the graph below?

[figure omitted]

Here the class variable is y ∈ {red, blue} and the feature variable is x ∈ (-4, 4). The joint distribution p(x, y) is shown in the plot (p(x, y=blue) and p(x, y=red)). Now, how can I derive and plot the feature distribution p(x)?

Intuitively, it's not hard to see that p(x) depends on p(y). For example, if p(y=blue) = 1, then

p(x) = p(x | y=blue) 

(in words, if you know y is blue, the probability density of x is the given-blue plot you posted). Similarly, if p(y=red) = 1, then

p(x) = p(x | y=red) 

Since y is a binary class variable, its distribution can be specified by a single probability, p(y=blue) = p, since that implies p(y=red) = q = 1 - p.

Given the result above, it's not hard to believe that if p(y=blue) is anything other than 1, p(x) should be a mixture of p(x | y=blue) and p(x | y=red). In fact, it makes sense that it should be a linear mixture:

p(x) = p * p(x | y=blue) + q * p(x | y=red) 
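A minimal sketch of this mixture, assuming (hypothetically, since the plotted shapes aren't given) Gaussian class-conditionals p(x | y=blue) ~ N(-1, 1) and p(x | y=red) ~ N(1, 1):

```python
import math

def normal_pdf(x, mean, std):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def p_x(x, p_blue, pdf_blue, pdf_red):
    """Marginal feature density p(x) = p * p(x|blue) + (1-p) * p(x|red)."""
    return p_blue * pdf_blue(x) + (1 - p_blue) * pdf_red(x)

# Hypothetical class-conditional densities.
pdf_blue = lambda x: normal_pdf(x, -1.0, 1.0)
pdf_red  = lambda x: normal_pdf(x,  1.0, 1.0)

# With p(y=blue) = 0.3, evaluate p(x) on a grid over the feature range (-4, 4).
xs = [i * 0.1 for i in range(-40, 41)]
density = [p_x(x, 0.3, pdf_blue, pdf_red) for x in xs]
```

Whatever weight p you pick, the mixture is still a valid density: it integrates to 1 because each component does.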

You can prove this using Bayes' theorem:

p(x) * p(y=blue | x) = p(y=blue) * p(x | y=blue)
p(x) * p(y=red | x)  = p(y=red)  * p(x | y=red)

Adding the two lines together,

p(x) * [p(y=blue | x) + p(y=red | x)] = p(y=blue) * p(x | y=blue) + p(y=red) * p(x | y=red) 

Since y must be either red or blue, p(y=blue | x) + p(y=red | x) must equal 1, so the bracketed expression drops out and we get:

p(x) = p(y=blue) * p(x | y=blue) + p(y=red) * p(x | y=red)
p(x) = p * p(x | y=blue) + q * p(x | y=red)
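One way to sanity-check the result numerically: sample (x, y) pairs from the joint distribution (pick y first, then x given y) and compare the empirical probability of a test interval against the same probability computed from the mixture formula. Again this assumes hypothetical Gaussian class-conditionals, since the actual plotted densities aren't given:

```python
import math
import random

random.seed(0)
p_blue = 0.3  # p(y=blue); p(y=red) = 1 - p_blue

def sample_x():
    """Draw x from the joint: first pick the class y, then x | y."""
    if random.random() < p_blue:
        return random.gauss(-1.0, 1.0)   # x | y=blue ~ N(-1, 1)
    return random.gauss(1.0, 1.0)        # x | y=red  ~ N(1, 1)

def mixture_pdf(x):
    """p(x) = p * p(x|blue) + (1-p) * p(x|red)."""
    def npdf(x, mean):
        return math.exp(-0.5 * (x - mean) ** 2) / math.sqrt(2 * math.pi)
    return p_blue * npdf(x, -1.0) + (1 - p_blue) * npdf(x, 1.0)

# Empirical fraction of samples in [-0.5, 0.5] vs. the integral of the mixture.
samples = [sample_x() for _ in range(100_000)]
frac = sum(-0.5 <= s <= 0.5 for s in samples) / len(samples)
integral = sum(mixture_pdf(-0.5 + i * 0.01) * 0.01 for i in range(100))
```

The two numbers should agree to within sampling noise, which is exactly the statement that marginalizing the joint over y gives the linear mixture derived above.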
