Naive Bayes Classifier
Say our data has $d$ attributes/features, and that these attributes are binary. Also, we have a finite set of classes $C$. To classify an example $X = (x_1, x_2, \ldots, x_d)$, we pick the class $c \in C$ that maximizes the posterior $P(c \mid X)$; by Bayes' theorem, $P(c \mid X) \propto P(X \mid c)\,P(c)$.
The main issue here is that we do not know $P(X \mid c)$ in advance, and it is difficult, if not impossible, to model the joint distribution of all $d$ attributes. Therefore, we use the simplifying assumption that the attributes are conditionally independent, i.e., the attributes are independent of one another given the class. This assumption is also called class-conditional independence.
So,

$$P(X \mid c) = P(x_1, x_2, \ldots, x_d \mid c) = \prod_{i=1}^{d} P(x_i \mid c)$$
Therefore, we calculate the predicted class $\hat{c}$ as:

$$\hat{c} = \arg\max_{c \in C} \; P(c) \prod_{i=1}^{d} P(x_i \mid c)$$
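As a concrete illustration, here is a minimal Python sketch of this decision rule, assuming the priors $P(c)$ and the conditional probabilities $P(x_i \mid c)$ have already been estimated from data; the names `predict`, `priors`, and `cond_prob` are illustrative, not from any particular library.

```python
from math import prod

def predict(x, priors, cond_prob):
    """Return the class c maximizing P(c) * prod_i P(x_i | c).

    x         -- tuple of d binary attribute values, e.g. (1, 0)
    priors    -- dict mapping class c to P(c)
    cond_prob -- dict mapping (i, value, c) to P(x_i = value | c)
    """
    best_class, best_score = None, -1.0
    for c, p_c in priors.items():
        # Class-conditional independence: P(X | c) factors into a
        # product over the d individual attributes.
        score = p_c * prod(cond_prob[(i, v, c)] for i, v in enumerate(x))
        if score > best_score:
            best_class, best_score = c, score
    return best_class

# Toy example with d = 2 binary attributes and classes {"spam", "ham"}:
priors = {"spam": 0.4, "ham": 0.6}
cond_prob = {
    (0, 1, "spam"): 0.8, (0, 0, "spam"): 0.2,
    (1, 1, "spam"): 0.7, (1, 0, "spam"): 0.3,
    (0, 1, "ham"): 0.1,  (0, 0, "ham"): 0.9,
    (1, 1, "ham"): 0.4,  (1, 0, "ham"): 0.6,
}
print(predict((1, 0), priors, cond_prob))  # spam: 0.096 > ham: 0.036
```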
Since we need to compute a product of probabilities, we must prevent any of the individual probabilities from being 0: a single zero factor would nullify the effect of all the other probabilities.
To do so, we use smoothing.
If $N$ is the total number of examples having a certain class $c$, and $t$ is the number of times an attribute $x_i$ takes a given value among those examples, then the non-smoothed estimate is given by:

$$P(x_i \mid c) = \frac{t}{N}$$

If $t = 0$, this probability becomes 0!
So, we perform add-$m$ smoothing:

$$P(x_i \mid c) = \frac{t + m}{N + ms}$$
(where $s$ is the number of possible values for the attribute $x_i$; here $s = 2$, since the attributes are binary). Sometimes, we use $m = 1$; this is called add-1 (Laplace) smoothing. In most cases, we limit $m$ to the range $(0, 1]$.
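The smoothed estimate is simple to compute directly from the formula above; the following sketch assumes binary attributes (so $s = 2$), and the helper name `smoothed_estimate` is hypothetical:

```python
def smoothed_estimate(t, N, s, m=1.0):
    """Add-m smoothed estimate (t + m) / (N + m*s) of P(x_i | c).

    t -- number of times the attribute takes this value within class c
    N -- total number of examples of class c
    s -- number of possible values for the attribute (2 for binary)
    m -- smoothing parameter, usually limited to the range (0, 1]
    """
    return (t + m) / (N + m * s)

# Even when a value never occurs with the class (t = 0), the smoothed
# probability stays strictly positive instead of zeroing the product:
print(smoothed_estimate(t=0, N=10, s=2))  # (0 + 1) / (10 + 2) = 0.0833...
```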