Good morning! Yesterday, we talked about the K-Nearest Neighbor algorithm, and today we’ll proceed to another popular business analytics tool: the Naïve Bayes classifier. Naïve Bayes, like KNN, is a simple classification method, so the two are easy to confuse; however, Naïve Bayes is a little more mathematically involved than KNN. Let’s discuss its main features and its most common applications in the business world.
What Is Naïve Bayes?
Naïve Bayes, named after Reverend Thomas Bayes (1702–1761), is based on Bayes’ theorem, which uses conditional probability to classify a new observation. Naïve Bayes estimates the probability that a new observation belongs to each possible category and assigns it to the most likely one.
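In symbols, Bayes’ theorem says that for two events A and B:

```
P(A | B) = P(B | A) × P(A) / P(B)
```

In the worked example below, A will be “a friend buys a new pet” and B will be “a friend already has a pet.”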
Let’s define conditional probability. Conditional probability is the probability that one event occurs, given that another event has already occurred. For example, let’s assume that we have a friend, Bob, who is deciding whether to buy a new pet. We know that Bob already has a pet, and we can estimate the probability of Bob buying a new pet from the decisions of our other friends. Below is the table that contains data on our other friends’ decisions regarding buying a new pet.
| Friend’s name | Already has a pet? | Bought a new pet? |
|---------------|--------------------|-------------------|
| Friend 1      | Yes                | Yes               |
| Friend 2      | Yes                | Yes               |
| Friend 3      | Yes                | Yes               |
| Friend 4      | No                 | Yes               |
| Friend 5      | Yes                | No                |
| Friend 6      | No                 | No                |
| Friend 7      | No                 | No                |
First, let’s determine how many of our friends have bought a new pet recently and calculate the probability of buying and not buying a new pet. That would be: P(bought a pet = YES) = 4/7 and P(bought a pet = NO) = 3/7.
Then, let’s proceed to calculating the probability of already having a pet, given that a friend has recently bought a new pet. That would be: P(has a pet = YES | bought a new pet = YES) = 3/4. The same applies when we calculate the probability of not having a pet: P(has a pet = NO | bought a new pet = YES) = 1/4.
In a similar way, we can calculate the probability of having/not having a pet, given that a friend has not bought a new pet. That would be: P(has a pet = YES | bought a new pet = NO) = 1/3 and P(has a pet = NO | bought a new pet = NO) = 2/3.
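All of these probabilities can be reproduced by simple counting. Here is a minimal Python sketch; the seven records are hypothetical but consistent with the counts used in the text (three friends who had a pet and bought one, one who had no pet and bought one, one who had a pet and did not buy, and two who had no pet and did not buy):

```python
from fractions import Fraction

# Each record: (already has a pet?, bought a new pet?)
# Hypothetical data matching the counts in the worked example.
friends = [
    ("yes", "yes"), ("yes", "yes"), ("yes", "yes"),  # had a pet, bought one
    ("no",  "yes"),                                  # no pet, bought one
    ("yes", "no"),                                   # had a pet, didn't buy
    ("no",  "no"), ("no",  "no"),                    # no pet, didn't buy
]

def p(event):
    """Probability that a randomly chosen friend satisfies `event`."""
    hits = [f for f in friends if event(f)]
    return Fraction(len(hits), len(friends))

def p_given(event, cond):
    """Conditional probability P(event | cond), counted from the data."""
    pool = [f for f in friends if cond(f)]
    hits = [f for f in pool if event(f)]
    return Fraction(len(hits), len(pool))

print(p(lambda f: f[1] == "yes"))                                 # 4/7
print(p_given(lambda f: f[0] == "yes", lambda f: f[1] == "yes"))  # 3/4
print(p_given(lambda f: f[0] == "yes", lambda f: f[1] == "no"))   # 1/3
```

Using `Fraction` keeps every probability exact, so the results match the hand calculation digit for digit.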
Next, let’s multiply the probability of already having a pet, given that a friend bought a new pet, by the overall probability of buying a new pet: P(has a pet = YES | bought a new pet = YES) * P(bought a pet = YES) = 3/4 * 4/7 = 3/7.
Then, let’s do the same for the “did not buy” outcome: P(has a pet = YES | bought a new pet = NO) * P(bought a pet = NO) = 1/3 * 3/7 = 1/7.
Finally, we need to divide each of the two products by the probability of already having a pet (since Bob already has one): P(has a pet = YES) = 4/7. The final probability of buying a new pet, given that a person already has a pet, is: P(bought a new pet = YES | has a pet = YES) = (3/7) / (4/7) = 3/4. Similarly, the probability of not buying a new pet, given that a person already has one, is: P(bought a new pet = NO | has a pet = YES) = (1/7) / (4/7) = 1/4. As a sanity check, the two probabilities sum to 1.
Three quarters is greater than one quarter. Looks like Bob will buy a new pet.
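The full calculation can be sketched in a few lines of Python with exact fractions. The priors and likelihoods come from the table; note that the two posteriors must sum to 1, which is a useful check on the arithmetic:

```python
from fractions import Fraction

# Priors and likelihoods from the worked example above.
p_buy        = Fraction(4, 7)  # P(bought = YES)
p_not_buy    = Fraction(3, 7)  # P(bought = NO)
p_has_if_buy = Fraction(3, 4)  # P(has a pet = YES | bought = YES)
p_has_if_not = Fraction(1, 3)  # P(has a pet = YES | bought = NO)
p_has        = Fraction(4, 7)  # P(has a pet = YES)

# Bayes' theorem: posterior = likelihood * prior / evidence
posterior_buy     = p_has_if_buy * p_buy / p_has      # 3/4
posterior_not_buy = p_has_if_not * p_not_buy / p_has  # 1/4

print(posterior_buy, posterior_not_buy)        # 3/4 1/4
assert posterior_buy + posterior_not_buy == 1  # posteriors sum to 1
```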
This is the main idea behind Naïve Bayes: You calculate conditional probability to classify new observations.
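With several features, Naïve Bayes simply multiplies one conditional probability per feature, assuming the features are independent within each class; that independence assumption is what makes the method “naïve.” Here is a minimal sketch with hypothetical categorical data (weather and weekend status predicting whether a person goes shopping):

```python
from collections import Counter
from fractions import Fraction

# Hypothetical training data: ((weather, weekend?), went shopping?)
rows = [
    (("sunny", "yes"), "yes"),
    (("sunny", "no"),  "no"),
    (("rainy", "yes"), "yes"),
    (("rainy", "no"),  "no"),
    (("sunny", "yes"), "yes"),
    (("rainy", "no"),  "yes"),
]

def predict(features):
    """Return the class with the highest Naive Bayes score."""
    labels = Counter(label for _, label in rows)
    scores = {}
    for label, count in labels.items():
        score = Fraction(count, len(rows))  # prior P(label)
        for i, value in enumerate(features):
            # Likelihood P(feature_i = value | label), counted from the data;
            # the product of per-feature likelihoods is the "naive" step.
            matching = sum(1 for f, l in rows if l == label and f[i] == value)
            score *= Fraction(matching, count)
        scores[label] = score
    return max(scores, key=scores.get)

print(predict(("sunny", "yes")))  # yes
```

Real implementations usually add a small smoothing term so that a feature value never seen with a class does not force the whole product to zero.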
Most Popular Applications of Naïve Bayes
Here are the most popular examples of the application of Naïve Bayes:
• Fraudulent financial reporting. Accounting and financial firms can predict how trustworthy a customer’s financial report is based on that customer’s previous legal issues.
• Flight delays. Airlines use Naïve Bayes to predict flight delays by using data on flight destination, time, and day of the week.
• Spam filtering. Your email’s spam filter likely uses Naïve Bayes as well. The conditional probability that an email is spam is estimated from the specific words it contains (e.g., “free”).
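The spam-filtering case works the same way as the pet example: estimate, for each word, how often it appears in spam versus legitimate mail, then multiply those per-word probabilities together with the class priors. The sketch below uses a tiny made-up corpus and add-one (Laplace) smoothing so an unseen word does not zero out the product:

```python
from collections import Counter
from fractions import Fraction

# Tiny hypothetical training corpus: (message words, label)
messages = [
    ("free prize click now".split(),    "spam"),
    ("free offer limited time".split(), "spam"),
    ("meeting moved to friday".split(), "ham"),
    ("lunch on friday".split(),         "ham"),
]

def train(messages):
    """Count words per class and messages per class."""
    word_counts = {"spam": Counter(), "ham": Counter()}
    label_counts = Counter()
    for words, label in messages:
        label_counts[label] += 1
        word_counts[label].update(words)
    return word_counts, label_counts

def spam_score(words, word_counts, label_counts):
    """Unnormalized posterior per label, with add-one smoothing."""
    vocab = {w for c in word_counts.values() for w in c}
    scores = {}
    for label in word_counts:
        total = sum(word_counts[label].values())
        score = Fraction(label_counts[label], sum(label_counts.values()))
        for w in words:
            score *= Fraction(word_counts[label][w] + 1, total + len(vocab))
        scores[label] = score
    return scores

word_counts, label_counts = train(messages)
scores = spam_score("free click".split(), word_counts, label_counts)
print(max(scores, key=scores.get))  # spam
```

Production filters train on millions of messages and many more features, but the underlying calculation is exactly this one.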
That’s it for today! Tomorrow, we will talk about association rules in business analytics.