Naive Bayes 5

Naive Bayes Example:

So, let's say we have data on 1000 pieces of fruit. Each fruit is a Banana, an Orange, or some Other fruit, and for each one we know three features: whether it is Long or not, Sweet or not, and Yellow or not, as displayed in the table below:

Fruit      Long    Sweet    Yellow    Total
Banana      400      350       450      500
Orange        0      150       300      300
Other       100      150        50      200
Total       500      650       800     1000

So from the table what do we already know?


▪ 50% of the fruits are bananas
▪ 30% are oranges
▪ 20% are other fruits
Based on our training set we can also say the following:
▪ Of the 500 bananas, 400 (0.8) are Long, 350 (0.7) are Sweet, and 450 (0.9) are Yellow
▪ Of the 300 oranges, 0 are Long, 150 (0.5) are Sweet, and 300 (1.0) are Yellow
▪ Of the remaining 200 fruits, 100 (0.5) are Long, 150 (0.75) are Sweet, and 50 (0.25) are Yellow
This should provide enough evidence to predict the class of a new fruit as it is introduced (a short sketch of these estimates in code follows below).
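As a rough sketch, the priors and per-class likelihoods above can be computed from the counts in a few lines of Python. The dictionary layout and variable names here are just illustrative choices, not part of the original example.

# Counts from the table above: per class, the total number of fruits
# and how many of them are Long, Sweet, and Yellow.
counts = {
    "Banana": {"total": 500, "Long": 400, "Sweet": 350, "Yellow": 450},
    "Orange": {"total": 300, "Long": 0,   "Sweet": 150, "Yellow": 300},
    "Other":  {"total": 200, "Long": 100, "Sweet": 150, "Yellow": 50},
}

n_fruits = sum(c["total"] for c in counts.values())  # 1000

# Prior P(class) and likelihoods P(feature | class) estimated from the counts.
priors = {cls: c["total"] / n_fruits for cls, c in counts.items()}
likelihoods = {
    cls: {f: c[f] / c["total"] for f in ("Long", "Sweet", "Yellow")}
    for cls, c in counts.items()
}

print(priors)                 # {'Banana': 0.5, 'Orange': 0.3, 'Other': 0.2}
print(likelihoods["Banana"])  # {'Long': 0.8, 'Sweet': 0.7, 'Yellow': 0.9}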
So let's say we're given the features of a piece of fruit and we need to predict its class. If we're told that this additional fruit is Long, Sweet, and Yellow, we can classify it using the following formula, substituting in the values for each outcome, whether it's a Banana, an Orange, or Other fruit. The one with the highest probability (score) is the winner.
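Written out, this scoring amounts to the standard Naive Bayes product of the class prior and the feature likelihoods, with the shared denominator P(Long, Sweet, Yellow) dropped since it is the same for every class. Substituting the values from the table gives:

P(Banana | Long, Sweet, Yellow) ∝ P(Long | Banana) × P(Sweet | Banana) × P(Yellow | Banana) × P(Banana)
                                = 0.8 × 0.7 × 0.9 × 0.5 = 0.252
P(Orange | Long, Sweet, Yellow) ∝ 0 × 0.5 × 1.0 × 0.3 = 0
P(Other  | Long, Sweet, Yellow) ∝ 0.5 × 0.75 × 0.25 × 0.2 = 0.01875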
In this case, based on the higher score (0.252 for Banana versus 0.01875 for Other fruit and 0 for Orange), we can conclude that this Long, Sweet and Yellow fruit is, in fact, a Banana.
