
PROBABILITY

The document provides a comprehensive overview of Bayes' Theorem, including its introduction, relation to conditional probability, derivation, and real-world applications such as medical diagnosis and spam filtering. It discusses the advantages and limitations of the theorem, emphasizing its role in updating beliefs based on new evidence. A case study illustrates its application in assessing the probability of depression based on psychological test scores.


BAYES THEOREM

Introducing the team


Members
• Souroja Roy IPL03054
• Annshika Bakshi IPL03011
• Aastha Sharma IPL03003
• Cherrie Gupta IPL03017
• Siya Saurabh IPL03053
Contents
1 Introduction to Bayes' Theorem
2 Relation to conditional probability
3 Derivation of its formula
4 Examples
5 Real world applications
6 Advantages
7 Limitations
8 Case Study
9 Conclusion
INTRODUCTION

Imagine a detective investigating a crime. They have some initial beliefs about the suspect (prior probability) but gather new evidence at the scene (evidence). Bayes' theorem allows them to refine their suspicions based on how likely the evidence is given the suspect (likelihood) and how common the evidence is in general (marginal probability).
THOMAS BAYES

The man credited with discovering Bayes' theorem is Reverend Thomas Bayes, an English Presbyterian minister and mathematician who lived from 1702 to 1761. However, it's important to note two things:
• His work, including Bayes' theorem, was published posthumously in 1763 by his friend Richard Price.
• Some scholars point to Pierre-Simon Laplace as having independently rediscovered and further developed the theorem later in the same century.
So, while Thomas Bayes gets the official credit, the development of this important theorem involved contributions from multiple minds.
Relation to conditional probability

Bayes' theorem is built upon the foundation of conditional probability, which explores the likelihood of one event occurring given that another event has already happened. For example, given that it is cloudy today, the probability of it raining tomorrow is a conditional probability.
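In symbols, for two generic events A and B (not specific to the slides) with P(A) > 0, conditional probability is usually written as:

P(B|A) = P(A ∩ B) / P(A)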

Bayes' theorem takes this concept further by allowing us to update our belief about the probability of a proposition (hypothesis), taking into account new evidence or data. It essentially answers the question: "How should I adjust my initial belief about something based on what I've just learned?"
Derivation of its formula
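A standard sketch of the derivation, starting from the definition of conditional probability for two events A and B with P(A) > 0 and P(B) > 0:

P(A|B) = P(A ∩ B) / P(B)   and   P(B|A) = P(A ∩ B) / P(A)

Both expressions contain the same joint probability P(A ∩ B), so

P(A ∩ B) = P(B|A) P(A)

Substituting this into the first expression gives Bayes' theorem:

P(A|B) = [P(B|A) P(A)] / P(B)

When the evidence B can arise under several mutually exclusive hypotheses E1, ..., En, the denominator expands by the law of total probability: P(B) = P(B|E1) P(E1) + ... + P(B|En) P(En).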
Examples
Q) A man is known to speak truth 3 out of 4 times. He throws
a die and reports that it is a six. Find the probability that it is
actually a six.
Solution: Let E1 be the event that a six occurs.
Let E2 be the event that a six does not occur.
Then,
P(E1) = 1/6
P(E2) = 5/6
Let A be the event that the man reports that a six occurs.
Therefore, P(A|E1), the probability that the man speaks the truth, is 3/4.
And P(A|E2), the probability that the man does not speak the truth, is 1/4 [1 - (3/4)].

According to the question, we need to find P(E1|A), the probability that a six actually occurred given that the man reported a six.
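Applying Bayes' theorem with these values (expanding the denominator with the law of total probability) completes the example:

P(E1|A) = [P(A|E1) P(E1)] / [P(A|E1) P(E1) + P(A|E2) P(E2)]
        = (3/4 × 1/6) / (3/4 × 1/6 + 1/4 × 5/6)
        = (1/8) / (1/8 + 5/24)
        = 3/8

So the probability that the die actually shows a six, given that the man reports a six, is 3/8.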
REAL WORLD APPLICATIONS

Bayes' theorem is a fundamental concept in probability theory and statistics that relates conditional probabilities. It has numerous applications across various fields. Here are some notable applications of Bayes' theorem:

• Medical Diagnosis
• Spam Filtering (see the sketch below)
• Fraud Detection
• Search Engines
• Weather Forecasting
• Machine Learning
• Fault Diagnosis
• Stock Market Analysis
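As one illustration of how these applications use the theorem, here is a minimal Python sketch of Bayes' theorem applied to spam filtering. All of the probabilities below (the spam prevalence and the word frequencies) are assumed, illustrative values, not real statistics.

# A minimal sketch of Bayes' theorem applied to spam filtering.
# All numbers below are assumed, illustrative values, not real statistics.

def posterior(prior, likelihood, evidence):
    """P(hypothesis | evidence) = P(evidence | hypothesis) * P(hypothesis) / P(evidence)."""
    return likelihood * prior / evidence

p_spam = 0.20             # assumed prior: 20% of all mail is spam
p_word_given_spam = 0.60  # assumed: the word "free" appears in 60% of spam
p_word_given_ham = 0.05   # assumed: the word "free" appears in 5% of legitimate mail

# Marginal probability of seeing the word at all (law of total probability)
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

p_spam_given_word = posterior(p_spam, p_word_given_spam, p_word)
print(f"P(spam | message contains 'free') = {p_spam_given_word:.2f}")  # prints 0.75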


Advantages
1. Provides a flexible framework for updating beliefs or hypotheses based on new evidence.
2. Provides a rational framework for decision making under uncertainty.
3. Is well-suited for personalized inference, where individual characteristics or previous experiences can be incorporated into the analysis.
4. In fields such as machine learning and statistics, Bayes' theorem is used for model updating.
5. Bayes' theorem is widely used for prediction and forecasting tasks.
Limitations

• Bayesian results can be greatly affected by the initial guesses we make about the probabilities (a short sketch of this follows the list).
• In situations where reliable prior information is lacking or uncertain, Bayesian inference may produce biased or misleading results.
• One of the key criticisms of Bayes' theorem is the subjectivity involved in specifying prior distributions.
• For two centuries, the theorem remained unused due to the extensive computational resources needed to carry out its calculations.
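A small Python sketch of the first limitation: the likelihoods are held fixed and only the assumed prior changes, yet the posterior moves dramatically. All numbers are purely illustrative.

# Sketch: how sensitive the posterior is to the assumed prior.
# The likelihoods are held fixed; only the prior changes (illustrative numbers).

def posterior(prior, p_e_given_h, p_e_given_not_h):
    # P(H | E) via Bayes' theorem, expanding P(E) with the law of total probability
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

for prior in (0.01, 0.10, 0.50):
    print(f"prior = {prior:.2f} -> posterior = {posterior(prior, 0.9, 0.1):.2f}")

# prior = 0.01 -> posterior = 0.08
# prior = 0.10 -> posterior = 0.50
# prior = 0.50 -> posterior = 0.90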
CASE STUDY
In this application, Bayes' Theorem is used to assess the posterior probability that a person has depression or does not have it. It uses as a basis the prior probability derived from the prevalence of this pathology, together with information on the sensitivity and specificity of the scores of psychological tests taken by this person.

Let us assume that a psychological test is used to diagnose the presence of depression in individuals. In this case, the probability that a person is really depressed given that her/his score exceeds the cut-off value is not equivalent to the probability of scoring above the cut-off when she/he is really depressed. When carrying out a diagnosis and evaluating the actual risk of error, one must distinguish between the conditional probability of an individual being depressed when she/he exceeds the cut-off value, and the conditional probability of exceeding the cut-off when she/he is depressed.

So if A is the event that an individual is depressed and B the event that the test score exceeds the cut-off, then using Bayes' theorem:

P(A|B) = [P(B|A) P(A)] / P(B)   (Equation 1)

P(B|A) is the conditional probability of surpassing the cut-off when the individual is really depressed, whereas P(A) relates to the percentage of people who are actually depressed in the population, and P(B) is the probability of getting a test score greater than the cut-off.


That is to say, if the percentage of depressed people in the USA population is 6.7% (source: https://www.nimh.nih.gov/health/statistics/major-depression.shtml), then P(A) = 0.067. P(B|A), the conditional probability that an individual surpasses the cut-off if he/she is actually depressed, can be estimated from the test scores. Assuming P(B|A) = 0.8 and P(B) = 0.10 (taking into account that P(A) = 0.067), the conditional probability that an individual is really depressed if he/she exceeds the cut-off is:

P(A|B) = [P(B|A) P(A)] / P(B) = (0.8 × 0.067) / 0.10 ≈ 0.54

That is, the probability that a person who exceeds the cut-off is really depressed is only about 54%.
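A minimal Python sketch reproducing the case-study arithmetic, using only the values quoted above (P(A) = 0.067, P(B|A) = 0.8, P(B) = 0.10):

# Reproduce the case-study calculation with Bayes' theorem.
p_A = 0.067        # P(A): prevalence of depression in the population
p_B_given_A = 0.8  # P(B|A): probability of exceeding the cut-off when depressed
p_B = 0.10         # P(B): overall probability of a score above the cut-off

p_A_given_B = p_B_given_A * p_A / p_B
print(f"P(depressed | above cut-off) = {p_A_given_B:.2f}")  # prints 0.54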
Conclusion
Bayes' Theorem provides a framework for rigorous and evidence-based reasoning, allowing for incremental updates of beliefs. It is a valuable tool for tackling complex problems, promoting logical reasoning and critical thinking in decision-making processes. Understanding Bayes' Theorem fosters a continuous learning mindset, emphasizing the importance of incorporating new information in decision making.
