VISVESVARAYA TECHNOLOGICAL UNIVERSITY
JNANA SANGAMA, BELAGAVI – 590 018
Report on
Probability Theory
Submitted in partial fulfilment of the requirements for the award of degree of
Bachelor of Engineering
BIO-MEDICAL ENGINEERING
Submitted by
Bharat Yogesh Barhanpurkar
(1RG22BM400)
Guided by
Dr. Manjunath G
Associate Professor
Department of Mathematics, RGIT
Rajiv Gandhi Institute of Technology
Cholanagar, R.T. Nagar Post, Bengaluru-560032
2023-2024
RAJIV GANDHI INSTITUTE OF TECHNOLOGY
(Affiliated to Visvesvaraya Technological University, Belgaum)
Cholanagar, R. T. Nagar Post, Bengaluru-560032
DEPARTMENT OF BIOMEDICAL ENGINEERING
CERTIFICATE
This is to certify that the report titled “Statistical Methods” has been
successfully carried out by Bharat Yogesh Barhanpurkar (1RG22BM400),
Rajiv Gandhi Institute of Technology, in partial fulfilment for the award of the
Degree of Bachelor of Engineering in Biomedical Engineering under
Visvesvaraya Technological University, Belagavi, during the year 2022 to
2023. It is certified that all corrections/suggestions given for Internal
Assessment have been incorporated in the report. This report has been
approved as it satisfies the academic requirements in respect of project work
(21MAT41) prescribed for the said degree.
______________          ______________          ______________
Dr. Manjunath G         Mrs. Jananee J          Dr. D. G. Anand
Associate Professor,    Head of Department,     Principal,
Department of BME,      Department of BME,      RGIT, Bengaluru.
RGIT, Bengaluru.        RGIT, Bengaluru.
DEPARTMENT OF BIOMEDICAL ENGINEERING
DECLARATION CERTIFICATE
I, Bharat Yogesh Barhanpurkar (1RG22BM400), studying in the third
semester of Bachelor of Engineering in Biomedical Engineering at Rajiv
Gandhi Institute of Technology, Bengaluru, hereby declare that my internship
work titled “Biomedical Engineer Intern”, which is being submitted in
partial fulfilment for the award of the Degree of Bachelor of Engineering in
Biomedical Engineering from Visvesvaraya Technological University,
Belagavi, is an authentic record of work carried out by me during the academic
year 2022-2023, under the guidance of Dr. Manjunath G, Associate Professor,
Department of Mathematics, Rajiv Gandhi Institute of Technology, Bengaluru.
I further undertake that the matter embodied in the dissertation has not been
submitted previously for the award of any degree or diploma by me to any
other university or institution.
ACKNOWLEDGEMENT
I would like to express my heartfelt gratitude to the following individuals for
their contributions and support during my Report on “Statistical Methods”.
I would like to express my gratitude to VTU, Belagavi for having the
internship project as a part of its curriculum, which gives us a wonderful
opportunity to review a current research topic, and to Rajiv Gandhi Institute of
Technology for providing me with the facilities, without which this project
could not have been successful.
I would like to express my sincere gratitude to Dr. D. G. Anand, Principal of
RGIT, and Dr. Manjunath G, Department of Mathematics, RGIT, for providing
me this opportunity.
At the outset, I would like to make a special mention of Dr. Manjunath G,
Report Guide, Bio-Medical Engineering, Rajiv Gandhi Institute of Technology,
for guiding me technically to be more competent and for his continuous
support, advice and guidance.
I would also like to extend my sincere thanks to Dr. Manjunath G for
providing valuable guidance and insights into the role of a biomedical engineer.
I am grateful to each and every one of them for their unwavering support and
guidance throughout my internship period. Their contributions have been
pivotal in shaping my learning experience, and I am grateful for their valuable
insights.
Introduction to Probability theory
We often hear such statements: “It is likely to rain today”, “I have a fair chance
of getting admission”, and “There is an even chance that in tossing a coin the
head may come up”. In each case, we are not certain of the outcome, but we
wish to assess the chances of our predictions coming true. The study of
probability provides a mathematical framework for such assertions and is
essential in every decision-making process. Before defining probability, let us
explain a few terms:
Principle of counting. If an event can happen in n1 ways, and thereafter for
each of these a second event can happen in n2 ways, and for each of these first
and second events a third event can happen in n3 ways, and so on, then the
number of ways in which all m events can happen together is given by the
product n1 · n2 · n3 ... nm.
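As an illustration, a short Python sketch with hypothetical counts (3, 4 and 2 choices at each stage, not from the text) confirms the multiplication principle by brute-force enumeration:

```python
from itertools import product

# Multiplication principle: with 3, 4 and 2 choices at three successive
# stages, the total number of combined outcomes is 3 * 4 * 2 = 24.
choices = [3, 4, 2]  # hypothetical counts n1, n2, n3
total = 1
for n in choices:
    total *= n

# Cross-check by explicitly enumerating every combined outcome.
enumerated = len(list(product(range(3), range(4), range(2))))
assert total == enumerated == 24
```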
Permutations. A permutation of a number of objects is their arrangement in
some definite order. Given three letters a, b, c, we can permute them two at a
time as ab, ba, bc, cb, ca, ac, yielding 6 permutations. The combinations or
groupings are only 3, i.e., ab, bc, ca. Here the order is immaterial.
The number of permutations of n different things taken r at a time is
n(n − 1)(n − 2) ... (n − r + 1), which is denoted by nPr.
Thus nPr = n(n − 1)(n − 2) ... (n − r + 1) = n!/(n − r)!.
Permutations with repetitions. The number of permutations of n objects of
which n1 are alike of one kind, n2 are alike of a second kind, ..., nk are alike of
a k-th kind is n!/(n1! n2! ... nk!).
Combinations. The number of combinations of n different objects taken r at a
time is denoted by nCr. If we take any one of the combinations, its r objects
can be arranged among themselves in r! ways. So, the total number of
arrangements which can be obtained from all the combinations is
nPr = nCr × r!, i.e., nCr = nPr / r! = n!/(r!(n − r)!).
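These counts can be checked with Python's standard math module, using the letters a, b, c taken two at a time as in the example above:

```python
from math import comb, factorial, perm

n, r = 3, 2
# nPr = n(n-1)...(n-r+1); for three letters taken two at a time: 6
assert perm(n, r) == 6
# There are only nCr = 3 combinations (ab, bc, ca)...
assert comb(n, r) == 3
# ...and each combination can be ordered in r! ways, so nPr = nCr * r!
assert perm(n, r) == comb(n, r) * factorial(r)
```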
Basic terminology
(i) Exhaustive events. A set of events is said to be exhaustive if it includes all
the possible events. For example, in tossing a coin there are two exhaustive
cases either head or tail and there is no third possibility.
(ii) Mutually exclusive events. If the occurrence of one of the events precludes
the occurrence of all the others,
then such a set of events is said to be mutually exclusive. Just as tossing a coin,
either head comes up or the tail and both can’t happen at the same time, i.e.,
these are two mutually exclusive cases.
(iii) Equally likely events. If one of the events cannot be expected to happen in
preference to another then
such events are said to be equally likely. For instance, in tossing a coin, the
coming of the head or the tail is equally likely.
Thus, when a die is thrown, the turning up of the six different faces of the die
is exhaustive, mutually exclusive and equally likely.
(iv) Odds in favour of an event. If the number of ways favourable to an event
A is m and the number of ways not favourable to A is n, then odds in favour of
A = m/n and odds against A = n/m.
Definition of probability. If there are n exhaustive, mutually exclusive and
equally likely cases of which m are favourable to an event A, then the
probability (p) of the happening of A is
P(A) = m/n.
As there are n − m cases in which A will not happen (denoted by A'), the
chance of A not happening is q = P(A') = (n − m)/n = 1 − m/n = 1 − P(A), so
that P(A) + P(A') = 1. If an event is certain to happen then its probability is
unity, while if it is certain not to happen, its probability is zero.
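A minimal sketch of the classical definition, using a fair die with the hypothetical event A = "an even face turns up":

```python
from fractions import Fraction

# Classical definition: P(A) = m/n for a fair die.
n = 6                 # exhaustive, mutually exclusive, equally likely cases
m = 3                 # favourable cases for A = "even face": 2, 4, 6
p = Fraction(m, n)    # probability of A happening
q = 1 - p             # probability of A', the non-happening of A

assert p == Fraction(1, 2)
assert p + q == 1     # P(A) + P(A') = 1
```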
Statistical (or Empirical) definition of probability. If in n trials an event A
happens m times, then the probability (p) of the happening of A is given by
p = lim (m/n) as n → ∞.
Probability and set notation
(1) Random experiment. Experiments which are performed essentially under
the same conditions and whose results cannot be predicted are known as
random experiments, e.g., tossing a coin or rolling a die.
(2) Sample space. The set of all possible outcomes of a random experiment is
called sample space for that experiment and is denoted by S.
The elements of the sample space S are called sample points, e.g., on tossing
a coin, the possible outcomes are the head (H) and the tail (T). Thus
S = {H, T}.
(3) Event. The outcome of a random experiment is called an event. Thus every
subset of a sample space S is an event.
The null set φ is also an event and is called an impossible event. The
probability of an impossible event is zero, i.e., P(φ) = 0.
(4) Axioms
(i) The numerical value of probability lies between 0 and 1, i.e., for any event
A of S, 0 ≤ P(A) ≤ 1.
(ii) The sum of the probabilities of all sample events is unity, i.e., P(S) = 1.
(iii) The probability of an event made up of two or more sample events is the
sum of their probabilities.
(5) Notations
(i) The probability of the happening of event A or B is written as P(A + B) or
P(A ∪ B).
(ii) The probability of the happening of both the events A and B is written as
P(AB) or P(A ∩ B).
(iii) 'Event A implies (⇒) event B' is expressed as A ⊂ B.
(iv) 'Events A and B are mutually exclusive' is expressed as A ∩ B = φ.
(6) For any two events A and B,
P(A ∩ B') = P(A) − P(A ∩ B).
Proof. From Fig. 26.1, A is the union of the disjoint events A ∩ B' and
A ∩ B, so
P[(A ∩ B') ∪ (A ∩ B)] = P(A), or P(A ∩ B') + P(A ∩ B) = P(A),
i.e., P(A ∩ B') = P(A) − P(A ∩ B).
Similarly, P(A' ∩ B) = P(B) − P(A ∩ B).
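This identity can be verified by direct enumeration; the events below (an even face and a face greater than 3 on one die) are hypothetical illustrations, not from the text:

```python
from fractions import Fraction

# Sample space: one roll of a fair die.
S = {1, 2, 3, 4, 5, 6}
A = {x for x in S if x % 2 == 0}   # even face
B = {x for x in S if x > 3}        # face greater than 3

def P(E):
    """Classical probability: favourable cases over total cases."""
    return Fraction(len(E), len(S))

# P(A ∩ B') = P(A) − P(A ∩ B), and symmetrically for A' ∩ B.
assert P(A - B) == P(A) - P(A & B)
assert P(B - A) == P(B) - P(A & B)
```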
(1) If the probability of an event A happening as a result of a trial is P(A) and
the probability of a mutually exclusive event B happening is P(B), then the
probability of either of the events happening as a result of the trial is
P(A + B) or P(A ∪ B) = P(A) + P(B).
Proof. Let n be the total number of equally likely cases, and let m1 be
favourable to the event A and m2 be favourable to the event B. Then the
number of cases favourable to A or B is m1 + m2. Hence the probability of A
or B happening as a result of the trial is
p = (m1 + m2)/n = m1/n + m2/n = P(A) + P(B).
(2) If A, B are any two events (not mutually exclusive), then
P(A + B) = P(A) + P(B) − P(AB), or P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
Proof. If A and B are any two events, there are some outcomes which favour
both A and B. If m3 is their number, then these are included in both m1 and
m2. Hence the total number of outcomes favouring either A or B or both is
m1 + m2 − m3.
Thus the probability of the occurrence of A or B or both is
p = (m1 + m2 − m3)/n = m1/n + m2/n − m3/n.
Hence P(A + B) = P(A) + P(B) − P(AB),
or P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
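A quick enumeration check of the addition law on a hypothetical pair of events for two dice (A = "the first die shows 6", B = "the sum is 7"):

```python
from fractions import Fraction
from itertools import product

# Sample space: 36 equally likely outcomes of rolling two dice.
S = set(product(range(1, 7), repeat=2))
A = {s for s in S if s[0] == 6}       # first die shows 6
B = {s for s in S if sum(s) == 7}     # the sum is 7

def P(E):
    return Fraction(len(E), len(S))

# General addition law: P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
# Here A and B share one outcome, (6, 1), so the union has 11 outcomes.
assert P(A | B) == P(A) + P(B) - P(A & B) == Fraction(11, 36)
```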
(3) If A, B, C are any three events, then
P(A + B + C) = P(A) + P(B) + P(C) − P(AB) − P(BC) − P(CA) + P(ABC).
Proof. Using the above result for any two events, we have
P(A ∪ B ∪ C) = P(A ∪ B) + P(C) − P[(A ∪ B) ∩ C]
= [P(A) + P(B) − P(A ∩ B)] + P(C) − P[(A ∩ C) ∪ (B ∩ C)]  (Distributive law)
= P(A) + P(B) + P(C) − P(A ∩ B) − [P(A ∩ C) + P(B ∩ C) − P(A ∩ B ∩ C)]
= P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C).
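The three-event formula can likewise be verified by enumeration; the divisibility events below (a number from 1 to 20 drawn at random) are an illustrative assumption:

```python
from fractions import Fraction

S = set(range(1, 21))                 # a number drawn at random from 1..20
A = {x for x in S if x % 2 == 0}      # divisible by 2
B = {x for x in S if x % 3 == 0}      # divisible by 3
C = {x for x in S if x % 5 == 0}      # divisible by 5

def P(E):
    return Fraction(len(E), len(S))

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(B & C) - P(C & A)
       + P(A & B & C))
assert lhs == rhs == Fraction(7, 10)
```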
Independent events
Two events are said to be independent if the happening or failure of one does
not affect the happening or failure of the other. Otherwise the events are said
to be dependent.
For two dependent events A and B, the symbol P(B/A) denotes the probability
of the occurrence of B when A has already occurred. It is known as the
conditional probability and is read as the 'probability of B given A'.
Multiplication law of probability or Theorem of compound probability. If the
probability of an event A happening as a result of a trial is P(A), and after A
has happened the probability of an event B happening as a result of another
trial (i.e., the conditional probability of B given A) is P(B/A), then the
probability of both the events A and B happening as a result of the two trials
is
P(AB) or P(A ∩ B) = P(A) · P(B/A).
Proof. Let n be the total number of outcomes in the first trial and m be
favourable to the event A, so that P(A) = m/n.
Let n1 be the total number of outcomes in the second trial, of which m1 are
favourable to the event B, so that P(B/A) = m1/n1.
Now each of the n outcomes can be associated with each of the n1 outcomes,
so the total number of outcomes in the combined trial is nn1. Of these, mm1
are favourable to both the events A and B. Hence
P(AB) or P(A ∩ B) = mm1/(nn1) = (m/n)(m1/n1) = P(A) · P(B/A).
Similarly, with the conditional probability of A given B denoted by P(A/B),
P(AB) or P(A ∩ B) = P(B) · P(A/B).
Thus P(AB) = P(A) · P(B/A) = P(B) · P(A/B).
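A small worked instance of the compound-probability theorem: drawing two aces in succession, without replacement, from a 52-card pack (a standard illustrative example, not from the text):

```python
from fractions import Fraction

# A = "first card is an ace": 4 aces among 52 cards.
P_A = Fraction(4, 52)
# B given A = "second card is an ace, given the first was": 3 aces remain
# among the 51 cards left.
P_B_given_A = Fraction(3, 51)

# Multiplication law: P(A ∩ B) = P(A) · P(B/A).
P_both = P_A * P_B_given_A
assert P_both == Fraction(1, 221)
```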
If the events A and B are independent, i.e., if the happening of B does not
depend on whether A has happened or not, then P(B/A) = P(B) and
P(A/B) = P(A), so that
P(AB) or P(A ∩ B) = P(A) · P(B).
In general, P(A1 A2 ... An) or P(A1 ∩ A2 ∩ ... ∩ An) = P(A1) · P(A2) ... P(An).
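For independent events the rule can again be checked by enumeration; the events below (hypothetical, on separate dice) are independent by construction:

```python
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))   # two dice, 36 outcomes
A = {s for s in S if s[0] % 2 == 0}       # first die even
B = {s for s in S if s[1] % 2 == 0}       # second die even

def P(E):
    return Fraction(len(E), len(S))

# For independent events: P(A ∩ B) = P(A) · P(B).
assert P(A & B) == P(A) * P(B) == Fraction(1, 4)
```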
Regression
Regression is the estimation of one variable (the dependent variable) in
terms of the other. If x and y are correlated, the best-fitting straight line in
the least-squares sense gives a reasonably good relation between x and y.
The best-fitting straight line of the form y = ax + b (x being the
independent variable) is called the regression line of y on x, and x = ay + b
(y being the independent variable) is called the regression line of x on y.
Formulas for line of regression
Let y = ax + b be the equation of the regression line of y on x for a
given set of n values (x,y).
Then y − 𝑦̅ = r (𝜎𝑦/𝜎𝑥)(x − 𝑥̅) …………(1)
This is the regression of y on x.
Similarly, x − 𝑥̅ = r (𝜎𝑥/𝜎𝑦)(y − 𝑦̅) …………(2)
This is the regression line of x on y.
The coefficient of x in (1) and the coefficient of y in (2), given by r 𝜎𝑦/𝜎𝑥
and r 𝜎𝑥/𝜎𝑦 respectively, are known as the regression coefficients.
Their product is equal to 𝑟 2.
Thus, we can conclude that r is the geometric mean (GM) of the
regression coefficients since the GM of two numbers a, b is √𝑎𝑏.
That is r = ±√ (𝑐𝑜𝑒𝑓𝑓. 𝑜𝑓 𝑥)(𝑐𝑜𝑒𝑓𝑓. 𝑜𝑓 𝑦)
The sign of r will be positive or negative according as the regression
coefficients are positive or negative.
This form will be useful to find the coefficient of correlation by first
obtaining the lines of regression, since we have deduced that
r = ±√ (𝑐𝑜𝑒𝑓𝑓. 𝑜𝑓 𝑥)(𝑐𝑜𝑒𝑓𝑓. 𝑜𝑓 𝑦)
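A numerical sketch, using hypothetical paired data, of how the two regression coefficients recover r as their geometric mean:

```python
import math

# Hypothetical paired observations (illustrative only).
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Sums of squared deviations and cross-deviations.
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
sxx = sum((x - mx) ** 2 for x in xs)
syy = sum((y - my) ** 2 for y in ys)

r = sxy / math.sqrt(sxx * syy)          # correlation coefficient
b_yx = r * math.sqrt(syy / sxx)         # coefficient of x in (1): r*σy/σx
b_xy = r * math.sqrt(sxx / syy)         # coefficient of y in (2): r*σx/σy

# Their product is r², so r is the GM of the regression coefficients,
# taking the sign of the coefficients themselves.
assert abs(b_yx * b_xy - r ** 2) < 1e-12
assert abs(r - math.copysign(math.sqrt(b_yx * b_xy), b_yx)) < 1e-12
```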
If a real variable X is associated with the outcome of a random experiment,
then, since the values which X takes depend on chance, it is called a random
variable or a stochastic variable or simply a variate. For instance, if a random
experiment E consists of tossing a pair of dice, the sum X of the two numbers
which turn up has the values 2, 3, 4, ..., 12, depending on chance. Then X is a
random variable. It is a function whose values are real numbers and depend
on chance.
If in a random experiment the event corresponding to a number a occurs, then
the corresponding random variable X is said to assume the value a, and the
probability of the event is denoted by P(X = a). Similarly, the probability of
the event X assuming any value in the interval a < X < b is denoted by
P(a < X < b). The probability of the event X ≤ c is written as P(X ≤ c).
If a random variable takes a finite set of values, it is called a discrete variate. On the
other hand, if it assumes an infinite number of uncountable values, it is called a
continuous variate.
Suppose a discrete variate X is the outcome of some experiment. If the
probability that X takes the value xi is pi, then
P(X = xi) = pi for i = 1, 2, ..., where pi ≥ 0 for all values of i and
p1 + p2 + ... = 1.
The set of values xi with their probabilities pi constitutes a discrete
probability distribution of the discrete variate X.
For example, the discrete probability distribution for X, the sum of the
numbers which turn up on tossing a pair of dice, is given by the following
table:

x:     2     3     4     5     6     7     8     9     10    11    12
p(x): 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

There are 6 × 6 = 36 equally likely outcomes and therefore each has the
probability 1/36. We have X = 2 for one outcome, i.e., (1, 1); X = 3 for two
outcomes, (1, 2) and (2, 1); X = 4 for three outcomes, (1, 3), (2, 2) and
(3, 1); and so on.
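This distribution can be generated programmatically; the sketch below rebuilds the probabilities of the two-dice sum by counting outcomes:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Count, for each possible sum X, how many of the 36 outcomes give it.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
dist = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

assert dist[2] == Fraction(1, 36)    # only (1, 1)
assert dist[3] == Fraction(2, 36)    # (1, 2) and (2, 1)
assert dist[7] == Fraction(6, 36)    # the most likely sum
assert sum(dist.values()) == 1       # total probability is unity
```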
(2) Distribution function. The distribution function F(x) of the discrete
variate X is defined by
F(x) = P(X ≤ x) = Σ pi (summed over all i with xi ≤ x), where x is any real
number. The graph of F(x) has a stair-step form (Fig. 26.2). The distribution
function is also sometimes called the cumulative distribution function.
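A minimal sketch of this stair-step distribution function for the same two-dice sum:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Outcome counts for X = sum of two dice (as in the table above).
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

def F(x):
    """Cumulative distribution function F(x) = P(X <= x)."""
    return Fraction(sum(c for v, c in counts.items() if v <= x), 36)

assert F(1) == 0                 # below the smallest possible sum
assert F(4) == Fraction(6, 36)   # sums 2, 3, 4 account for 1 + 2 + 3 cases
assert F(12) == 1                # the whole sample space
```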
When a variate X takes every value in an interval, it gives rise to a continuous
distribution of X. The distributions defined by variates like heights or weights
are continuous distributions.
A major conceptual difference, however, exists between discrete and continuous
probabilities. When thinking in discrete terms, the probability associated with an event
is meaningful. With continuous events, however, where the number of events is
infinitely large, the probability that a specific event will occur is practically zero. For
this reason, continuous probability statements must be worded somewhat differently
from discrete ones. Instead of finding the probability that x equals some value,
we find the probability of x falling in a small interval.
Thus, the probability distribution of a continuous variate x is defined by a
function f(x) such that the probability of the variate x falling in the small
interval from x − dx/2 to x + dx/2 is f(x) dx. Symbolically, it can be written as
P(x − dx/2 ≤ x ≤ x + dx/2) = f(x) dx.
Then f(x) is called the probability density function of x.
The range of the variable may be finite or infinite. But even when the range is
finite, it is convenient to consider it as infinite by supposing the density
function to be zero outside the given range. Thus, if f(x) = φ(x) is the density
function for the variate x in the interval (a, b), then it can be written as
f(x) = φ(x) for a ≤ x ≤ b, and f(x) = 0 for x < a or x > b.
The density function f(x) is always positive and ∫ f(x) dx = 1 (taken over
−∞ to ∞), i.e., the total area under the probability curve and the x-axis is
unity, which corresponds to the requirement that the total probability of the
happening of an event is unity.
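A numerical sketch of these requirements for a hypothetical density f(x) = e^(−x) for x ≥ 0 (zero elsewhere); the simple midpoint-rule integrator is an assumption for illustration, not from the text:

```python
from math import exp

def f(x):
    """Hypothetical density: e^(-x) for x >= 0, zero elsewhere."""
    return exp(-x) if x >= 0 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Total area under the curve is (approximately) unity; the tail beyond
# x = 50 is negligible for e^(-x).
assert abs(integrate(f, 0.0, 50.0) - 1.0) < 1e-5

# Probability of x falling in an interval, e.g. P(1 < x < 2) = e^(-1) - e^(-2).
assert abs(integrate(f, 1.0, 2.0) - (exp(-1) - exp(-2))) < 1e-6
```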
(2) Distribution function. If F(x) = P(X ≤ x) = ∫ f(t) dt (taken from −∞ to x),
then F(x) is defined as the cumulative distribution function or simply the
distribution function of the continuous variate X. It is the probability that the
value of the variate X will be ≤ x. Its graph in this case is as shown in
Fig. 26.3.