Decision Making Under Uncertainty

This document discusses decision making under uncertainty. It defines risk and uncertainty, noting that risk can be measured with probabilities while uncertainty involves unknown outcomes. The steps in decision making under uncertainty are: 1) listing the alternatives, 2) listing the possible outcomes, 3) evaluating the chances of each outcome, and 4) deciding preferences among outcomes. Probability is defined mathematically as a value between 0 and 1, and a probability distribution lists all possible outcomes and their probabilities. Expected value is the average value of the outcomes if a decision were repeated many times.

DECISION MAKING UNDER UNCERTAINTY
DIFFERENCE BETWEEN RISK AND UNCERTAINTY

• Meaning: Risk is the probability of winning or losing something of worth; uncertainty implies a situation where future events are not known.
• Ascertainment: Risk can be measured; uncertainty cannot be measured.
• Outcome: Under risk the chances of the outcomes are known; under uncertainty the outcome is unknown.
• Control: Risk is controllable; uncertainty is uncontrollable.
• Minimization: Risk can be minimized; uncertainty cannot.
• Probabilities: Probabilities are assigned under risk but not under uncertainty.
STEPS IN DECISION MAKING UNDER UNCERTAINTY

1. List the available alternatives, not only for direct action but also for gathering information on which to base later action.
2. List the possible outcomes (these will depend on chance events as well as on the decision maker’s later actions).
3. Evaluate the chances that any uncertain outcome will occur.
4. Decide how well the decision maker likes each possible outcome.
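The four steps above can be sketched in code. This is only an illustrative example; the street-vendor scenario, the alternatives, the probabilities, and the payoff numbers are all assumptions invented for the sketch, not taken from the slides, and weighting payoffs by probabilities is one common way to combine steps 3 and 4.

```python
# Step 1: list the available alternatives (hypothetical vendor decision).
alternatives = ["sell umbrellas", "sell sunglasses"]

# Step 2: list the possible outcomes (chance events).
outcomes = ["rain", "sun"]

# Step 3: evaluate the chances that each outcome will occur.
probabilities = {"rain": 0.3, "sun": 0.7}

# Step 4: state how well the decision maker likes each
# alternative/outcome pair (payoffs in arbitrary illustrative units).
payoffs = {
    ("sell umbrellas", "rain"): 100, ("sell umbrellas", "sun"): 20,
    ("sell sunglasses", "rain"): 10, ("sell sunglasses", "sun"): 80,
}

# Weight each payoff by the chance of its outcome and sum,
# then choose the alternative with the highest weighted value.
def weighted_value(alt):
    return sum(probabilities[o] * payoffs[(alt, o)] for o in outcomes)

best = max(alternatives, key=weighted_value)
print(best, weighted_value(best))
```

With these made-up numbers, selling sunglasses scores 0.3·10 + 0.7·80 = 59 against 44 for umbrellas, so the sketch picks sunglasses.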
PROBABILITY

• Described as the mathematical language of uncertainty.
• The odds or chance that an outcome will occur.
• Ranges between 0 and 1 (an event having a probability of 1 is a certainty; an event having a probability of 0 is deemed impossible).
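A minimal sketch of the slide's 0-to-1 scale, where 1 means certainty and 0 means the event is impossible; the function name `describe` is an invention for illustration.

```python
# Classifies a probability value per the 0-to-1 scale described above.
def describe(p):
    if not 0 <= p <= 1:
        raise ValueError("a probability must lie between 0 and 1")
    if p == 1:
        return "certain"
    if p == 0:
        return "impossible"
    return "uncertain"

print(describe(1))     # certain
print(describe(0))     # impossible
print(describe(0.25))  # uncertain
```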
PROBABILITY DISTRIBUTION

• A listing of the possible outcomes of an unknown event and their respective probabilities.
• A table or graph showing all possible outcomes/payoffs for a decision and the probability that each outcome will occur.
• For example, flip 3 coins at the same time and let the random variable X be the number of heads showing.
PROBABILITY DISTRIBUTION

x           0     1     2     3
P(X = x)    1/8   3/8   3/8   1/8

X = random variable
Random variable – a variable whose value is determined by a random experiment.
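The table above can be reproduced by enumerating all eight equally likely results of the three-coin experiment and counting heads; this is a small sketch of that enumeration.

```python
from itertools import product

# Enumerate all 2^3 equally likely results of flipping three fair coins
# and tally the number of heads each result shows.
counts = {x: 0 for x in range(4)}
for flips in product("HT", repeat=3):
    counts[flips.count("H")] += 1

# Divide by the 8 equally likely results to get P(X = x).
distribution = {x: c / 8 for x, c in counts.items()}
print(distribution)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```

The printed probabilities match the table: 1/8, 3/8, 3/8, 1/8.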
EXPECTED VALUE

• The expected value (EV) is an anticipated value for a given investment at some point in the future.
• It does not give the actual value of the random outcome.
• It indicates the “average” value of the outcomes if the risky decision were to be repeated a large number of times.
EXPECTED VALUE

• The expected value is calculated by multiplying each of the possible outcomes by the likelihood that each outcome will occur, and summing all of those values.
• For example, consider a normal six-sided die. Once you roll the die, it has an equal one-sixth chance of landing on one, two, three, four, five, or six. Given this information, the calculation is straightforward:

E(X) = Σ xi * pi = (x1 * p1) + (x2 * p2) + … + (xn * pn)
(where xi = a possible outcome and pi = its probability)

E(X) = (1/6 * 1) + (1/6 * 2) + (1/6 * 3) + (1/6 * 4) + (1/6 * 5) + (1/6 * 6)
     = 3.5

• If you were to roll a six-sided die an infinite number of times, you would see that the average value equals 3.5.
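The die-roll calculation above can be checked directly: weight each face value by its 1/6 probability and sum. Using exact fractions avoids any floating-point rounding in the intermediate terms.

```python
from fractions import Fraction

# E(X) for a fair six-sided die: each face weighted by probability 1/6.
faces = [1, 2, 3, 4, 5, 6]
ev = sum(Fraction(1, 6) * x for x in faces)  # = 21/6 = 7/2
print(float(ev))  # 3.5
```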
