Welcome to the first module of BUSI 581, Statistics for Managers. The topic of this module is "What's the story?", and here we're just going to try to provide the context of statistics for managers. The learning objectives are that, at the end of this module, you should be able to recognize the importance of statistics in management, and also to question the presentation of statistics: what does statistical information actually tell you, and how can you be a little bit skeptical about it as you look at it?
Statistics are a critical resource for decision-making that we use every day in management, so it's very important that you have a critical understanding of what statistical information means and what it doesn't mean. You can't avoid statistics; it's everywhere in life, and particularly in business. Managers need an understanding of statistics in order to interpret the statistical information they seek out or are given. It's not necessarily, or even usually, the manager who actually performs the statistical procedures or collects the data. Usually someone else does that. But managers are then presented with statistical information in some form, charts or graphs or just statistical statements, that they need to understand in order to make decisions. To understand the situation, you need to understand the underlying assumptions, the methods that were used to collect and process the information, and all the practices upon which that information is based. It's important as well for managers to understand what statistics can't tell us, and to be an aware consumer of statistics: to understand where the problems are and where to ask questions. This course will cover basic statistical operations and procedures, but it will also focus on the critical evaluation of statistical information. We will do the math, at a very introductory level, and then we're going to talk about what it means to you as well. Statistics is not always the truth.
If you talk to a statistician, they will tell you that statistics itself is very precise. But what the statistics means is something that can't be easily stated, because there are different interpretations for the same statistical operations. There's a quote by Wieland that I have on the slide: even in the best of circumstances, statistical analysis rarely unveils the truth. So the math itself can be very precise, but then you have to interpret it: what does it mean? And make no mistake, the way statistical information is presented is designed to convey a certain message. When you see statistical information in a chart, in a graph, or in text, it's often already spun to carry a specific message, so it leads you down a certain path of interpretation just by the way it's presented. You can present the same statistical information in many different ways, and that means its presentation and interpretation are imprecise and imperfect. So what we say is: buyer beware. The language and visuals used in the presentation of statistics are critically important, and we're going to talk a lot about that in this course.
The meaning, or truth, of statistical information is highly imprecise. As I said before, the math behind it is very precise, but what the statistics actually means to you, or what you can infer from it, is very subjective. So there are certain things you need to know about statistical information in order to understand what it really means. The first is the characteristics and context of the data: where did it come from? Who does it represent? What is it trying to represent? How, when, where, and why was it collected? This also includes the sources and sampling, that is, how the data was collected and from whom. If you're including people in your sample, or interviewing people, how did you find them? How did you talk to them? How did you get them to agree? And exactly what did you ask them?
The process of summarizing the data is also important, particularly in terms of the choices people make about what to use as a summary and how to analyze the data. You make decisions: how am I going to analyze this data, and what is the best way to summarize it? By the time you've made those decisions, you've already affected the meaning of the information that comes out the other side. It's very subjective. Then there are the inferences we make from the results of the processing: once everything is done and we're presented with this statistical information, what can it tell us, and, as importantly, what can't it tell us? We need to be as aware of what we don't know from this information as of what we do know. The presentation of the results and inferences, the way they are communicated, is very important as well. In this case, the medium is the message. The way a chart is designed, the way the words are formed, all of those things will impact how we can interpret the information.
This leads us to the idea of cognitive biases in statistics. When you look at statistics in the media, well, we're going to go through some examples in the next video, but the idea is that people who know how to present statistics know our cognitive biases, and statistical information is presented to you in a way that best triggers the interpretation they want you to have. So we're going to talk a little bit about the cognitive biases that affect our interpretation of the statistical information we see, because understanding our own biases can help us mitigate their effects. If you know you're going to have a bias toward a certain thing, you can check yourself and say, okay, hang on a second: am I interpreting this right, or am I taking a shortcut in my interpretation because of some bias that I have? So we're going to go over a few cognitive biases that are particularly pertinent to statistics. The first is survivorship bias.
Survivorship bias is about focusing on a group that has made it through a selection process while ignoring the group that failed. For example, you've probably seen studies about successful leaders and what you need to do to be a successful leader. But when you look at who they interviewed, they were looking only at successful leaders. So what you're interpreting from this kind of study is basically: what did the sample of successful leaders have in common? What it leaves out is: what do failed leaders have in common that made them failures? That is just as important as knowing what successful leaders have in common. Another typical example is when we study business strategy. We look at successful companies, because it's practically impossible to study in depth companies that have already failed. So we tend to look only at successful companies, and we get the same bias: we only know about those that were successful, and we don't really understand why the ones that failed, failed.
There's a famous example, which you can find on Google, from World War II, when researchers were studying damage to aircraft to understand where to reinforce them. The thinking was: what can we do to modify the design of aircraft to make them harder to shoot down? So every aircraft that came back damaged was studied to see where the bullet holes were, and those areas were reinforced. The problem was that this did not help make airplanes less likely to be shot down. The reason was that they were only studying aircraft that had received damage that was not fatal; all the aircraft that actually got shot down didn't make it back to be studied. What they were actually doing was reinforcing, and minimizing damage to, areas of airplanes that would probably have survived anyway.
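The aircraft story can be sketched as a small simulation. This is a hypothetical model, not historical data: the section names, hit rates, and fatality rates below are invented purely for illustration. The point it demonstrates is that the damage visible on returning planes systematically under-represents hits to the truly vulnerable areas.

```python
import random

random.seed(0)

# Hypothetical model of the WWII aircraft story (illustrative numbers only).
# Each plane takes one hit in a random section; hits to the engine are
# usually fatal, so those planes rarely return to be studied.
SECTIONS = ["engine", "fuselage", "wings", "tail"]
FATAL_RATE = {"engine": 0.9, "fuselage": 0.1, "wings": 0.1, "tail": 0.2}

returned = []   # damage we actually get to observe
all_hits = []   # damage across the whole fleet (normally unobservable)

for _ in range(10_000):
    hit = random.choice(SECTIONS)
    all_hits.append(hit)
    if random.random() > FATAL_RATE[hit]:  # the plane survives and returns
        returned.append(hit)

def share(hits, section):
    return hits.count(section) / len(hits)

# Engine hits are common across the fleet but rare among returning planes,
# so a study of returners would wrongly conclude engines rarely get hit.
print(f"engine share, all planes:       {share(all_hits, 'engine'):.2f}")
print(f"engine share, returning planes: {share(returned, 'engine'):.2f}")
```

Studying only `returned` is exactly the mistake in the story: the surviving planes make the most dangerous section look like the safest one.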
Another example is the statistical finding that orchestra conductors have a longer average life expectancy than the average person. The reason is that a person usually becomes a conductor much later in life, so your target population is only people who are already over 60 years old. That biases your data, because you're only looking at the life expectancy of people who are 60 and older, which is already going to be on the old side.
Then there's false-causality bias. The assumption there is that if two things are related, or one follows the other, then one must cause the other, when in reality there might be one factor causing both, or the relationship you're assuming might actually run in the other direction. A famous example is from New York, where researchers found a relationship between crime rates and the presence of street vendors, particularly those selling ice cream. In other words, they found a correlation between high crime rates and the presence of street vendors, and the initial reaction was: wait a minute, there must be something about the street vendors, particularly the ones selling ice cream, that is causing these crimes to happen. The truth was that there was an underlying common cause, which was summer weather: in the summer there are more street vendors, particularly ice cream vendors, and in the summer there are more people out, so crime rates tend to go up as well. Another example: if you were to study how many firefighters attend a fire versus the size of the fire, you might notice that when the fire is really big, there are more firefighters at the scene. You might conclude that more firefighters cause the fire to be larger, when in fact the relationship is the opposite: the larger the fire, the more firefighters attend the scene.
Another one is the affirming-the-consequent fallacy. That is the idea that if the consequent is true, then the antecedent must also be true. In other words, the rule is "if A happens, then B happens"; you observe that B is true, and you assume that A must also be true. Let me give you an example. If it's raining, I can look outside and see that my street is wet. So the next time I look outside and see that the street is wet, can I assume it's raining? Obviously not, and if you did, that would be affirming the consequent. But we do do this, and I've listed these here because they are common fallacies. If I consume caffeine, I have trouble sleeping; the other night I had trouble sleeping, so that coffee I ordered must not have been decaf. It's very common to reason this way, but it's not necessarily true. We have absolutely no evidence that I was inadvertently served a caffeinated coffee just because I had trouble sleeping.
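The wet-street example can be made quantitative with a toy simulation. The 10% rain rate and 40% sprinkler rate below are invented for illustration: even though rain always wets the street in this model, a wet street turns out to mean rain well under half the time.

```python
import random

random.seed(2)

# Toy model (invented rates): rain always wets the street, but so do
# sprinklers, so "wet street" does not imply "rain".
wet_days = 0
rain_and_wet_days = 0

for _ in range(10_000):
    rain = random.random() < 0.10       # it rains on 10% of days
    sprinkler = random.random() < 0.40  # sprinklers run on 40% of days
    wet = rain or sprinkler             # the consequent
    if wet:
        wet_days += 1
        if rain:
            rain_and_wet_days += 1

# How often was "the antecedent is true" the right conclusion?
print(f"P(rain | wet street) = {rain_and_wet_days / wet_days:.2f}")
```

So observing the consequent (a wet street) gives you only a probability of the antecedent (rain), and that probability can be quite low.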
The base-rate fallacy is about relying on individuating information, that is, thinking about a specific example, somebody you know, rather than the base rate, the actual statistical information. For example, when we hear about something happening to someone we know, we perceive it as much more prevalent than it is. We all did that during the COVID pandemic: as soon as we knew someone who had COVID, it seemed more real to us and much more likely to happen to us. It's the same thing with car accidents and things like that. Another example: if someone you know wins the lottery, believe it or not, you're more likely to go buy a lottery ticket, because the base-rate fallacy makes you believe that you are in fact also more likely to win. Your friend winning the lottery makes no difference to your likelihood of winning, but as humans, we have this bias that makes us think we will.
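A classic way to see the power of base rates is a short Bayes'-rule calculation. The figures below are invented for illustration: a screening test that is 95% accurate sounds convincing, but when the condition it screens for affects only 1% of people, a positive result still leaves the condition fairly unlikely.

```python
# Illustrative base-rate calculation using Bayes' rule (invented numbers).
base_rate = 0.01        # P(condition): only 1% of people have it
sensitivity = 0.95      # P(positive | condition)
false_positive = 0.05   # P(positive | no condition)

# Total probability of testing positive, with or without the condition.
p_positive = base_rate * sensitivity + (1 - base_rate) * false_positive

# Bayes' rule: most positives come from the large healthy group.
p_condition_given_positive = base_rate * sensitivity / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.2f}")
```

The individuating information ("the test came back positive") feels decisive, but the base rate drags the real probability down to roughly one in six.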
There are other statistics-related biases as well, such as the fact that we often ignore labels and focus on the images. These are very common, and I'm going to go through a few in the next video. One is relative size on a chart. A lot of people who present statistical information use area, or implied volume, on a chart; making something look twice as big makes us automatically process it as twice as likely. If something is drawn at half the size, we think it's half as likely, when it isn't necessarily so. The other is slope. We think that the steeper the graph, the greater the increase or decrease, but the truth is that it all depends on the axes being used, the zero line, and all kinds of other choices made in building the graph, any of which can affect its steepness. I have a couple of examples to show you in the next lecture.
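The slope-and-baseline point comes down to simple arithmetic. The values 50 and 52 below are arbitrary: the same 4% difference looks modest against a zero baseline, but the second bar is drawn three times as tall when the axis starts at 49.

```python
# Two arbitrary values that differ by 4%.
a, b = 50.0, 52.0

def drawn_height(value, axis_min):
    """Height of the bar as actually drawn, given where the axis starts."""
    return value - axis_min

full_ratio = drawn_height(b, 0) / drawn_height(a, 0)        # honest baseline
truncated_ratio = drawn_height(b, 49) / drawn_height(a, 49) # truncated axis

print(f"zero baseline:  bar B is {full_ratio:.2f}x the height of bar A")
print(f"baseline at 49: bar B is {truncated_ratio:.2f}x the height of bar A")
```

The underlying data is identical in both cases; only the choice of axis minimum changes, and with it the visual impression of the difference.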
So, just some examples of this skeptical, critical evaluation of statistics out there: the famous Mark Twain quote, "There are three kinds of lies: lies, **** lies, and statistics," and a bestselling book called How to Lie with Statistics. The idea of this course in particular is to get you to be less susceptible to the lies in statistics; to be an aware, informed buyer of statistical information as a manager; and to understand the mathematical and subjective processes behind the production of statistical information, so that you can understand what it means for you and ask the right questions. So to recap: statistics is a critical tool for managers and organizations. The meaning or truth of statistical information is based on many factors, which we're going to go through in future lectures. Cognitive biases affect how we interpret statistical information, and people who present statistical information to us know that and take advantage of those biases. Understanding the basics of statistics and our own biases can help us critically evaluate statistical information.