The Impact of Cognitive Biases on Professionals' Decision-Making: A Review of Four Occupational Areas

Vincent Berthet

Université de Lorraine, 2LPN, Nancy, France; Psychology and Neuroscience Lab, Centre d'Économie de la Sorbonne
The author reviewed the research on the impact of cognitive biases on professionals' decision-making in four occupational areas (management, finance, medicine, and law). Two main findings emerged. First, the literature reviewed shows that a dozen cognitive biases have an impact on professionals' decisions in these four areas, overconfidence being the most recurrent bias. Second, the level of evidence supporting the claim that cognitive biases impact professional decision-making differs across the areas covered. Research in finance relied primarily upon secondary data, while research in medicine and law relied mainly upon primary data from vignette studies (both levels of evidence are found in management). Two research gaps are highlighted. The first is a potential lack of ecological validity of the findings from vignette studies, which are numerous. The second is the neglect of individual differences in cognitive biases, which might lead to the false idea that all professionals are susceptible to biases to the same extent. To address that issue, we suggest that reliable, specific measures of cognitive biases need to be improved or developed.

Keywords: decision-making, cognitive biases, heuristics, management, finance, medicine, law
INTRODUCTION

When making judgments or decisions, people often rely on simplified information processing strategies called heuristics, which may result in systematic, predictable errors called cognitive biases (hereafter CB). For instance, people tend to overestimate the accuracy of their judgments (overconfidence bias), to perceive events as being more predictable once they have occurred (hindsight bias), or to seek and interpret evidence in ways that are partial to existing beliefs and expectations (confirmation bias). In fact, the seminal work of Kahneman and Tversky on judgment and decision-making in the 1970s opened up a vast research program on how decision-making deviates from normative standards (e.g., Tversky and Kahneman, 1974; Kahneman et al., 1982; Gilovich et al., 2002).

The "heuristics and biases" program has been remarkably fruitful, leading to the unveiling of dozens of CB and heuristics in decision-making (e.g., Baron, 2008, listed 53 such biases). While this research turned out to have a large impact in the academic field and beyond (Kahneman, 2011), it is worth noting that it led to some debate (Vranas, 2000; Pohl, 2017). In particular, Gigerenzer (1991, 1996; Gigerenzer et al., 2008) argued that Kahneman and Tversky relied upon a narrow view of normative rules (probability theory), leading them to
ask participants to make artificial judgments (e.g., estimating the probability of single events) likely to result in so-called "errors." Gigerenzer also pointed out the overemphasis on decision errors and the lack of theory behind the heuristics-and-biases approach, which eventually results in a list of cognitive errors with no theoretical framework. However, there have been several attempts to overcome this shortcoming, such as the reframing of the heuristics-and-biases literature in terms of the concept of attribute substitution (Kahneman and Frederick, 2002) and the various taxonomies of CB advanced on the basis of dual-process models (e.g., Stanovich et al., 2008).

While early research on CB was conducted on lay participants to investigate decision-making in general, there has been considerable interest in how such biases may impede professional decision-making in areas such as management (e.g., Maule and Hodgkinson, 2002), finance (e.g., Baker and Nofsinger, 2002), medicine (e.g., Blumenthal-Barby and Krieger, 2015), and law (e.g., Rachlinski, 2018). Consider, for example, the framing effect: when making risky decisions, people prefer sure gains over more risky ones, whereas they prefer risky losses over sure ones (Kahneman and Tversky, 1979). Therefore, framing a problem in terms of gains versus losses can significantly impact decision-making. In most lawsuits, for instance, plaintiffs choose between a sure gain (the settlement payment) and a potential larger gain (in the case of further litigation), while defendants choose between a sure loss (the settlement payment) and a potential larger loss (in the case of further litigation). In fact, when considering whether the parties should settle the case, judges evaluating the case from the plaintiff's perspective are more likely to recommend settlement than those evaluating the case from the defendant's perspective (Guthrie et al., 2001). Likewise, when asked to rate the effectiveness of a drug, presenting the results of a hypothetical clinical trial in terms of absolute survival (gain), absolute mortality (loss), or relative mortality reduction (gain) influences the ratings of doctors (Perneger and Agoritsas, 2011).

For the sake of convenience, we list below the common definitions of the main CB considered in this review.

Anchoring bias is the tendency to adjust our judgments (especially numerical judgments) toward the first piece of information (Tversky and Kahneman, 1974).

Availability bias is the tendency by which a person evaluates the probability of events by the ease with which relevant instances come to mind (Tversky and Kahneman, 1973).

Confirmation bias is the tendency to search for, interpret, favor, and recall information that confirms or supports one's prior personal beliefs (Nickerson, 1998).

Disposition effect is the tendency among investors to sell stock market winners too soon and hold on to losers too long (Shefrin and Statman, 1985). This tendency is typically related to loss aversion (Kahneman and Tversky, 1979).

Hindsight bias is the propensity to perceive events as being more predictable once they have occurred (Fischhoff, 1975).

Omission bias is the preference for harm caused by omissions over equal or lesser harm caused by acts (Baron and Ritov, 2004).

Outcome bias is the tendency to judge the quality of a decision based on information about the outcome of that decision. Such judgments are erroneous with respect to the normative assumption that "information that is available only after the decision is made is irrelevant to the quality of the decision" (Baron and Hershey, 1988, p. 569).

Overconfidence bias is the common inclination of people to overestimate their own abilities to successfully perform a particular task (Brenner et al., 1996).

Relative risk bias is a stronger inclination to choose a particular treatment when presented with the relative risk than when presented with the same information described in terms of the absolute risk (Forrow et al., 1992).

Susceptibility to framing is the tendency for people to react differently to a single choice depending on whether it is presented as a loss or a gain (Tversky and Kahneman, 1981).

In the present paper, we review the research on the impact of CB on professional decision-making in four areas: management, finance, medicine, and law. These applied areas were selected because they have generated the highest number of publications on this topic so far (see "Materials and Methods"). This study aims to address the following research questions:

1. Assess the claim that CB impact professionals' decision-making.
2. Assess the level of evidence reported in the empirical studies.
3. Identify the research gaps.

We take a narrative approach to synthesizing the key publications and representative empirical studies to answer these research questions. To the best of our knowledge, this study is the first literature review on this research topic covering multiple areas together. This review is narrative, as opposed to systematic, which is one of its limitations. However, it aims to be useful both to researchers and to professionals working in the areas covered.

The present paper is structured as follows. The Methods section provides details about the methodology used to conduct the literature review. In the following sections, we review the key findings in each of the four occupational areas covered. Finally, in the Discussion section, we answer the three research questions in light of the findings reviewed.

MATERIALS AND METHODS

We conducted a systematic literature search using the Web of Science (WoS) database with the search terms "cognitive biases AND decision making." The search criteria included research articles, review articles, or book chapters, with no restriction regarding the time period. We focused on the WoS database because the "Web of Science Categories" filter offered a practical means to select the applied areas covered. Admittedly, the results of our review might have been different had we covered more databases; however, as our strategy was to review the key publications and representative empirical studies in each of the areas selected, we reasoned that virtually every database would have led to these records.

The PRISMA flowchart in Figure 1 illustrates the process of article search and selection in this study. The WoS search led to a total of 3,169 records. Before screening, we used
that the issue of individual biases in strategic decision-making is of limited relevance, as strategic decisions are the product of organizations rather than of individuals within the context of a wider sociopolitical arena (Mintzberg, 1983; Johnson, 1987). However, individual (micro) factors might help explain organizational (macro) phenomena, an idea promoted by behavioral strategy (Powell et al., 2011).

The "heuristics and biases" program revived the interest in bounded rationality in management with the idea that decision-makers may use heuristics to cope with complex and uncertain environments, which in turn may result in inappropriate or suboptimal decisions (e.g., Barnes, 1984; Bazerman, 1998). Indeed, it is relatively easy to see how biases such as availability, hindsight, or overconfidence might play out in the strategic decision-making process. For instance, it may seem difficult in hindsight to understand why IBM and Kodak failed to see the potential that Haloid saw (which led to the Xerox company). The hindsight bias can actually lead managers to distort their evaluations of initial decisions and their predictions (Bukszar and Connolly, 1988). Likewise, practicing auditors of major accounting firms are sensitive to anchoring effects (Joyce and Biddle, 1981), and prospective entrepreneurs tend to neglect base rates for business failures (Moore et al., 2007).

To our knowledge, no systematic review of empirical research on the impact of heuristics and CB on strategic decision-making has been published to date. Whereas the idea that CB could affect strategic decisions is widely recognized, the corresponding empirical evidence is quite weak. Most research on this topic consists of narrative papers relying upon documentary sources and anecdotal evidence (e.g., Duhaime and Schwenk, 1985; Lyles and Thomas, 1988; Huff and Schwenk, 1990; Zajac and Bazerman, 1991; Bazerman and Moore, 2008). In fact, the typical paper describes a few CB and provides, for each one, examples of how a particular bias can lead to poor strategic decisions (see Barnes, 1984, for a representative example). While the examples provided are often compelling, such research faces severe methodological limitations.

The work of Schwenk (1984, 1985) is representative of that type of research. This author identified three stages of the strategic decision process (goal formulation and problem identification, strategic alternatives generation, and evaluation of alternatives and selection of the best one) and a set of heuristics and biases that might affect decisions at each stage. Schwenk also provided, for each bias, an illustrative example of how the bias may impede the overall quality of strategic decisions. For example, the representativeness heuristic may affect the stage of evaluation and selection of the alternatives. To illustrate this, Schwenk mentioned the head of an American retail organization (Montgomery Ward) who held a strong belief that there would be a depression at the end of the Second World War, as was the case after World War I. Based on this belief, this executive decided not to allow his company to expand to meet competition from his rival (Sears), which led to a permanent loss of market share to Sears. Schwenk (1988) listed ten heuristics and biases of potential key significance in the context of strategic decision-making (availability, selective perception, illusory correlation, conservatism, law of small numbers, regression bias, wishful thinking, illusion of control, logical reconstruction, and hindsight bias).

In a similar vein, Das and Teng (1999) proposed a framework to explore the presence of four basic types of CB (prior hypotheses and focusing on limited targets, exposure to limited alternatives, insensitivity to outcome probabilities, and illusion of manageability) under five different modes of decision-making (rational, avoidance, logical incrementalist, political, and garbage can). They proposed that not all basic types of biases are robust across all kinds of decision processes; rather, their selective presence is contingent upon the specific processes that decision makers engage in. For instance, the garbage can mode (Cohen et al., 1972) depicts decision-making processes as organized anarchies, in which a decision is largely dependent on chance and timing. In this kind of process, decision makers do not know their objectives ex ante, but merely look around for decisions to make. Das and Teng (1999) hypothesized that managers under the garbage can mode will be exposed to limited alternatives and insensitive to outcome probabilities. On the contrary, managers under the rational mode would be exposed to prior hypotheses and illusion of manageability. This framework, however, is not supported by rigorous empirical evidence.

It is not difficult to list examples of poor strategic decisions that can be readily interpreted – in hindsight – as the result of heuristics and biases. However, the claim that CB influence strategic decisions needs to be tested more directly through laboratory research and experimental studies (Maule and Hodgkinson, 2002). It is worth noting that such research is scarce, probably because of its lack of ecological validity, an issue of primary importance in the field of management research (Schwenk, 1982). Still, two CB in particular have been studied quantitatively: the framing effect and CEO overconfidence.

Hodgkinson et al. (1999) used an experimental setting to investigate the effect of framing on strategic decisions. Following the "Asian Disease" problem (Tversky and Kahneman, 1981), they presented subjects (undergraduate management students) with a 500-word case vignette giving a brief history of a company that manufactured and distributed fast paint-drying systems. A positive and a negative frame were used, and participants were asked to adopt the role of a board member facing a major strategic decision and to indicate which of two alternative options they would choose. The positive frame emphasized gains from a reference point of no profit, whereas the negative frame highlighted losses from a reference point where the target profit is achieved (£3 million). In addition, participants were either asked to choose between the presented options directly or to represent the ways in which they thought about the problem in the form of a causal map prior to making their choice. It turned out that when participants made their decisions directly, a massive framing effect was found (45.5% of participants chose the risk-averse option in the positive frame versus 9% in the negative frame). However, no framing effect was observed when participants were asked to draw a causal map before making their choice (36.4% of the participants opted for the risk-averse option in both versions).
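To give a sense of how such a framing effect is typically evaluated statistically, the sketch below runs a chi-square test of independence on choice frequencies. It is purely illustrative: the cell counts are hypothetical values chosen only to match the percentages reported by Hodgkinson et al. (1999) (the study's actual sample sizes and analyses may differ), and the use of scipy is our own choice rather than anything prescribed by the review.

```python
# Illustrative only: hypothetical cell counts chosen to match the reported
# percentages (45.5% vs. 9% risk-averse choices); not the study's raw data.
from scipy.stats import chi2_contingency

observed = [
    [10, 12],  # positive frame: 10/22 = 45.5% chose the risk-averse option
    [2, 20],   # negative frame: 2/22 = 9% chose the risk-averse option
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
# A small p-value indicates that option choice depends on the frame,
# i.e., a reliable framing effect in this hypothetical dataset.
```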
and their press portrayal), Malmendier and Tate (2008) provided support for that hypothesis: the odds of making an acquisition are 65% higher if the CEO is classified as overconfident.

…individual rather than professional investors (e.g., mutual funds, hedge funds, pension funds, and investment advisors). Findings suggest that institutional investors are prone to various CB, but to a lesser extent than individual investors (e.g., Kaustia et al., 2008).

of a stock, investors report confidence intervals that are too narrow compared to the actual variability of prices (e.g., De Bondt, 1998). Investors also overestimate their ability to beat the market. Baker and Nofsinger (2002) reported a finding of a 2001 Gallup survey revealing that, on average, investors estimated that the stock market return over the next 12 months would be 10.3% while estimating that their own portfolio return would be 11.7%. Barber and Odean (2001) reported evidence that overconfidence in investors is related to gender. Based on a sample of 35,000 individual accounts over a six-year period, their findings showed that males exhibit more overconfidence regarding their investing abilities and also trade more often than females. Overconfidence makes investors more prone to take high risks (Chuang and Lee, 2006) and to trade too much (Odean, 1999; Statman et al., 2006; Glaser and Weber, 2007), which results in poor financial performance (consequent transaction costs and losses). For instance, trading turnover and portfolio returns are negatively correlated: among 66,465 households with accounts at a large discount broker during 1991–1996, the households that traded most had an annual return of 11.4% while the average annual return was 16.4% (Barber and Odean, 2000).

On the other hand, the disposition effect is the tendency by which investors sell winning stocks too early while holding on to losing positions for too long (Shefrin and Statman, 1985). Based on trading records for 10,000 accounts at a large discount brokerage house, Odean (1998) reported that, on average, winning investments are 50% more likely to be sold than losing investments (similar results were obtained in other countries, such as France; Boolell-Gunesh et al., 2009). The disposition effect originates in the loss aversion described by prospect theory (Kahneman and Tversky, 1979).

MEDICINE

The idea that cognitive failures are a primary source of medical errors has become prevalent in the medical literature (e.g., Detmer et al., 1978; Dawson and Arkes, 1987; Schmitt and Elstein, 1988; Elstein, 1999; Croskerry, 2003; Klein, 2005). In fact, emergency medicine has been described as a "natural laboratory of error" (Bogner, 1994). Among medical errors, diagnostic errors have received particular attention (Graber, 2013). Indeed, there is increasing evidence that mental shortcuts during information processing contribute to diagnostic errors (e.g., Schnapp et al., 2018).

It is not difficult to see how CB may impact medical decisions. Blumenthal-Barby and Krieger (2015) provided the following examples. A parent might refuse to vaccinate her child after she sees a media report of a child who developed autism after being vaccinated (availability bias). A patient with atrial fibrillation might refuse to take warfarin because she is concerned about causing a hemorrhagic stroke, despite the greater risk of having an ischemic stroke if she does not take warfarin (omission bias). Indeed, early papers on this topic were primarily narrative reviews suggesting a possible impact of CB on medical decision-making. These papers follow the same logic: they first provide a general description of a couple of CB and then describe how these shortcuts can lead physicians to make poor decisions, such as wrong diagnoses (e.g., Dawson and Arkes, 1987; Elstein, 1999; Redelmeier, 2005). But narrative reviews provide limited evidence. As Zwaan et al. (2017, p. 105) outlined, "While these papers make a formidable argument that the biases described in the literature might cause a diagnostic error, empirical evidence that any of these biases actually causes diagnostic errors is sparse."

On the other hand, studies that investigated the actual impact of CB on medical decisions are mainly experimental studies using written cases (hypothetical vignettes) designed to elicit a particular bias. A typical example of a vignette study is that of Mamede et al. (2010) on the effect of availability bias on diagnostic accuracy. In a first phase, participants (first-year and second-year internal medicine residents) were provided with 6 different cases and asked to rate the likelihood that the indicated diagnosis was correct (all cases were based on real patients with a confirmed diagnosis). Then, participants were asked to diagnose 8 new cases as quickly as possible, that is, relying on non-analytical reasoning. Half of those new cases were similar to the cases encountered in phase 1, so that the availability bias was expected to reduce diagnostic accuracy for those four cases. Second-year residents indeed had lower diagnostic accuracy on cases similar to those encountered in phase 1 as compared to other cases, as they provided the phase 1 diagnosis more frequently for phase 2 cases they had previously encountered than for those they had not.

While vignette-based studies are the most frequent, researchers in this area have used diverse strategies (Blumenthal-Barby and Krieger, 2015). For instance, Crowley et al. (2013) developed a computer-based method to detect heuristics and biases in diagnostic reasoning as pathologists examine virtual slide cases. Each heuristic or bias is defined as a particular sequence of hypothesis, findings, and diagnosis formulation in the diagnostic reasoning interface (e.g., availability bias is considered to occur if, in a sequence of three cases where the third case has a different diagnosis than the two previous ones, the participant makes an incorrect diagnosis in the third case such that the diagnosis is identical to the correct diagnosis in the two immediately preceding cases). Such a procedure allows for examining the relationships between heuristics and biases, and diagnostic errors.

Another methodology consists in reviewing instances where errors occurred, to which CB presumably contributed (e.g., Graber et al., 2005). However, studies following this methodology are vulnerable to hindsight bias: since reviewers are aware that an error was committed, they are prone to identify biases ex post (Wears and Nemeth, 2007). The fact that bias can be in the eye of the beholder has been supported by Zwaan et al. (2017), who asked 37 physicians to read eight cases and list which CB were present from a list provided. In half the cases, the outcome implied a correct diagnosis; in the other half, it implied an incorrect diagnosis. Physicians identified more biases when the case outcome implied an incorrect diagnosis (3.45 on average) than when it implied a correct one (1.75 on average).
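As a concrete illustration of the rule-based detection described above for the system of Crowley et al. (2013), the sketch below encodes the quoted availability-bias rule (a third case with a different correct diagnosis, answered incorrectly with the diagnosis of the two preceding cases). It is a minimal, hypothetical sketch: the Case record and the function are invented for illustration and do not reproduce Crowley et al.'s actual implementation.

```python
# Hypothetical sketch of the availability-bias rule described for
# Crowley et al.'s (2013) system: in a run of three cases, the third has a
# different correct diagnosis than the two previous ones, and the participant
# wrongly answers it with exactly that earlier diagnosis.
from dataclasses import dataclass

@dataclass
class Case:
    correct: str  # confirmed diagnosis for this case
    answer: str   # diagnosis given by the participant

def availability_bias_indices(cases: list[Case]) -> list[int]:
    """Return indices of cases flagged by this (simplified) availability rule."""
    flagged = []
    for i in range(2, len(cases)):
        first, second, third = cases[i - 2], cases[i - 1], cases[i]
        same_earlier_diagnosis = first.correct == second.correct
        third_is_different = third.correct != first.correct
        answered_with_earlier = (third.answer == first.correct
                                 and third.answer != third.correct)
        if same_earlier_diagnosis and third_is_different and answered_with_earlier:
            flagged.append(i)
    return flagged

# Toy sequence: two pneumonia cases followed by a pulmonary-embolism case
# that the participant also labels "pneumonia" -> flagged at index 2.
toy = [Case("pneumonia", "pneumonia"),
       Case("pneumonia", "pneumonia"),
       Case("pulmonary embolism", "pneumonia")]
print(availability_bias_indices(toy))  # [2]
```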
To date, two systematic reviews have been published on the impact of CB on medical decision-making. Reviewing a total of 213 studies, Blumenthal-Barby and Krieger (2015) reported the following findings: (1) 77% of the studies (N = 164) were based on hypothetical vignettes; (2) 34% of the studies (N = 73) investigated medical personnel; (3) 82% of the studies (N = 175) were conducted with representative populations; (4) 68% of the studies (N = 145) confirmed a bias or heuristic in the study population; (5) the most studied CB were loss/gain framing bias (72 studies, 24.08%), omission bias (18 studies, 6.02%), relative risk bias (29 studies, 9.70%), and availability bias (22 studies, 7.36%); (6) the results regarding loss/gain framing bias were mixed, with 39% of studies (N = 28) confirming an effect, 39% (N = 28) confirming an effect only in a subpopulation, and 22% (N = 16) disconfirming any effect; (7) 25 of 29 studies (86%) supported the impact of relative risk bias on medical decisions; and (8) 14 of 18 studies (78%) supported the impact of omission bias on medical decisions.

Saposnik et al. (2016) conducted a similar review, but including only 20 studies. These authors reported that: (1) 60% of the studies (N = 12) targeted CB in diagnostic tasks; (2) framing effect (N = 5) and overconfidence (N = 5) were the most common CB, while tolerance to risk or ambiguity was the most commonly studied personality trait (N = 5); (3) given that the large majority of the studies (85%) targeted only one or two biases, the true prevalence of CB influencing medical decisions remains unknown; moreover, there was wide variability in the reported prevalence of CB (for example, when analyzing the three most comprehensive studies that accounted for several CB – Ogdie et al., 2012; Stiegler and Ruskin, 2012; Crowley et al., 2013 – the availability bias ranged from 7.8 to 75.6% and anchoring bias from 5.9 to 87.8%); (4) the presence of CB was associated with diagnostic inaccuracies in 36.5 to 77% of case-scenarios, and physicians' overconfidence, anchoring effect, and information or availability bias may be associated with diagnostic inaccuracies; and (5) only seven studies (35%) provided information to evaluate the association between physicians' CB and therapeutic or management errors; five of these studies (71.4%) showed an association between CB (anchoring, information bias, overconfidence, premature closure, representativeness, and confirmation bias) and therapeutic or management errors.

JUSTICE

Based on the legal realist premise that "judges are human," recent years have seen a growing interest in judicial decision-making (e.g., Klein and Mitchell, 2010; Dhami and Belton, 2017; Rachlinski, 2018). This topic covers issues such as cognitive models of judicial decision-making (e.g., the story model), the impact of extralegal factors on decisions, prejudice (e.g., gender bias and racial bias), moral judgments, group decision-making, and the comparison of lay and professional judges. It is worth noting that most research on judicial decision-making has focused on how jurors decide cases, relying on jury simulations (MacCoun, 1989). Here, we focus on how professional judges might be prone to CB. One might easily consider how CB could hamper judicial decisions. In a narrative fashion, Peer and Gamliel (2013) reviewed how such biases could intervene during the hearing process (confirmation bias and hindsight bias), ruling (inability to ignore inadmissible evidence), and sentencing (anchoring effects). In fact, research suggests that judges, prosecutors, and other professionals in the legal field might rely on heuristics to produce their decisions, which leaves room for CB (e.g., Guthrie et al., 2007; Helm et al., 2016; Rachlinski and Wistrich, 2017).²

² Interestingly, the notion of cognitive bias might also shed light on certain rules of law. For example, Guthrie et al. (2001) presented judges with a problem based on the classic English case Byrne v. Boadle (1863) and asked them to assess the likelihood that a warehouse was negligent for an accident involving a barrel that injured a bystander. The materials indicated that when the warehouse is careful, accidents occur one time in 1,000, but that when the warehouse is negligent, accidents occur 90% of the time. The materials also indicated that the defendant is negligent only 1% of the time. Judges overestimated the probability that the defendant was negligent, failing to consider the base rate of negligence. Interestingly, this logical fallacy is implemented in the doctrine of res ipsa loquitur, which instructs judges to take no account of the base rates (Kaye, 1979).

Researchers investigating judges' decision-making have mainly relied upon archival studies (document analyses of court records) and experimental studies in which judges are asked to decide on hypothetical cases. In archival studies, researchers examine whether judges' decisions in actual cases exhibit features of irrationality. For instance, Ebbesen and Konecni (1975) investigated which information felony court judges considered when deciding the amount of bail to set. When presented with fictitious cases, the judges' decisions were influenced by relevant information, such as prior criminal record, but their actual bail decisions relied almost exclusively on prosecutorial recommendations. That is, judges seem to be (too) heavily affected by prosecutors' recommendations. Another example of an archival study is the infamous research of Danziger et al. (2011), who highlighted a cycle in repeated judicial rulings: judges are initially lenient, then progressively rule more in favor of the status quo over time, and become lenient again after a food break. This would suggest that psychological factors, such as mental fatigue, could influence legal decisions (but see Weinshall-Margel and Shapard, 2011). Archival studies, however, are limited by the difficulty of controlling for unobserved variables.

On the other hand, vignette studies consist in presenting judges with hypothetical scenarios simulating real legal cases. As in the medical field, researchers have primarily relied on such studies. A representative study is that of Guthrie et al. (2001), who administered a survey to 167 federal magistrate judges in order to assess the impact of five CB (anchoring, framing, hindsight bias, inverse fallacy, and egocentric bias) on their decisions regarding litigation problems (see Guthrie et al., 2002, for a summary of the research). Using materials adapting classic cognitive problems into legal ones, Guthrie et al. (2001) reported that judges fell prey to these biases, but to various extents. For instance, in order to assess whether judges were susceptible to hindsight bias, Guthrie et al. (2001) presented them with a hypothetical case in which the plaintiff appealed the district court's decision and asked them to indicate which of three possible outcomes of the appeal was most likely to have occurred. Crucially, they also provided them with the actual outcome of the court of appeals. The outcome significantly influenced judges' assessments:
those informed of a particular outcome were more likely to have identified that outcome as the most likely to have occurred.

In particular, numerous studies have investigated the impact of anchoring effects on judicial decisions (see Bystranowski et al., 2021, for a recent meta-analysis). Judges and jurors are often required to translate qualitative judgments into quantitative decisions (Hans and Reyna, 2011; Rachlinski et al., 2015). While their qualitative judgments on matters such as the severity of the plaintiff's injury or the appropriate severity of punishment show a high degree of consistency and predictability (Wissler et al., 1999), a great amount of variability appears (especially for non-economic and punitive damages) when these qualitative judgments are translated into numbers (e.g., civil damage awards and criminal sentences; Hart et al., 1997; Diamond et al., 1998). This might be explained by the fact that numerical assessments can be prone to anchoring. Facing uncertainty about the amount to determine, judges and especially juries (due to their lack of experience and information about standard practice) tend to rely on any numerical point of reference and make their judgment through adjustments from that number. As these adjustments are often insufficient, the judgments are biased toward the anchor (see Kahneman et al., 1998, for a model describing how individual jurors set punitive damages and the role of anchoring in that process).

Accordingly, numerical values such as a damage cap (e.g., Hinsz and Indahl, 1995; Robbennolt and Studebaker, 1999), the amount of damages claimed by the plaintiff (Chapman and Bornstein, 1996), the amount of economic damage (Eisenberg et al., 1997, 2006), the sentence imposed in the preceding case, a sentence urged by the prosecutor, or a sentence recommended by a probation officer might act as anchors in the courtroom, moving the judges' decisions toward them. Guthrie et al. (2001) reported that in a personal injury suit, an irrelevant factor, such as a number in a pre-trial motion (used to determine whether the damages met the minimum limit for federal court), could act as an anchor. They presented judges with a description of a serious personal injury suit in which only damages were at issue and asked them to estimate how much they would award the plaintiff in compensatory damages. Prior to this estimation, half of the judges were asked to rule on a pre-trial motion filed by the defendant to have the case dismissed for failing to meet the jurisdictional minimum in a diversity suit ($75,000). It turned out that the judges who were asked only to determine the damage award provided an average estimate of $1,249,000, while the judges who first ruled on the motion provided an average estimate of $882,000.

Enough and Mussweiler (2001) conducted a series of studies on how recommendations anchor judicial decisions, even when they are misleading. In their 2001 paper, they showed that sentencing decisions tend to follow the sentence demanded by the prosecutor. When told that the prosecutor recommended a sentence of 34 months, criminal trial judges recommended longer prison sentences on average (M = 24.41 months) than when told that the recommended sentence was 12 months (M = 17.64 months) for the same crime. This anchoring effect was independent of the perceived relevance of the sentencing demand and of the judges' experience. Englich et al. (2006) reported that anchoring even occurs when the sentencing demand is determined randomly (the result of a dice throw). Interestingly, Englich et al. (2005) found that the defense's sentencing recommendation is actually anchored on the prosecutor's demand, so that the former mediates the impact of the latter on the judge's decision. Therefore, while it is supposed to be to their advantage, the fact that defense attorneys present their sentencing recommendation after the prosecution might be a hidden disadvantage for the defense.

Along with anchoring, the impact of hindsight bias in the courtroom has also been well documented, mainly in liability cases (Harley, 2007; Oeberst and Goeckenjan, 2016). When determining liability or negligence, judges and juries must assess whether the defendant is liable for a negative outcome (damage or injury). The difficulty is that jurors accomplish this task in retrospect: having knowledge of the outcome, jurors tend to perceive it as foreseeable and accordingly rate the negligence or liability of the defendant highly (Rachlinski et al., 2011). To avoid this bias, the law requires jurors to ignore the outcome information while evaluating the extent to which it should have been foreseen by the defendant. However, research suggests that jurors tend to fall prey to hindsight bias as much as lay persons. When evaluating the precautions taken by a municipality to protect a riparian property owner from flood damage, participants assessing the situation in foresight concluded that a flood was too unlikely to justify further precautions. However, participants assessing the situation in hindsight considered that such a decision was negligent and also gave higher estimates of the probability of the disaster occurring (Kamin and Rachlinski, 1995).

Outcome information has been shown to affect jurors' decisions about punitive damage awards (Hastie et al., 1999) and their decisions about the legality of a search (Casper et al., 1989). In addition, more severe outcomes tend to produce a larger hindsight bias, a result particularly stressed in medical malpractice litigation (LaBine and LaBine, 1996). While the assessment of the negligence of the accused physician should be based on his course of action regardless of the outcome, jurors are highly influenced by the severity of a negative medical outcome when determining negligence in medical malpractice cases (Berlin and Hendrix, 1998). Cheney et al. (1989) reviewed 1,004 cases of anesthesia-related negligence and reported that the court had imposed liability on the defendant in over 40 percent of the cases in which the physician had acted appropriately.

There is also significant evidence that confirmation bias (Nickerson, 1998) may impact professional judges' decisions. In the legal field, confirmation bias has been primarily studied with regard to criminal investigations (Findley and Scott, 2006). Once they become convinced that the suspect is guilty, professionals involved in criminal proceedings (e.g., police officers and judges) may engage in guilt-confirming investigation endeavors (or tunnel vision) by which they undermine alternative scenarios in which the suspect is actually innocent. Several studies have reported evidence of confirmation bias in criminal cases. For instance, O'Brien (2009) found that participants (college students) who articulated a hypothesis regarding the suspect early in their review of a mock police file showed bias in seeking and interpreting evidence to favor that hypothesis, thereby demonstrating a case-building mentality against a chosen suspect.
Similarly, Lidén et al. (2019) showed that judges' detention of suspects triggers a confirmation bias that influences their assessment of guilt, and that this bias is affected by who decided about detention. In fact, judges perceived the detained defendants' statements as less trustworthy and were also more likely to convict when they themselves had previously detained the suspect than when a colleague had decided to detain.³

³ Note that other CB, such as framing and omission bias, might also shed light on judicial decision-making (Rachlinski, 2018). In fact, judges decide cases differently depending on whether the underlying facts are presented as gains or losses (Rachlinski and Wistrich, 2018). Moreover, viewing the acceptance of a claim as the path of action and its dismissal as the path of inaction, omission bias might explain why judges' threshold for accepting a plaintiff's claim is particularly high (Zamir and Ritov, 2012). However, those biases have been much less studied than anchoring and hindsight bias.

Table 1 provides a summary of the main CB in the four occupational areas reviewed and the corresponding evidence.

TABLE 1 | Summary of the main cognitive biases studied in the fields of management, finance, medicine, and law, and corresponding evidence.

Studies included in the review

Management: Barnes (1984) (multiple, narrative); Ben-David et al. (2013) (overconfidence, empirical); Bukszar and Connolly (1988) (hindsight bias, empirical); Das and Teng (1999) (multiple, theoretical); Duhaime and Schwenk (1985) (multiple, theoretical); Hodgkinson et al. (1999) (framing effect, empirical); Huff and Schwenk (1990) (multiple, theoretical); Joyce and Biddle (1981) (anchoring effect, empirical); Lyles and Thomas (1988) (multiple, theoretical); Malmendier and Tate (2005) (overconfidence, empirical); Malmendier and Tate (2008) (overconfidence, empirical); Maule and Hodgkinson (2002) (multiple, narrative); Moore et al. (2007) (overconfidence, empirical); Schwenk (1988) (multiple, theoretical); Schwenk (1984) (multiple, narrative); Schwenk (1985) (multiple, narrative); Zajac and Bazerman (1991) (blind spot bias, theoretical)

Finance: Baker and Nofsinger (2002) (multiple, review); Barber and Odean (2000) (overconfidence, empirical); Barber and Odean (2001) (overconfidence, empirical); Benartzi and Thaler (1995) (loss aversion, empirical); Boolell-Gunesh et al. (2009) (disposition effect, empirical); Chuang and Lee (2006) (overconfidence, empirical); Coval and Moskowitz (1999) (home bias, empirical); De Bondt and Thaler (1985) (regression to the mean, empirical); De Bondt (1998) (multiple, review); Glaser and Weber (2007) (overconfidence, empirical); Grinblatt et al. (1995) (herding behavior, empirical); Kumar and Goyal (2015) (multiple, review); Odean (1998) (disposition effect, empirical); Odean (1999) (overconfidence, empirical); Shefrin and Statman (1985) (disposition effect, empirical); Shiller (2003) (multiple, narrative); Statman et al. (2006) (overconfidence, empirical)

Medicine: Blumenthal-Barby and Krieger (2015) (multiple, review); Croskerry (2003) (multiple, narrative); Crowley et al. (2013) (multiple, empirical); Dawson and Arkes (1987) (multiple, narrative); Detmer et al. (1978) (multiple, narrative); Elstein (1999) (multiple, narrative); Graber et al. (2005) (multiple, empirical); Klein (2005) (multiple, narrative); Mamede et al. (2010) (availability bias, empirical); Ogdie et al. (2012) (multiple, empirical); Redelmeier (2005) (multiple, narrative); Saposnik et al. (2016) (multiple, review); Schmitt and Elstein (1988) (multiple, narrative); Schnapp et al. (2018) (multiple, empirical); Stiegler and Ruskin (2012) (multiple, review); Wears and Nemeth (2007) (hindsight bias, narrative); Zwaan et al. (2017) (multiple, empirical)

Law: Berlin and Hendrix (1998) (hindsight bias, narrative); Bystranowski et al. (2021) (anchoring effect, review); Casper et al. (1989) (hindsight bias, empirical); Chapman and Bornstein (1996) (anchoring effect, empirical); Cheney et al. (1989) (hindsight bias, empirical); Englich et al. (2005) (anchoring effect, empirical); Englich et al. (2006) (anchoring effect, empirical); Enough and Mussweiler (2001) (anchoring effect, empirical); Findley and Scott (2006) (confirmation bias, theoretical); Guthrie et al. (2001) (multiple, empirical); Guthrie et al. (2007) (multiple, empirical); Guthrie et al. (2002) (multiple, narrative); Harley (2007) (hindsight bias, review); Hastie et al. (1999) (anchoring effect, empirical); Helm et al. (2016) (multiple, empirical); Hinsz and Indahl (1995) (anchoring effect, empirical); Kamin and Rachlinski (1995) (hindsight bias, empirical); LaBine and LaBine (1996) (hindsight bias, empirical); Lidén et al. (2019) (confirmation bias, empirical); O'Brien (2009) (confirmation bias, empirical); Oeberst and Goeckenjan (2016) (hindsight bias, review); Peer and Gamliel (2013) (multiple, narrative); Rachlinski and Wistrich (2017) (multiple, narrative); Rachlinski (2018) (framing effect, empirical); Rachlinski et al. (2011) (hindsight bias, empirical); Rachlinski et al. (2015) (anchoring effect, empirical); Robbennolt and Studebaker (1999) (anchoring effect, empirical)

After each study, we indicate in brackets the cognitive bias(es) addressed in the study and the article type (narrative, empirical, theoretical, and review).

DISCUSSION

The goal of the present paper was to provide an overview of the impact of CB on professional decision-making in various occupational areas (management, finance, medicine, and law). In all of them, there has been tremendous interest in this issue, as revealed by a vast amount of research. Our review provided significant answers to the three research questions addressed.

First, the literature reviewed shows that, overall, professionals in the four areas covered are prone to CB. In management, there is evidence that risky-choice (loss/gain) framing effects and overconfidence (among CEOs) impact decision-making. In finance, there is strong evidence that overconfidence and the disposition effect (a consequence of loss aversion) impact individual investors' decision-making. Regarding medical decision-making, the systematic review of Blumenthal-Barby and Krieger (2015) revealed that (1) 90% of the 213 studies reviewed confirmed a bias or heuristic in the study population or in a subpopulation of the study, and (2) there is strong evidence that omission bias, relative risk bias, and availability bias have an impact on medical decisions, and mixed evidence for the risky-choice framing effect. On the other hand, the systematic review of Saposnik et al. (2016) – based on 20 studies only – reported that physicians' overconfidence, anchoring, and availability bias were associated with diagnostic errors. Finally, the effects of anchoring, hindsight bias, and confirmation bias on judicial decision-making are well documented. Overall, overconfidence appears to be the most recurrent CB across the four areas covered.

Second, the level of evidence supporting the claim that CB impact professionals' decision-making differs across the four areas covered. In medicine and law, this issue has been primarily evidenced in vignette studies. Such primary data provide a relevant assessment of CB in decision-making, but they face the issue of ecological validity (see below). Accordingly, a mid-level of evidence can be assigned to these findings. On the other hand, following the method of revealed preference, by which the preferences of individuals are uncovered through the analysis of their choices in real-life settings, the impact
of CB on financial decision-making has been evidenced through secondary data (e.g., trading records), indicating a higher level of evidence. In management, both levels of evidence are found (framing effects were demonstrated in vignette studies, while CEO overconfidence was evidenced through secondary data).

A practical implication of these findings is the need for professionals to consider concrete, practical ways of mitigating the impact of CB on decision-making. In finance, this issue has been tackled with programs that aimed to improve financial literacy (Lusardi and Mitchell, 2014). In medicine, debiasing has been considered as a way to reduce the effects of CB (Graber et al., 2002, 2012; Croskerry, 2003; Croskerry et al., 2013). In fact, recent research has reported evidence that the debiasing of decisions can be effective (Morewedge et al., 2015; Sellier et al., 2019). However, a preliminary step to considering practical means of mitigating the impact of CB is to acknowledge this diagnosis. In fact, professionals are reluctant to accept the idea that their decisions may be biased (e.g., Kukucka et al., 2017). Judges, for instance, tend to dismiss the evidence showing the impact of CB on judicial decisions, arguing that most studies did not investigate decisions on real cases (Dhami and Belton, 2017).

Third, our review highlights two major research gaps. The first is a potential lack of ecological validity of the findings from vignette studies, which are numerous (Blumenthal-Barby and Krieger, 2015). Consider, for instance, a study designed to test whether sentencing decisions could be anchored by certain information, such as the sentence demanded by the prosecutor (Enough and Mussweiler, 2001). A typical study consists in presenting judges with a vignette describing a hypothetical criminal case and asking them to sentence the defendant (e.g., Rachlinski et al., 2015). If a statistically significant difference is observed between the different anchor conditions, it is concluded that anchoring impacts judges' sentencing decisions. Does such a finding mean that judges' sentencing decisions in real cases are affected by anchoring too? Likewise, it has been reported that 90% of judges solve the Wason task incorrectly (Rachlinski et al., 2013), but this does not imply per se that confirmation bias impedes judges' decisions in their regular work. Addressing that issue requires the use of more ecological settings, such as mock trials in the case of judicial decision-making (Diamond, 1997).

The second research gap is the neglect of individual differences in CB. This limit was found in the four areas covered. Individual differences have been neglected in decision-making research in general (Stanovich et al., 2011; Mohammed and Schwall, 2012). Indeed, most of the current knowledge about the impact of CB on decision-making relies upon experimental research and group comparisons (Gilovich et al., 2002). For instance, based on the experimental result described above, one might wrongly infer that all judges are susceptible to anchoring, to the same extent. That is why Guthrie et al. (2007, p. 28) clarified that "the fact that we generally observed statistically significant differences between the control group judges and experimental group judges does not mean that every judge made intuitive decisions. […] Our results only show that, as a group, the judges were heavily influenced by their intuition – they do not tell us which judges were influenced and by how much." In fact, there is clear evidence for individual differences in susceptibility to CB (e.g., Bruine de Bruin et al., 2007).

The issue of individual differences is of primary importance when considering CB in decision-making, especially among professionals. In finance, for example, the measurement of the disposition effect at the individual level revealed significant individual differences, with 20% of investors showing no disposition effect or a reverse effect (Talpsepp, 2011). Taking full account of individual differences is crucial when considering public interventions aiming to mitigate individual biases: any single intervention might work on individuals highly susceptible to the bias addressed while having no or even harmful effects on individuals moderately susceptible to it (Rachlinski, 2006). Addressing the issue of individual differences in bias susceptibility requires standardized, reliable measures (Berthet, 2021). While reliable measures of a dozen CB are currently available, measures of key biases are still lacking (e.g., confirmation bias and availability bias). Most importantly, these measures are generic, using non-contextualized items. Such measures are relevant for research with the purpose of describing general aspects of decision-making (Parker and Fischhoff, 2005; Bruine de Bruin et al., 2007). However, research on individual differences in professional decision-making requires specific measures whose items are adapted to the context in which a particular decision is made (e.g., diagnostic decisions and sentencing decisions). An example is the Inventory of Cognitive Biases in Medicine (Hershberger et al., 1994), which aims to measure 10 CB in doctors (e.g., insensitivity to prior probability and insensitivity to sample size) through 22 medical scenarios. The development of such instruments in the context of management, finance, and law is an important avenue for future research on professional decision-making.

AUTHOR CONTRIBUTIONS

The author confirms being the sole contributor of this work and has approved it for publication.
REFERENCES

Baron, J., and Hershey, J. C. (1988). Outcome bias in decision evaluation. J. Pers. Soc. Psychol. 54, 569–579. doi: 10.1037/0022-3514.54.4.569

Baron, J., and Ritov, I. (2004). Omission bias, individual differences, and normality. Organ. Behav. Hum. Decis. Process. 94, 74–85. doi: 10.1016/j.obhdp.2004.03.003

Bazerman, M. H. (1998). Judgment in Managerial Decision Making. New York: Wiley.

Bazerman, M. H., and Moore, D. (2008). Judgment in Managerial Decision Making. 7th Edn. Hoboken, NJ: Wiley.

Benartzi, S., and Thaler, R. H. (1995). Myopic loss aversion and the equity premium puzzle. Q. J. Econ. 110, 73–92. doi: 10.2307/2118511

Ben-David, I., Graham, J., and Harvey, C. (2013). Managerial miscalibration. Q. J. Econ. 128, 1547–1584. doi: 10.1093/qje/qjt023

Berlin, L., and Hendrix, R. W. (1998). Perceptual errors and negligence. AJR Am. J. Roentgenol. 170, 863–867. doi: 10.2214/ajr.170.4.9530024

Berthet, V. (2021). The measurement of individual differences in cognitive biases: A review and improvement. Front. Psychol. 12:630177. doi: 10.3389/fpsyg.2021.630177

Blumenthal-Barby, J. S., and Krieger, H. (2015). Cognitive biases and heuristics in medical decision making: A critical review using a systematic search strategy. Med. Decis. Mak. 35, 539–557. doi: 10.1177/0272989X14547740

Bogner, M. S. (Ed.) (1994). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Boolell-Gunesh, S., Broihanne, M., and Merli, M. (2009). Disposition effect, investor sophistication and taxes: Some French specificities. Finance 30, 51–78. doi: 10.3917/fina.301.0051

Brenner, L. A., Koehler, D. J., Liberman, V., and Tversky, A. (1996). Overconfidence in probability and frequency judgments: A critical examination. Organ. Behav. Hum. Decis. Process. 65, 212–219. doi: 10.1006/obhd.1996.0021

Bruine de Bruin, W., Parker, A. M., and Fischhoff, B. (2007). Individual differences in adult decision-making competence. J. Pers. Soc. Psychol. 92, 938–956. doi: 10.1037/0022-3514.92.5.938

Bukszar, E., and Connolly, T. (1988). Hindsight bias and strategic choice: Some problems in learning from experience. Acad. Manag. J. 31, 628–641.

Bystranowski, P., Janik, B., Próchnicki, M., and Skórska, P. (2021). Anchoring effect in legal decision-making: A meta-analysis. Law Hum. Behav. 45, 1–23. doi: 10.1037/lhb0000438

Casper, J. D., Benedict, K., and Perry, J. L. (1989). Juror decision making, attitudes, and the hindsight bias. Law Hum. Behav. 13, 291–310. doi: 10.1007/BF01067031

Chapman, G. B., and Bornstein, B. H. (1996). The more you ask for, the more you get: anchoring in personal injury verdicts. Appl. Cogn. Psychol. 10, 519–540. doi: 10.1002/(SICI)1099-0720(199612)10:6<519::AID-ACP417>3.0.CO;2-5

Cheney, F. W., Posner, K., Caplan, R. A., and Ward, R. J. (1989). Standard of care and anesthesia liability. JAMA 261, 1599–1603. doi: 10.1001/jama.1989.03420110075027

Chuang, W.-I., and Lee, B.-S. (2006). An empirical evaluation of the overconfidence hypothesis. J. Bank. Financ. 30, 2489–2515. doi: 10.1016/j.jbankfin.2005.08.007

Cohen, M. D., March, J. G., and Olsen, J. P. (1972). A garbage can model of organizational choice. Adm. Sci. Q. 17, 1–25. doi: 10.2307/2392088

Coval, J. D., and Moskowitz, T. J. (1999). Home bias at home: local equity preference in domestic portfolios. J. Financ. 54, 2045–2073. doi: 10.1111/0022-1082.00181

Croskerry, P. (2003). The importance of cognitive errors in diagnosis and strategies to minimize them. Acad. Med. 78, 775–780. doi: 10.1097/00001888-200308000-00003

Croskerry, P., Singhal, G., and Mamede, S. (2013). Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual. Saf. 22(Suppl 2), 58–64. doi: 10.1136/bmjqs-2012-001712

Crowley, R. S., Legowski, E., Medvedeva, O., Reitmeyer, K., Tseytlin, E., Castine, M., et al. (2013). Automated detection of heuristics and biases among pathologists in a computer-based system. Adv. Health Sci. Educ. Theory Pract. 18, 343–363. doi: 10.1007/s10459-012-9374-z

Danziger, S., Levav, J., and Avnaim-Pesso, L. (2011). Extraneous factors in judicial decisions. Proc. Natl. Acad. Sci. 108, 6889–6892. doi: 10.1073/pnas.1018033108

Das, T. K., and Teng, B. (1999). Cognitive biases and strategic decision processes: An integrative perspective. J. Manag. Stud. 36, 757–778. doi: 10.1111/1467-6486.00157

Dawson, N. V., and Arkes, H. R. (1987). Systematic errors in medical decision making. J. Gen. Intern. Med. 2, 183–187. doi: 10.1007/BF02596149

De Bondt, W. F. (1998). A portrait of the individual investor. Eur. Econ. Rev. 42, 831–844. doi: 10.1016/S0014-2921(98)00009-9

De Bondt, W. F. M., and Thaler, R. (1985). Does the stock market overreact? J. Financ. 40, 793–805. doi: 10.1111/j.1540-6261.1985.tb05004.x

De Long, J. B., Shleifer, A., Summers, L. H., and Waldmann, R. J. (1990). Noise trader risk in financial markets. J. Polit. Econ. 98, 703–738.

Detmer, D. E., Fryback, D. G., and Gassner, K. (1978). Heuristics and biases in medical decision-making. J. Med. Educ. 53, 682–683.

Dhami, M. K., and Belton, I. K. (2017). On getting inside the judge's mind. Trans. Issues Psychol. Sci. 3, 214–226. doi: 10.1037/tps0000115

Diamond, S. S. (1997). Illuminations and shadows from jury simulations. Law Hum. Behav. 21, 561–571. doi: 10.1023/A:1024831908377

Diamond, S. S., Saks, M. J., and Landsman, S. (1998). Jurors judgments about liability and damages: sources of variability and ways to increase consistency. DePaul Law Rev. 48, 301–325.

Duhaime, I. M., and Schwenk, C. R. (1985). Conjectures on cognitive simplification in acquisition and divestment decision making. Acad. Manag. Rev. 10, 287–295. doi: 10.5465/amr.1985.4278207

Ebbesen, E. B., and Konecni, V. J. (1975). Decision making and information integration in the courts: The setting of bail. J. Pers. Soc. Psychol. 32, 805–821. doi: 10.1037/0022-3514.32.5.805

Eisenberg, T., Goerdt, J., Ostrom, B., Rottman, D., and Wells, M. T. (1997). The predictability of punitive damages. J. Leg. Stud. 26, 623–661. doi: 10.1086/468010

Eisenberg, T., Hannaford-Agor, P. L., Heise, M., LaFountain, N., Munsterman, G. T., Ostrom, B., et al. (2006). Juries, judges, and punitive damages: empirical analyses using the civil justice survey of state courts 1992, 1996, and 2001 data. J. Empir. Leg. Stud. 3, 263–295. doi: 10.1111/j.1740-1461.2006.00070.x

Eisenhardt, K. M., and Zbaracki, M. J. (1992). Strategic decision making. Strateg. Manag. J. 13, 17–37. doi: 10.1002/smj.4250130904

Elstein, A. S. (1999). Heuristics and biases: selected errors in clinical reasoning. Acad. Med. 74, 791–794. doi: 10.1097/00001888-199907000-00012

Englich, B., Mussweiler, T., and Strack, F. (2005). The last word in court – A hidden disadvantage for the defense. Law Hum. Behav. 29, 705–722. doi: 10.1007/s10979-005-8380-7

Englich, B., Mussweiler, T., and Strack, F. (2006). Playing dice with criminal sentences: the influence of irrelevant anchors on experts' judicial decision making. Personal. Soc. Psychol. Bull. 32, 188–200. doi: 10.1177/0146167205282152

Enough, B., and Mussweiler, T. (2001). Sentencing under uncertainty: anchoring effects in the courtroom. J. Appl. Soc. Psychol. 31, 1535–1551. doi: 10.1111/j.1559-1816.2001.tb02687.x

Findley, K. A., and Scott, M. S. (2006). The multiple dimensions of tunnel vision in criminal cases. Wis. Law Rev. 2, 291–398.

Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. J. Exp. Psychol. Hum. Percept. Perform. 1, 288–299.

Forrow, L., Taylor, W. C., and Arnold, R. M. (1992). Absolutely relative: how research results are summarized can affect treatment decisions. Am. J. Med. 92, 121–124. doi: 10.1016/0002-9343(92)90100-P

Gigerenzer, G. (1991). "How to make cognitive illusions disappear: Beyond 'heuristics and biases,'" in European Review of Social Psychology, Vol. 2. eds. W. Stroebe and M. Hewstone (Chichester: Wiley), 83–115.

Gigerenzer, G. (1996). On narrow norms and vague heuristics: A reply to Kahneman and Tversky. Psychol. Rev. 103, 592–596. doi: 10.1037/0033-295X.103.3.592

Gigerenzer, G., Hertwig, R., Hoffrage, U., and Sedlmeier, P. (2008). "Cognitive illusions reconsidered," in Handbook of Experimental Economics Results. eds. C. R. Plott and V. L. Smith (Amsterdam: Elsevier), 1018–1034. doi: 10.1016/S1574-0722(07)00109-6

Gilovich, T., Griffin, D., and Kahneman, D. (Eds.) (2002). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press.

Glaser, M., and Weber, M. (2007). Overconfidence and trading volume. Geneva Risk Insur. Rev. 32, 1–36. doi: 10.1007/s10713-007-0003-3

Graber, M. L. (2013). The incidence of diagnostic error in medicine. BMJ Qual. Saf. 22(Suppl 2), 21–27. doi: 10.1136/bmjqs-2012-001615

Graber, M. L., Franklin, N., and Gordon, R. (2005). Diagnostic error in internal medicine. Arch. Intern. Med. 165, 1493–1499. doi: 10.1001/archinte.165.13.1493

Kahneman, D., and Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica 47, 263–291. doi: 10.2307/1914185

Kamin, K. A., and Rachlinski, J. J. (1995). Ex post ≠ ex ante: determining liability in hindsight. Law Hum. Behav. 19, 89–104. doi: 10.1007/BF01499075

Kaustia, M., Alho, E., and Puttonen, V. (2008). How much does expertise reduce behavioral biases? The case of anchoring effects in stock return
Graber, M., Gordon, R., and Franklin, N. (2002). Reducing diagnostic errors estimates. Financ. Manag. 37, 391–412. doi: 10.1111/j.1755-053X.2008.00018.x
in medicine: what’s the goal? Acad. Med. 77, 981–992. doi: Kaye, D. (1979). Probability theory meets res Ipsa loquitur. Mich. Law Rev.
10.1097/00001888-200210000-00009 77, 1456–1484. doi: 10.2307/1288109
Graber, M. L., Kissam, S., Payne, V. L., Meyer, A. N., Sorensen, A., Lenfestey, N., Klein, J. G. (2005). Five pitfalls in decisions about diagnosis and prescribing.
et al. (2012). Cognitive interventions to reduce diagnostic error: a narrative BMJ 330, 781–783. doi: 10.1136/bmj.330.7494.781
review. BMJ Qual. Saf. 21, 535–557. doi: 10.1136/bmjqs-2011-000149 Klein, D. E., and Mitchell, G. (Eds.) (2010). The Psychology of Judicial Decision
Grinblatt, M., Titman, S., and Wermers, R. (1995). Momentum investment Making. New York, NY: Oxford University Press.
strategies, portfolio performance, and herding: A study of mutual fund Kukucka, J., Kassin, S. M., Zapf, P. A., and Dror, I. E. (2017). Cognitive bias
behavior. Am. Econ. Rev. 85, 1088–1105. and blindness: A global survey of forensic science examiners. J. Appl. Res.
Guthrie, C., Rachlinski, J. J., and Wistrich, A. J. (2001). Inside the judicial Mem. Cogn. 6, 452–459. doi: 10.1016/j.jarmac.2017.09.001
mind. Cornell Law Rev. 86, 777–830. doi: 10.2139/ssrn.257634 Kumar, S., and Goyal, N. (2015). Behavioural biases in investment decision
Guthrie, C., Rachlinski, J. J., and Wistrich, A. J. (2002). Judging by heuristic: making – A systematic literature review. Qual. Res. Financ. Markets 7,
cognitive illusions in judicial decision making. Judicature 86, 44–50. 88–108. doi: 10.1108/QRFM-07-2014-0022
Guthrie, C., Rachlinski, J., and Wistrich, A. J. (2007). Blinking on the bench: LaBine, S. J., and LaBine, G. (1996). Determinations of negligence and the
how judges decide cases. Cornell Law Rev. 93, 1–43. hindsight bias. Law Hum. Behav. 20, 501–516. doi: 10.1007/BF01499038
Hans, V. P., and Reyna, V. F. (2011). To dollars from sense: qualitative to Lidén, M., Gräns, M., and Juslin, P. (2019). ‘Guilty, no doubt’: detention
quantitative translation in jury damage awards. J. Empir. Leg. Stud. 8, 120–147. provoking confirmation bias in judges’ guilt assessments and debiasing
doi: 10.1111/j.1740-1461.2011.01233.x techniques. Psychol. Crime Law 25, 219–247. doi: 10.1080/1068316X.
Hardman, D., and Harries, C. (2002). How rational are we? Psychologist 15, 2018.1511790
76–79. Lusardi, A., and Mitchell, O. S. (2014). The economic importance of financial
Harley, E. M. (2007). Hindsight bias in legal decision making. Soc. Cogn. 25, literacy: theory and evidence. J. Econ. Lit. 52, 5–44.
48–63. doi: 10.1521/soco.2007.25.1.48 Lyles, M. A., and Thomas, H. (1988). Strategic problem formulation: biases
Hart, A. J., Evans, D. L., Wissler, R. L., Feehan, J. W., and Saks, M. J. (1997). and assumptions embedded in alternative decision-making models. J. Manag.
Injuries, prior beliefs, and damage awards. Behav. Sci. Law 15, 63–82. doi: Stud. 25, 131–145. doi: 10.1111/j.1467-6486.1988.tb00028.x
10.1002/(SICI)1099-0798(199724)15:1<63::AID-BSL254>3.0.CO;2-9 MacCoun, R. J. (1989). Experimental research on jury decision-making. Science
Hastie, R., Schkade, D. A., and Payne, J. W. (1999). Juror judgments in civil 244, 1046–1050. doi: 10.1126/science.244.4908.1046
cases: effects of plaintiff ’s requests and plaintiff ’s identity on punitive damage Malmendier, U., and Tate, G. (2005). CEO overconfidence and corporate
awards. Law Hum. Behav. 23, 445–470. doi: 10.1023/A:1022312115561 investment. J. Financ. 60, 2661–2700. doi: 10.1111/j.1540-6261.2005.00813.x
Helm, R. K., Wistrich, A. J., and Rachlinski, J. J. (2016). Are arbitrators human? Malmendier, U., and Tate, G. (2008). Who makes acquisitions? CEO overconfidence
J. Empir. Leg. Stud. 13, 666–692. doi: 10.1111/jels.12129 and the market’s reaction. J. Financ. Econ. 89, 20–43. doi: 10.1016/j.
Hershberger, P. J., Part, H. M., Markert, R. J., Cohen, S. M., and Finger, W. W. jfineco.2007.07.002
(1994). Development of a test of cognitive bias in medical decision making. Mamede, S., van Gog, T., van den Berge, K., Rikers, R. M., van Saase, J. L.,
Acad. Med. 69, 839–842. doi: 10.1097/00001888-199410000-00014 van Guldener, C., et al. (2010). Effect of availability bias and reflective
Hinsz, V. B., and Indahl, K. E. (1995). Assimilation to anchors for damage reasoning on diagnostic accuracy among internal medicine residents. JAMA
awards in a mock civil trial. J. Appl. Soc. Psychol. 25, 991–1026. doi: 304, 1198–1203. doi: 10.1001/jama.2010.1276
10.1111/j.1559-1816.1995.tb02386.x March, J. G., and Shapira, Z. (1987). Managerial perspectives on risk and risk
Hodgkinson, G. (2001). “Cognitive processes in strategic management: some taking. Manag. Sci. 33, 1404–1418. doi: 10.1287/mnsc.33.11.1404
emerging trends and future directions,” in Handbook of Industrial, Work March, J. G., and Simon, H. A. (1958). Organizations. New York: Wiley.
and Organizational Psychology Organizational Psychology. Vol. 2. eds. N. Maule, A. J., and Hodgkinson, G. P. (2002). Heuristics, biases and strategic
Anderson, D. S. Ones and H. K. Sinangil (London: SAGE Publications decision making. Psychologist 15, 68–71.
Ltd.), 416–440. Mintzberg, H. (1983). Power In and Around Organizations. Englewood Cliffs, N.J:
Hodgkinson, G. P., Bown, N. J., Maule, A. J., Glaister, K. W., and Pearman, A. D. Prentice-Hall.
(1999). Breaking the frame: an analysis of strategic cognition and decision Mohammed, S., and Schwall, A. (2012). Individual differences and decision
making under uncertainty. Strateg. Manag. J. 20, 977–985. doi: 10.1002/(SI making: what we know and where we go from here. Int. Rev. Ind. Organ.
CI)1097-0266(199910)20:10<977::AID-SMJ58>3.0.CO;2-X Psychol. 24, 249–312. doi: 10.1002/9780470745267.ch8
Huff, A. S., and Schwenk, C. (1990). “Bias and sensemaking in good times Moore, D. A., Oesch, J. M., and Zietsma, C. (2007). What competition? Myopic
and bad,” in Mapping Strategic Thought. ed. A. S. Huff (Ed.) (Chichester, self-focus in market-entry decisions. Organ. Sci. 18, 440–454. doi: 10.1287/
England: Wiley), 89–108. orsc.1060.0243
Johnson, G. (1987). Strategic Change and the Management Process. Oxford: Moore, D. A., and Schatz, D. (2017). The three faces of overconfidence. Soc.
Basil Blackwell. Personal. Psychol. Compass 11:e122331. doi: 10.1111/spc3.12331
Joyce, E., and Biddle, G. (1981). Anchoring and adjustment in probabilistic Morewedge, C. K., Yoon, H., Scopelliti, I., Symborski, C., Korris, J., and
inference in auditing. J. Account. Res. 19, 120–145. doi: 10.2307/2490965 Kassam, K. S. (2015). Debiasing decisions: improved decision making with
Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus a single training intervention. Policy Insights Behav. Brain Sci. 2, 129–140.
and Giroux. doi: 10.1177/2372732215600886
Kahneman, D., and Frederick, S. (2002). “Representativeness revisited: attribute Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many
substitution in intuitive judgment,” in Heuristics and Biases: The Psychology guises. Rev. Gen. Psychol. 2, 175–220. doi: 10.1037/1089-2680.2.2.175
of Intuitive Judgment. T. Gilovich, D. Griffin and D. Kahneman (Eds.) O’Brien, B. (2009). Prime suspect: An examination of factors that aggravate
(Cambridge: Cambridge University Press), 103–119. and counteract confirmation bias in criminal investigations. Psychol. Public
Kahneman, D., Schkade, D., and Sunstein, C. (1998). Shared outrage and erratic Policy Law 15, 315–334. doi: 10.1037/a0017881
awards: The psychology of punitive damages. J. Risk Uncertain. 16, 49–86. Odean, T. (1998). Are investors reluctant to realize their losses? J. Financ. 53,
doi: 10.1023/A:1007710408413 1775–1798. doi: 10.1111/0022-1082.00072
Kahneman, D., Slovic, P., and Tversky, A. (Eds.) (1982). Judgment Under Odean, T. (1999). Do Investors trade too much? Am. Econ. Rev. 89, 1279–1298.
Uncertainty: Heuristics and Biases. New York: Cambridge University Press. doi: 10.1257/aer.89.5.1279
Oeberst, A., and Goeckenjan, I. (2016). When being wise after the event results Shefrin, H. (2000). Beyond Greed and Fear: Understanding Behavioral Finance
in injustice: evidence for hindsight bias in judges’ negligence assessments. and the Psychology of Investing. Boston: Harvard Business School Press.
Psychol. Public Policy Law 22, 271–279. doi: 10.1037/law0000091 Shefrin, H., and Statman, M. (1985). The disposition to sell winners too early
Ogdie, A. R., Reilly, J. B., Pang, W. G., Keddem, S., Barg, F. K., Von Feldt, J. M., and ride losers too long: theory and evidence. J. Financ. 40, 777–790. doi:
et al. (2012). Seen through their eyes: residents’ reflections on the cognitive 10.1111/j.1540-6261.1985.tb05002.x
and contextual components of diagnostic errors in medicine. Acad. Med. Shiller, R. J. (2003). From efficient markets theory to behavioral finance. J.
87, 1361–1367. doi: 10.1097/ACM.0b013e31826742c9 Econ. Perspect. 17, 83–104. doi: 10.1257/089533003321164967
Parker, A. M., and Fischhoff, B. (2005). Decision-making competence: external Stanovich, K. E., Toplak, M. E., and West, R. F. (2008). The development of
validation through an individual-differences approach. J. Behav. Decis. Mak. rational thought: a taxonomy of heuristics and biases. Adv. Child Dev. Behav.
18, 1–27. doi: 10.1002/bdm.481 36, 251–285. doi: 10.1016/S0065-2407(08)00006-2
Peer, E., and Gamliel, E. (2013). Heuristics and biases in judicial decisions. Stanovich, K. E., West, R. F., and Toplak, M. E. (2011). “Individual differences
Court Rev. 49, 114–118. as essential components of heuristics and biases research,” in The Science
Perneger, T. V., and Agoritsas, T. (2011). Doctors and patients’ susceptibility of Reason: A Festschrift for Jonathan St B. T. Evans. K. Manktelow, D. Over
to framing bias: A randomized trial. J. Gen. Intern. Med. 26, 1411–1417. and S. Elqayam (Eds.) (New York: Psychology Press), 355–396.
doi: 10.1007/s11606-011-1810-x Statman, M., Thorley, S., and Vorkink, K. (2006). Investor overconfidence and
Pohl, R. F. (2017). “Cognitive illusions,” in Cognitive Illusions: Intriguing Phenomena trading volume. Rev. Financ. Stud. 19, 1531–1565. doi: 10.1093/rfs/hhj032
in Thinking, Judgment and Memory (London; New York, NY: Routledge/ Stiegler, M. P., and Ruskin, K. J. (2012). Decision-making and safety in
Taylor&Francis Group), 3–21. anesthesiology. Curr. Opin. Anaesthesiol. 25, 724–729. doi: 10.1097/
Powell, T. C., Lovallo, D., and Fox, C. (2011). Behavioral strategy. Strateg. ACO.0b013e328359307a
Manag. J. 32, 1369–1386. doi: 10.1002/smj.968 Talpsepp, T. (2011). Reverse disposition effect of foreign investors. J. Behav.
Rachlinski, J. J. (2006). Cognitive errors, individual differences, and paternalism. Financ. 12, 183–200. doi: 10.1080/15427560.2011.606387
Univ. Chicago Law Rev. 73, 207–229. doi: 10.1093/acprof:oso/9780199211395. Tversky, A., and Kahneman, D. (1973). Availability: A heuristic for judging
003.0008 frequency and probability. Cogn. Psychol. 5, 207–232. doi: 10.1016/0010-
Rachlinski, J. J. (2018). “Judicial decision-making,” in Behavioral Law and 0285(73)90033-9
Economics. E. Zamir and D. Teichman (Eds.) (New York, NY: Oxford Tversky, A., and Kahneman, D. (1974). Judgment under uncertainty: heuristics
University Press), 525–565. and biases. Science 185, 1124–1131. doi: 10.1126/science.185.4157.1124
Rachlinski, J. J., Guthrie, C., and Wistrich, A. J. (2011). Probable cause, probability, Tversky, A., and Kahneman, D. (1981). The framing of decisions and the
and hindsight. J. Empir. Leg. Stud. 8, 72–98. doi: 10.1111/j.1740- psychology of choice. Science 211, 453–458. doi: 10.1126/science.7455683
1461.2011.01230.x Vranas, P. B. M. (2000). Gigerenzer’s normative critique of Kahneman and
Rachlinski, J. J., Guthrie, C., and Wistrich, A. J. (2013). How lawyers’ intuitions Tversky. Cognition 76, 179–193. doi: 10.1016/S0010-0277(99)00084-0
prolong litigation. South. Calif. Law Rev. 86, 571–636. Wears, R. L., and Nemeth, C. P. (2007). Replacing hindsight with insight:
Rachlinski, J. J., and Wistrich, A. J. (2017). Judging the judiciary by the numbers: toward better understanding of diagnostic failures. Ann. Emerg. Med. 49,
empirical research on judges. Ann. Rev. Law Soc. Sci. 13, 203–229. doi: 206–209. doi: 10.1016/j.annemergmed.2006.08.027
10.1146/annurev-lawsocsci-110615-085032 Weinshall-Margel, K., and Shapard, J. (2011). Overlooked factors in the analysis
Rachlinski, J. J., and Wistrich, A. J. (2018). Gains, losses, and judges: framing of parole decisions. Proc. Natl. Acad. Sci. 108:E833. doi: 10.1073/pnas.1110910108
and the judiciary. Notre Dame Law Rev. 94, 521–582. Wissler, R. L., Hart, A. J., and Saks, M. J. (1999). Decision-making about
Rachlinski, J., Wistrich, A., and Guthrie, C. (2015). Can judges make reliable general damages: A comparison of jurors, judges, and lawyers. Mich. Law
numeric judgments? Distorted damages and skewed sentences. Indiana Law Rev. 98, 751–826. doi: 10.2307/1290315
J. 90, 695–739. Zajac, E. J., and Bazerman, M. H. (1991). Blind spots in industry and competitor
Redelmeier, D. A. (2005). The cognitive psychology of missed diagnoses. Ann. analysis: implications of interfirm (mis)perceptions for strategic decisions.
Intern. Med. 142, 115–120. doi: 10.7326/0003-4819-142-2-200501180-00010 Acad. Manag. Rev. 16, 37–56. doi: 10.5465/amr.1991.4278990
Robbennolt, J. K., and Studebaker, C. A. (1999). Anchoring in the courtroom: Zamir, E., and Ritov, I. (2012). Loss aversion, omission bias, and the burden
The effects of caps on punitive damages. Law Hum. Behav. 23, 353–373. of proof in civil litigation. J. Leg. Stud. 41, 165–207. doi: 10.1086/664911
doi: 10.1023/A:1022312716354 Zwaan, L., Monteiro, S., Sherbino, J., Ilgen, J., Howey, B., and Norman, G.
Saposnik, G., Redelmeier, D., Ruff, C. C., and Tobler, P. N. (2016). Cognitive (2017). Is bias in the eye of the beholder? A vignette study to assess
biases associated with medical decisions: a systematic review. BMC Med. recognition of cognitive biases in clinical case workups. BMJ Qual. Saf. 26,
Inform. Decis. Mak. 6:138. doi: 10.1186/s12911-016-0377-1 104–110. doi: 10.1136/bmjqs-2015-005014
Schmitt, B. P., and Elstein, A. S. (1988). Patient management problems: heuristics
and biases. Med. Decs. Making 8, 224–225. Conflict of Interest: The author declares that the research was conducted in
Schnapp, B. H., Sun, J. E., Kim, J. L., Strayer, R. J., and Shah, K. H. (2018). the absence of any commercial or financial relationships that could be construed
Cognitive error in an academic emergency department. Diagnosis 5, 135–142. as a potential conflict of interest.
doi: 10.1515/dx-2018-0011
Schwenk, C. R. (1982). Dialectical inquiry in strategic decision-making: A comment Publisher’s Note: All claims expressed in this article are solely those of the
on the continuing debate. Strateg. Manag. J. 3, 371–373. doi: 10.1002/smj.4250030408 authors and do not necessarily represent those of their affiliated organizations,
Schwenk, C. R. (1984). Cognitive simplification processes in strategic decision- or those of the publisher, the editors and the reviewers. Any product that may
making. Strateg. Manag. J. 5, 111–128. doi: 10.1002/smj.4250050203 be evaluated in this article, or claim that may be made by its manufacturer, is
Schwenk, C. R. (1985). Management illusions and biases: their impact on not guaranteed or endorsed by the publisher.
strategic decisions. Long Range Plan. 18, 74–80. doi: 10.1016/0024-6301
(85)90204-3 Copyright © 2022 Berthet. This is an open-access article distributed under the
Schwenk, C. R. (1988). The cognitive perspective on strategic decision making. terms of the Creative Commons Attribution License (CC BY). The use, distribution
J. Manag. Stud. 25, 41–55. doi: 10.1111/j.1467-6486.1988.tb00021.x or reproduction in other forums is permitted, provided the original author(s) and
Sellier, A. L., Scopelliti, I., and Morewedge, C. K. (2019). Debiasing training the copyright owner(s) are credited and that the original publication in this journal
improves decision making in the field. Psychol. Sci. 30, 1371–1379. doi: is cited, in accordance with accepted academic practice. No use, distribution or
10.1177/0956797619861429 reproduction is permitted which does not comply with these terms.