ABSTRACT: The field of cognitive engineering and decision making (CEDM) has grown
rapidly in recent decades. At this writing, it is the largest technical interest group within
the Human Factors and Ergonomics Society. Work that falls into this area of research
and study is also widely practiced across Europe and in countries around the world.
Along with this growth, there is a significant need for a peer-reviewed journal that
focuses specifically on CEDM as a science and applied discipline. The Journal of Cognitive
Engineering and Decision Making is intended to meet that need. It will cover current
research, theory, and practice in ways that not only provide for the sharing of informa-
tion across interested parties but also serve to move the field forward. This will advance
theoretical bases; address outstanding scientific challenges; set new courses and direc-
tions; address methods, measurement, and methodological issues; and show useful
applications of this work in the development of operational and training systems in many
domains that are of significance to society, government, and business. In this article,
we provide background on the CEDM field and its areas of research. We use this as a
platform for further specifying the scope of this journal along three technical tracks.
Objectives are identified for each track as it supports the overall mission of the journal.
Finally, we provide information on an electronic companion to the journal that is intended
to support advances in the field through dialogue and access to further resources.
ADDRESS CORRESPONDENCE TO: Mica Endsley, SA Technologies, 4731 E. Forest Peak, Marietta, GA
30066, mica@satechnologies.com. Visit the JCEDM Online Companion at http://cedm.webexone.com.
Journal of Cognitive Engineering and Decision Making, Volume 1, Number 1, Spring 2007, pp. 1–21
©2007 Human Factors and Ergonomics Society. All rights reserved.
Hollnagel and Woods (2005) emphasize that the focus of CSE is on analyzing what
a joint cognitive system does and why it does it – in terms of the external constraints
and affordances that shape performance – rather than explaining how it does it at a
detailed cognitive processing level. This emphasis on a macrolevel analysis of the
joint cognitive system and the forces that shape its performance contrasts with some
other approaches within the CEDM field, which work to provide detailed descrip-
tions of the cognitive processes and functions that underlie performance.
These processes and functions parallel the process studies of traditional psychology (e.g., memory and attention) and are thought to be the basis for causal descriptions of how fundamental mental operations trigger one another.
Mental Constructs
A significant amount of work in CEDM has also been conducted by a commu-
nity of practice that focuses on exploring and enhancing key mental constructs such
as situation awareness, mental workload, and mental models – all of which are seen
as fundamental contributors to effective human functioning in complex systems.
The mental construct area of CEDM includes work consistent with information-
processing models of human cognition (Wickens, 1992) but also embraces the wider
range of processes and functions examined in NDM and macrocognition research.
It contrasts with the CSE and EID approaches for describing and predicting human
interaction with complex systems by dealing more with internal aspects of cognition
and how they affect systems design.
Mental models are a cognitive construct that remains a focus of much CEDM
literature (Bainbridge, 1981; Johnson-Laird, 1980; Rasmussen, 1981; Reason, 1988;
Rouse & Morris, 1985). Capturing and representing the mental models of individuals has proven difficult, yet mental models are believed to play a central role in guiding human interaction with complex systems. Although CEDM often advocates designing systems to support people's mental models, the difficulty of specifying those models a priori has made this hard to put into practice.
Although the concept of mental effort dates from early experimental psychology, the study of mental workload became a major focus in the 1970s, particularly the measurement and reduction of mental workload in complex and demanding environments such as aviation (Hancock & Meshkati, 1988; Hart & Sheridan, 1984; Moray, 1979).
In addition to work that addresses automation approaches to reducing mental work-
load, current work in CEDM often focuses on the design of displays and multimodal
interface approaches based on multiple resource theory (Wickens, 1992) as a means
to reduce mental workload and improve the design of systems.
Vigilance in low-workload environments has also remained a research focus for the CEDM field, even though its roots can be traced back many decades (Mackworth, 1948). This work continues today in areas such as security screening, aviation, and military systems. The effect of automation on vigilance and complacency has kept this stalwart of human factors currently relevant.
Beginning in the late 1980s, a focus on situation awareness (SA) began to emerge
as a key cognitive construct of interest, springing from the terminology and chal-
lenges of the aviation field. Situation awareness, an operator’s mental representa-
tion of the world around him at any given time, forms a key construct that largely
guides moment-to-moment decision making and performance in complex systems
(Endsley, 1988). Situation awareness differs from mental models in its emphasis on
the dynamic and changing situational features that an operator must keep up with.
By contrast, mental models evolve more slowly than situation awareness, which can
change from moment to moment.
Research on situation awareness has examined how people develop and main-
tain accurate and up-to-date mental representations of the systems they operate and
the world in which they and their systems operate, as well as designing systems that
support that critical construct. In a sense, NDM picks up where SA leaves off, dealing with how decisions are made on the basis of SA; this makes SA research largely complementary to NDM's focus on decision making. Approaches to SA research have a variety
of theoretical ties (Adams, Tenney, & Pew, 1995; Durso & Gronlund, 1999; Endsley,
1988, 1995b). Often employing a more experimental approach than NDM or CSE,
work in situation awareness, like that on mental workload, has also focused on meas-
urement as a means of furthering research on the topic and evaluation of system
designs (Endsley, 1995a; Endsley & Garland, 2000b).
Under the premise that a key means of improving decision making and perform-
ance in complex, dynamic systems lies in improving and supporting operator situ-
ation awareness, SA-oriented design (SAOD; Endsley, Bolte, & Jones, 2003) has been
developed to create systems that enhance operator SA. This approach is based on
Endsley’s theoretical model of SA and the body of research on SA conducted across
many domains. SAOD provides a methodology for design that is mapped to large-scale
Automation
As our previous discussion makes clear, all of the communities of practice that
CEDM represents focus (at least in part) on the design of better information technolo-
gies to support cognitive work. This includes those who work extensively on auto-
mated systems and decision support systems (DSS), to cite two examples. Although
complex technologies and systems underlie the work that is studied in much of the
CEDM field, automation and DSS pose a particular challenge for human cognition.
They form unique (semi-) intelligent machines of their own with which operators
must interact in order to accomplish their work, thereby creating unique challenges
for the operator, who must understand and effectively interact with the automation
(Wiener & Curry, 1980). This can often create new workload of a different type, which can be as demanding as the workload it replaces (Bainbridge, 1983; Wiener, 1988). Automation use can lead to unique problems in understanding the automation and can force the operator to develop creative approaches for dealing with it – approaches that are seldom what the developer intended (Koopman & Hoffman, 2003; Parasuraman & Riley, 1997; Sarter & Woods, 1995).
In response to these challenges, researchers in the field have focused on automa-
tion reliability and trust (Lee & Moray, 1992; Riley, 1994), on determining appropri-
ate levels of automation in order to avoid out-of-the-loop problems and to maintain
operator SA (Endsley & Kaber, 1999; Endsley & Kiris, 1995; Kaber & Endsley, 1997;
Parasuraman, Sheridan, & Wickens, 2000), and on approaches to adaptive automa-
tion that result in effective transfers of control between humans and automation (Kaber
& Endsley, 2004; Kaber & Riley, 1999; Parasuraman, 1993; Scerbo, 1996). This
broad corpus of work continues to expand as engineers develop different and more
sophisticated forms of automation. These include decision support systems and intel-
ligent agents of various kinds for a wide variety of domains, including medical sys-
tems, driving, aviation, and process control. Similar to work in the mental constructs
area, most work on automation has featured experimentation in both field settings
and simulations, although observations in domain settings have also been prominent.
Collaborative Work
A more recent focus in CEDM has been on the cognitive work of people in
teams, either colocated or distributed, who must interact and collaborate to accom-
plish their tasks. Although there is a large body of research on teams similar to that
on individuals, much of it is derived from laboratory studies with contrived tasks
and teams of undergraduate students (McGrath, 1991). The focus on collaborative
work in the CEDM field has been, rather, on how actual teams perform tasks in com-
plex, real-world settings such as aviation, medicine, and command and control
(Bolstad, Riley, Jones, & Endsley, 2002; Klein, Zsambok, & Thordsen, 1993; Orasanu,
1990; Salas, Dickinson, Converse, & Tannenbaum, 1992; Xiao, Mackenzie, & Patey,
1998). The processes and dynamics observed in these situations are often different
from those observed in laboratory studies because of real-world circumstances and
the expertise of the operators. This growing body of research clearly expands CEDM
toward the development of training and technologies to support team collabora-
tion (Bolstad & Endsley, 2005; Potter & Balthazard, 2002; Prince & Salas, 2000;
Sonnenwald & Pierce, 2000). This work incorporates a wide variety of observation-
al and experimental methods and actively contributes to the development of train-
ing programs as well as collaborative tools and shared displays for supporting shared
situation awareness and team performance.
Summary
As we have shown, these various branches of the CEDM field come together
and complement one another in often serendipitous ways. As the same phenomena
are viewed through a multitude of perspectives, the results form a mosaic that can be
pieced together to form a more complete guide for systems design and training appli-
cations than is possible through any single approach.
Future Directions
Although the foregoing account depicts the origin of CEDM and how it is prac-
ticed today, it is also worth discussing where the field should be headed in the future,
given that a key goal for the Journal of Cognitive Engineering and Decision Making is
to advance the science and its applications. First, though much of CEDM developed
from dissatisfaction with the tools, concepts, and methods of previous engineering
psychology and decision-making research, as well as the sterility of the psychology
laboratory, it is not enough to be against something. To move forward as a field of
research, CEDM must provide methods for analysis, design, and evaluation that pro-
vide guidance for improving human performance in these complex domains.
In many cases, the groundwork for these tools and methods is largely in place;
there are a variety of cognitive task analysis and modeling methods, for instance, and
experimental approaches and measures are widely available. In other areas the field still needs to move forward. For example, it must define what constitutes acceptable descriptive studies of cognitive phenomena, just as other areas of science (such as anthropology) have done for their fields.
Another major area for growth will be the need to more clearly delineate how
to translate the many theories and findings of CEDM research into effective system
designs for supporting these cognitive processes. If that translation remains largely
subjective, varying from designer to designer and nongeneralizable from system to
system, then we will have failed at our central mission. For CEDM to move forward
as an effective force in system design, its theories and research must be specifiable
into clearly articulated and repeatable design guidance. Likewise, there is a need to
develop systematic approaches for translating static measures of learning and dy-
namic measures of performance that are generated by computational models into
specific methods for determining usability in interface and system design.
It is also evident that the domains of CEDM practice have expanded over the
past decade. Early concentrations featured work in aviation, air traffic control and pro-
cess control, with more recent work also encompassing the medical field, command
and control in both military and commercial applications, and unmanned vehicle
control. This shift most likely represents a broadening of the domains that are seek-
ing CEDM solutions rather than a shift in the domains that CEDM researchers think
are important.
It is likely that the domains of interest will continue to grow and change in large-
ly unpredictable ways. One of the great strengths of CEDM is its applicability across
many seemingly disparate areas, from the control of unmanned aerial vehicles to
diabetics’ self-monitoring of glucose levels. This breadth creates an opportunity: in new contexts we will find new types of cognitive phenomena and unique challenges that can be solved with CEDM approaches. It also allows us to explore the boundaries of our current methods and theories, and it provides an advantageous path for evolution and for bridging gaps between the various communities of practice that make up CEDM.
TABLE 1. Dimensions that typify CEDM research (after Hoffman & Deffenbacher, 1993)

The Relation of Methods to the Ecology of Cognitive Work

Ecological validity: Materials, tasks, and settings present events in a way that preserves their natural forms and the natural covariation of dimensions or cues.
Ecological relevance: Materials or tasks involve things that people actually perceive or do.
Ecological salience: Materials or tasks involve important things that people actually perceive or do.
Ecological representativeness: Materials or tasks involve things that people often perceive or do.
• Papers must fit the types of research considered by the Studies in Simulations
and Synthetic Environments track.
• Papers must specifically address the success or failure of an approach and
provide explanations.
• Papers must make clear the new insight provided into one of the areas of
CEDM research and how its results compare with previous research, as well
as any broader impact for advancing CEDM practice.
• Papers must provide a validation section on any new methods or an example
of the application of new theories.
Beyond these criteria, papers must provide a clear rationale for design decisions,
and the stages of the design process must flow logically. The outcomes of the re-
search must be directly linked to specific CEDM research needs. The manuscript
must demonstrate how any new simulation or synthetic environment will advance
CEDM research.
For empirical studies, papers must present a concise set of hypotheses that moti-
vate the specific experimental manipulations in a simulation. Results of the experi-
ment must be linked back to the hypotheses through concise discussion. Theory
and research modeling manuscripts must summarize a corpus of simulation or syn-
thetic environment-based studies in an area of CEDM research that supports the new
general theory. That is, all theories must be given a pedigree in existing related CEDM
theories. Manuscripts of this nature must also provide at least one detailed example
of how the new theory may serve to identify underlying factors in CEDM research
problems or explain human information processing in the context of interaction with
complex systems.
The electronic or online companion (described later) can also be used as a re-
source by authors who submit their work to the Studies in Simulations and Synthetic
Environments track, particularly for presenting dynamic content, such as videos of
simulations used in experiments. As for the Design of Complex and Joint Cognitive
Systems track, we encourage potential authors to consider the use of the electronic
companion to JCEDM.
Evaluation criteria for papers submitted to the design track will necessarily dif-
fer from evaluation criteria traditionally used in the human factors field to evaluate
research using conventional experimental design methods. Criteria for evaluating
design papers include the originality of the presented concepts, how well argued and rigorously supported the claims are, and how significant the contribution is to
CEDM theory and practice (see also Table 1). In the case of papers that present inno-
vative designs, additional evaluation criteria are reflected in these questions:
• Are the proposed design concepts grounded in an analysis of the domain and
work context?
• Is persuasive evidence provided for the effectiveness of the design? (This can
include laboratory or field study results.)
• Are the results generalizable beyond the specific application?
• (In the case of new methods, models, or tools for design) How innovative is
the approach? What evidence is provided that the method contributes to suc-
cessful design? (This can include presentation of illustrative case studies of its
application or formal evaluations.)
• (In the case of papers that present literature syntheses and new theory) Does
it address an important gap in the theory or practice of design? Is it likely to
stimulate dialogue among researchers and practitioners so as to advance the state
of the art?
This unique feature allows the Web site and JCEDM to jointly serve as a central
hub for researchers in this field. It is up to those within CEDM to supply the con-
tent and energies that will make it successful. It is a movable landscape and one that
will adapt as the field changes and grows.
What’s Next?
We invite you to enjoy this inaugural issue of the Journal of Cognitive Engineering
and Decision Making. We want you to make this journal your journal. Information on
subscribing to JCEDM and on submitting manuscripts for publication is provided
inside the printed issue (or at http://www.hfes.org/Publications/ProductDetail.aspx?
ProductID=64). If you have ideas or recommendations for JCEDM, please contact
us by e-mail or through the online companion Web site, http://cedm.webexone.com.
As editors of the journal, we stand ready, along with the JCEDM online forum, to receive your contributions to the CEDM field.
Acknowledgments
The contribution of the first two authors was made possible partly through par-
ticipation in the Advanced Decision Architectures Collaborative Technology Alliance,
sponsored by the U.S. Army Research Laboratory under Cooperative Agreement
DAAD19-01-2-0009.
References
Adams, M. J., Tenney, Y. J., & Pew, R. W. (1995). Situation awareness and the cognitive manage-
ment of complex systems. Human Factors, 37, 85–104.
Allender, L., Kelley, T., Archer, S., & Adkins, R. (1997). IMPRINT: The transition and further
development of a soldier-system analysis tool. MANPRINT Quarterly, 5(1), 1–7.
Anderson, J. R., & Lebiere, C. (1998). Atomic components of thought. Mahwah, NJ: Erlbaum.
Bainbridge, L. (1981). Mathematical equations or processing routines? In J. Rasmussen & W. B.
Rouse (Eds.), Human detection and diagnosis of systems failures (pp. 259–286). New York:
Plenum.
Bainbridge, L. (1983). Ironies of automation. Automatica, 19, 775–779.
Bolstad, C. A., & Endsley, M. R. (2005). Choosing team collaboration tools: Lessons learned from
disaster recovery efforts. Ergonomics in Design, 13(4), 7–13.
Bolstad, C. A., Riley, J. M., Jones, D. G., & Endsley, M. R. (2002). Using goal-directed task analy-
sis with Army brigade officer teams. Proceedings of the Human Factors and Ergonomics Society
46th Annual Meeting (pp. 472–476). Santa Monica, CA: Human Factors and Ergonomics
Society.
Burns, C. M., & Hajdukiewicz, J. R. (2004). Ecological interface design. Boca Raton, FL: CRC Press.
Card, S. K., Moran, T. P., & Newell, A. (1983). The psychology of human-computer interaction.
Mahwah, NJ: Erlbaum.
Crandall, B., Klein, G., & Hoffman, R. R. (2006). Working minds: A practitioner’s guide to cognitive
task analysis. Cambridge, MA: MIT Press.
de Groot, A. D. (1965). Thought and choice in chess. The Hague, Netherlands: Mouton.
Dreyfus, H. L. (1979). What computers can’t do: The limits of artificial intelligence (2nd ed.). New
York: Harper & Row.
Dreyfus, S. E. (1981). Formal models vs. human situational understanding: Inherent limitations on the
modeling of business expertise (No. ORC 81-3). Berkeley: Operations Research Center, Uni-
versity of California.
Durso, F. T., & Gronlund, S. D. (1999). Situation awareness. In F. T. Durso, R. Nickerson, R.
Schvaneveldt, S. Dumais, S. Lindsay, & M. Chi (Eds.), Handbook of applied cognition (pp.
284–314). New York: Wiley.
Endsley, M. R. (1988). A construct and its measurement: The functioning and evaluation of pilot situ-
ation awareness (No. NOR DOC 88-30). Hawthorne, CA: Northrop Corporation.
Endsley, M. R. (1995a). Measurement of situation awareness in dynamic systems. Human Factors,
37, 65–84.
Endsley, M. R. (1995b). Toward a theory of situation awareness in dynamic systems. Human Factors,
37, 32–64.
Endsley, M. R., Bolte, B., & Jones, D. G. (2003). Designing for situation awareness: An approach to
human-centered design. London: Taylor & Francis.
Endsley, M. R., & Garland, D. G. (2000a). Pilot situation awareness training in general aviation.
Proceedings of the 14th Triennial Congress of the International Ergonomics Association and the 44th
Annual Meeting of the Human Factors and Ergonomics Society (pp. 2.357–2.360). Santa Monica,
CA: Human Factors and Ergonomics Society.
Endsley, M. R., & Garland, D. J. (Eds.). (2000b). Situation awareness analysis and measurement.
Mahwah, NJ: Erlbaum.
Endsley, M. R., & Kaber, D. B. (1999). Level of automation effects on performance, situation aware-
ness and workload in a dynamic control task. Ergonomics, 42, 462–492.
Endsley, M. R., & Kiris, E. O. (1995). The out-of-the-loop performance problem and level of con-
trol in automation. Human Factors, 37, 381–394.
Endsley, M. R., & Robertson, M. M. (2000). Training for situation awareness in individuals and
teams. In M. R. Endsley & D. J. Garland (Eds.), Situation awareness analysis and measurement.
Mahwah, NJ: Erlbaum.
Flach, J. M. (1990). The ecology of human-machine systems I: Introduction. Ecological Psychology,
2(3), 191–205.
Klein, G. A., Calderwood, R., & Clinton-Cirocco, A. (1986). Rapid decision making on the fire
ground. Proceedings of the Human Factors Society 30th Annual Meeting (pp. 576–580). Santa
Monica, CA: Human Factors and Ergonomics Society.
Klein, G. A., Zsambok, C. E., & Thordsen, M. L. (1993, April). Team decision training: Five myths
and a model. Military Review, 36–42.
Koopman, P., & Hoffman, R. R. (2003, November/December). Work-arounds, make-work, and kludges. IEEE Intelligent Systems, 70–75.
Kuhn, T. (1970). The structure of scientific revolutions (2nd ed.). Chicago: University of Chicago Press.
Lee, J., & Moray, N. (1992). Trust, control strategies and allocation of function in human-
machine systems. Ergonomics, 35(10), 1243–1270.
Mackworth, N. H. (1948). The breakdown of vigilance during prolonged visual search. Quarterly Journal of Experimental Psychology, 1, 6–21.
McGrath, J. E. (1991). Time, interaction and performance (TIP): A theory of groups. Small Group
Research, 22, 147–174.
Meyer, D. E., & Kieras, D. E. (1997). A computational theory of executive control processes and
human multiple task performance: Part 1: Basic mechanisms. Psychological Review, 104, 3–65.
Mintzberg, H. (1973). The nature of managerial work. New York: Harper and Row.
Moray, N. (Ed.). (1979). Mental workload: Its theory and measurement. New York: Plenum.
Newell, A. (1990). Unified theories of cognition. Cambridge, MA: Harvard University Press.
Norman, D. A. (1981). Steps towards a cognitive engineering (Tech. Report). San Diego: University
of California, Program in Cognitive Science.
Norman, D. A. (1986). Cognitive engineering. In D. A. Norman & S. W. Draper (Eds.), User-
centered system design (pp. 31–61). Mahwah, NJ: Erlbaum.
Orasanu, J. (1990, July). Shared mental models and crew decision making. Proceedings of the 12th
Annual Conference of the Cognitive Science Society. Cambridge, MA: Cognitive Science Society.
Parasuraman, R. (1993). Effects of adaptive function allocation on human performance. In D. J.
Garland & J. A. Wise (Eds.), Human factors and advanced aviation technologies (pp. 147–158).
Daytona Beach, FL: Embry-Riddle Aeronautical University Press.
Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse and abuse.
Human Factors, 39, 230–253.
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model of types and levels of human
interaction with automation. IEEE Transactions on Systems, Man and Cybernetics, 30, 286–297.
Potter, R. E., & Balthazard, P. A. (2002). Virtual team interaction styles: Assessment and effects.
International Journal of Human-Computer Studies, 56, 423–443.
Potter, S. S., Elm, W. C., Roth, E. M., Gualtieri, J., & Easter, J. (2002). Bridging the gap between
cognitive analysis and effective decision aiding. In M. D. McNeese & M. A. Vidulich (Eds.),
State of the art report (SOAR): Cognitive systems engineering in military aviation environments:
Avoiding cogminutia fragmentosa (pp. 137–168).Wright-Patterson AFB, OH: Human Systems
Information Analysis Center.
Prince, C., & Salas, E. (2000). Team situation awareness, errors, and crew resource management:
Research integration for training guidance. In M. R. Endsley & D. J. Garland (Eds.), Situation
awareness analysis and measurement (pp. 325–347). Mahwah, NJ: Erlbaum.
Rasmussen, J. (1981). Models of mental strategies in process plant diagnosis. In J. Rasmussen &
W. B. Rouse (Eds.), Human detection and diagnosis of system failures (pp. 241–258). New York:
Plenum.
Rasmussen, J., Pejtersen, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York:
Wiley.
Rasmussen, J., & Vicente, K. J. (1989). Coping with human errors through system design: Impli-
cations for ecological interface design. International Journal of Man-Machine Studies, 31,
517–534.
Reason, J. (1988). Framework models for human performance and error: A consumer’s guide.
In L. P. Goodstein, H. B. Andersen, & S. E. Olsen (Eds.), Tasks, errors and mental models (pp.
35–49). London: Taylor & Francis.
Riley, J. M., Kaber, D. B., Hyatt, J., Sheik-Nainar, M., Reynolds, J., & Endsley, M. (2005). Measures
for assessing situation awareness in virtual environment training of infantry squads (Final Report
No. SATech-05-03). Marietta, GA: SA Technologies.
Riley, V. (1994). A theory of operator reliance on automation. In M. Mouloua & R. Parasuraman
(Eds.), Human performance in automated systems: Current research and trends (pp. 8–14).
Mahwah, NJ: Erlbaum.
Rouse, W. B., & Morris, N. M. (1985). On looking into the black box: Prospects and limits in the search
for mental models (No. DTIC #AD-A159080). Atlanta, GA: Center for Man-Machine Systems
Research, Georgia Institute of Technology.
Salas, E., Dickinson, T. L., Converse, S., & Tannenbaum, S. I. (1992). Toward an understanding of
team performance and training. In R. W. Swezey & E. Salas (Eds.), Teams: Their training and
performance (pp. 3–29). Norwood, NJ: Ablex.
Salvucci, D. D., & Lee, F. J. (2003). Simple cognitive modeling in a complex cognitive architecture.
Proceedings of the Human Factors in Computing Systems: CHI 2003 (pp. 189–194). New York:
ACM Press.
Sanders, M. S., & McCormick, E. J. (1992). Human factors in engineering and design (7th ed.). New
York: McGraw-Hill.
Sarter, N. B., & Woods, D. D. (1995). “How in the world did I ever get into that mode”: Mode
error and awareness in supervisory control. Human Factors, 37, 5–19.
Scerbo, M. W. (1996). Theoretical perspectives on adaptive automation. In R. Parasuraman &
M. Mouloua (Eds.), Automation and human performance: Theory and application (pp. 37–63).
Mahwah, NJ: Erlbaum.
Sheridan, T. (1992). Telerobotics, automation and human supervisory control. Cambridge, MA: MIT
Press.
Sonnenwald, D. H., & Pierce, L. G. (2000). Information behavior in dynamic group work con-
texts: Interwoven situational awareness, dense social networks and contested collaboration
in command and control. Information Processing and Management, 36, 461–479.
St. Amant, R., & Ritter, F. E. (2004). Automated GOMS-to-ACT-R model translation. Proceedings of the International Conference on Cognitive Modeling (pp. 189–194). New York: ACM Press.
Strater, L. D., Jones, D., & Endsley, M. R. (2003). Improving SA: Training challenges for infantry
platoon leaders. Proceedings of the Human Factors and Ergonomics Society 47th Annual Meeting
(pp. 2045–2049). Santa Monica, CA: Human Factors and Ergonomics Society.
Vicente, K. (1999). Cognitive work analysis: Towards safe, productive and healthy computer-based
work. Mahwah, NJ: Erlbaum.
Wampler, J., Roth, E., Whitaker, R., Kendall, C., Stilson, M., Thomas-Meyers, G., and Scott, R.
(2006). Using work-centered specifications to integrate cognitive requirements into software
development. Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting
(pp. 240–244). Santa Monica, CA: Human Factors and Ergonomics Society.
Wickens, C. D. (1992). Engineering psychology and human performance (2nd ed.). New York: Harper
Collins.
Wiener, E. L. (1988). Cockpit automation. In E. L. Wiener & D. C. Nagel (Eds.), Human factors in
aviation (pp. 433–461). San Diego, CA: Academic.
Wiener, E. L., & Curry, R. E. (1980). Flight deck automation: Promises and problems. Ergonomics,
23, 995–1011.
Woods, D. D., & Hollnagel, E. (2006). Joint cognitive systems: Patterns in cognitive systems engineering. Boca Raton, FL: CRC Press.
Woods, D. D., & Roth, E. M. (1988). Cognitive engineering: Human problem solving with tools.
Human Factors, 30, 415–430.
Xiao, Y., Mackenzie, C. F., & Patey, R. (1998). Team coordination and breakdowns in a real-life
stressful environment. Proceedings of the Human Factors and Ergonomics Society 42nd Annual
Meeting (pp. 186–190). Santa Monica, CA: Human Factors and Ergonomics Society.
Zacharias, G., Miao, A., Illgen, C., Yara, J., & Siouris, G. (1996). SAMPLE: Situation awareness
model for pilot-in-the-loop evaluation. Proceedings of the First Annual Conference on Situation
Awareness in the Tactical Air Environment. Patuxent River: Naval Air Warfare Center.
Mica R. Endsley is president of SA Technologies, 3750 Palladian Village Dr., Building 600,
Marietta, GA 30066, mica@satechnologies.com. Dr. Endsley was a visiting associate professor
at MIT in the Department of Aeronautics and Astronautics and associate professor of indus-
trial engineering at Texas Tech University. She received a Ph.D. in industrial and systems engi-
neering from the University of Southern California. She has published extensively in the areas
of situation awareness, decision making and automation, and is coauthor of Designing for
Situation Awareness.
Robert R. Hoffman is a senior research scientist at the Institute for Human and Machine
Cognition in Pensacola, FL. He is a Fellow of the American Psychological Society, a Fulbright
Scholar, and an Honorary Fellow of The Eccles Center for American Studies of the British
Library. He received his B.A., M.A., and Ph.D. in experimental psychology at the University of
Cincinnati. His latest books are Working Minds: A Practitioner’s Handbook of Cognitive Task
Analysis, Expertise Out of Context, Minding the Weather: How Expert Forecasters Think, and The
Cambridge Handbook of Expertise and Expert Performance.
David B. Kaber is an associate professor in the Edward P. Fitts Department of Industrial &
Systems Engineering and an associate faculty in the Department of Psychology at North
Carolina State University. He received his Ph.D. in industrial engineering from Texas Tech
University in 1996.
Emilie Roth is a cognitive psychologist (Ph.D. from the University of Illinois at Urbana-Champaign, 1980). She is principal scientist of Roth Cognitive Engineering. She has been involved
in cognitive systems analysis and design in a variety of domains, including nuclear power plant
operations, railroad operations, military command and control, medical operating rooms,
and intelligence analysis.