Foundations of Learning and Instructional Design Technology
Richard E. West
                             Version: 1.38
This book is provided freely to you by
          EdTechBooks.org
CC BY: This book is released under a CC BY license, which means that you are free to do
with it as you please as long as you properly attribute it.
                                       Table of Contents
     Acknowledgements ...................................................................................................... 2
     Introduction .................................................................................................................. 3
     List of Authors .............................................................................................................. 7
I. Definitions and History ................................................................................................. 10
     The Proper Way to Become an Instructional Technologist ................................. 11
     What Is This Thing Called Instructional Design? .................................................. 25
     History of LIDT ............................................................................................................ 30
     A Short History of the Learning Sciences .............................................................. 36
     LIDT Timeline .............................................................................................................. 53
     Twenty Years of EdTech ............................................................................................ 54
II. Learning and Instruction ............................................................................................. 70
     Memory ........................................................................................................................ 71
     Intelligence ................................................................................................................. 87
     Behaviorism, Cognitivism, Constructivism ............................................................ 99
     Sociocultural Perspectives of Learning ................................................................ 124
     Learning Communities ............................................................................................ 147
     Communities of Innovation .................................................................................... 162
     Motivation Theories and Instructional Design .................................................... 183
     Informal Learning ..................................................................................................... 201
     Overview of Problem-Based Learning .................................................................. 216
     Connectivism ............................................................................................................ 230
     An Instructional Theory for the Post-Industrial Age .......................................... 241
     Using the First Principles of Instruction to Make Instruction Effective, Efficient,
         and Engaging ..................................................................................................... 254
III. Design ......................................................................................................................... 268
     Instructional Design Models ................................................................................... 269
     Design Thinking and Agile Design ......................................................................... 291
     What and how do designers design? .................................................................... 310
     The Development of Design-Based Research ...................................................... 325
     A Survey of Educational Change Models .............................................................. 347
     Performance Technology ........................................................................................ 355
     Defining and Differentiating the Makerspace ..................................................... 372
     User Experience Design .......................................................................................... 384
IV. Technology and Media .............................................................................................. 412
     Technology Integration in Schools ........................................................................ 413
     K-12 Technology Frameworks ................................................................................ 445
     The Learner-Centered Paradigm of Education .................................................... 455
     Distance Learning .................................................................................................... 472
     Open Educational Resources .................................................................................. 492
     The Value of Serious Play ....................................................................................... 503
    Video Games and the Future of Learning ............................................................ 524
    Educational Data Mining and Learning Analytics ............................................... 543
    Opportunities and Challenges with Digital Open Badges ................................. 561
V. Becoming an LIDT Professional ................................................................................ 572
    The Moral Dimensions of Instructional Design ................................................... 573
    Creating an Intentional Web Presence ................................................................. 586
    Where Should Educational Technologists Publish Their Research? ................ 611
    Rigor, Influence, and Prestige in Academic Publishing ..................................... 630
    Networking at Conferences .................................................................................... 641
VI. Preparing for an LIDT Career .................................................................................. 649
    What Are the Skills of an Instructional Designer? .............................................. 650
Final Reading Assignment ............................................................................................. 671
Back Matter ...................................................................................................................... 672
    Author Information .................................................................................................. 673
    Citation Information ................................................................................................ 674
                         Acknowledgements
Producing this textbook would not have been possible on my own. I am deeply grateful to
Tanya Gheen, who skillfully served as copyeditor and layout designer, and to Karen Arnesen,
who also served as copyeditor. Their skill in editing, as well as their understanding of the
field, provided valuable insights that guided many editorial decisions. Joshua Hveem and
Jiahui Zhang also provided valuable assistance with formatting chapters, reproducing visual
elements, and adding extra material to enhance the text of each chapter. I am also grateful
to my colleague Royce Kimmons for creating edtechbooks.org as the primary platform for
hosting this and other OER books.
I am also grateful to the authors of the chapters, particularly those who authored new
chapters for this book, because without quality content, there could have been no book.
                                 Introduction
Like most, I had a serendipitous beginning to my career in this field. I knew I loved to teach
but did not know what subject. I loved to read and study theory as a literature major but did
not want to spend my life writing another literary analysis of Chaucer. I loved to write as a
former newspaper reporter and use visual design technologies to lay out newspapers but
knew that was not quite right either. What was the answer? Luckily for me, a colleague
mentioned instructional design, and I jumped feet first into a field that I knew very little
about.
I’ve learned over the years that my experience is more common than not, as there is not “a
proper way” (see Lloyd Rieber’s Peter Dean lecture, republished in this book) to come into
this field. People with a wide rainbow of academic and professional backgrounds come into
this field and go on to a similarly wide variety of employment options. For this
reason, many have called our field a “meta” field that is integrated into many other
disciplines. For what could be more ubiquitous than the need to educate? And where there
is education, there must be designers to create it.
I repeat, this book will not cover everything a student in the field should know. No book will.
What I do hope, however, is that this book will provide enough of an overview of the key
topics, discussions, authors, and vocabulary in the field that you will be able to start
navigating and understanding other books, articles, and conference presentations as you
continue your educational journey. I also hope to spark an interest in studying more on any
one of these topics that may be interesting to you. There are rich bodies of literature
underneath each of these topics, just waiting to be explored.
What's in a Name?
Scholars disagree on what we should even call our field. In the textbooks I mentioned above,
our field is called educational technology, instructional design, and instructional design and
technology. My academic department is called the Department of Instructional Psychology
and Technology, although it used to go by the Department of Instructional Science. In this
book I have sought for what I considered to be the most inclusive name: learning and
instructional design technology (LIDT). I chose this also to emphasize that as designers and
technologists, we not only affect instruction but also learners and learning environments. In
fact, sometimes, that may be our greatest work.
The third section of the book focuses on current trends and issues, using the concept of
“current” fairly liberally. Here we review topics such as the learning sciences, online
learning, design-based research, K-12 technology integration, instructional gaming, and
school reform. The fourth and fifth sections of the book I consider to represent the future of
the field—or the future of you, the student just beginning your career! These sections are
dedicated to you; in them, you will find chapters related
to successfully navigating graduate school, launching your career, and integrating yourself
into the professional community.
I ask that in any remixing of the book, you please acknowledge the original version of
the book and follow the individual copyright license for each chapter, as some chapters
were only published in this version of the book by permission of the copyright holders.
Finally, I would be interested in hearing about any great new content that you find or
develop for your versions of the book too.
Attribution
    The following is potential wording you could use in your remixed version of the
    book: “This textbook is a revision of Foundations of Learning and Instructional
    Design Technology, available at https://edtechbooks.org/-poI edited by Dr. Richard
    E. West [http://richardewest.com/] of Brigham Young University
    [https://home.byu.edu/home/].”
Reflection
    What do you hope to learn from this textbook? Write down any questions you have
    about the field and as you read through the chapters, note any answers you may
    have found or add any additional questions.
For the reader’s information, all articles from Educational Technology magazine are
republished by permission of the editor and publisher, who holds the rights to the material.
Some material was available open access, but not with a Creative Commons license, and we
have been granted permission to republish these articles. Other articles were already
available under various Creative Commons licenses [https://edtechbooks.org/-JMt]. ERIC
Digest material is in the public domain. As a reader, please notice and honor these
various licenses and permissions listed for each chapter.
To cite a chapter from this book in APA, please use this format:
Authors. (2018). Title of chapter. In R. West (Ed.), Foundations of Learning and Instructional
Design Technology (1st ed.). Available at https://edtechbooks.org/lidtfoundations.
[https://edtechbooks.org/-guw]
                              List of Authors
Following is the list of authors for chapters in this book, with their most recent known
affiliation.
                 Name                                        Affiliation
Abbie H. Brown                       East Carolina University
Abby Hawkins                         Academy Mortgage Corporation
AECT                                 http://www.aect.org
Albert D. Ritzhaupt                  University of Florida
Amanda R. Casto                      University of North Carolina at Charlotte
Ana Donaldson                        University of Northern Iowa
Andrew A. Tawfik                     University of Memphis
Andrew Gibbons                       Brigham Young University
Arlene Lacombe                       Saint Joseph's University
Barbara Lockee                       Virginia Tech
Barbara M. Hall                      Northcentral University
Beth Oyarzun                         University of North Carolina at Charlotte
Bohdana Allman                       Brigham Young University
Brent Hoard                          Randolph-Macon College
Charles M. Reigeluth                 Indiana University
Christopher D. Sessums               xlm-design
David Mike Moore                     Virginia Tech
David Noah
David Wiley                          Lumen Learning
David Williamson Shaffer             University of Wisconsin-Madison
Drew Polly                           University of North Carolina at Charlotte
Ellen Wagner                         Hobsons
Florence Martin                      University of North Carolina at Charlotte
George Siemens                       University of Texas at Arlington
Gregory S. Williams                  Intuit
James B. Ellsworth                   U.S. Naval War College
James Jacob                          University of Pittsburgh
James P. Gee                         University of Wisconsin-Madison
Jered Borup                          George Mason University
Jessica Norwood                      University of North Carolina at Charlotte
Jill Stefaniak                       University of Georgia
Joanna C. Dunlap                     University of Colorado-Denver
John Burton                          Virginia Tech
John R. Savery                       University of Akron
Kari Ross Nelson                     Thanksgiving Point
Kathryn Dumper                       Bainbridge State College
Kay Persichitte                      University of Wyoming
Kelvin Seifert                       University of Manitoba
Kimberly Christensen                 ConsultNet
Kurt Squire                          University of Wisconsin-Madison
Lee Daniels                          Kingsport, Tennessee
Lloyd Rieber                         University of Georgia
Lola Smith
Lorie Millward                       Thanksgiving Point
M. David Merrill                     Utah State University
Margeaux C. Johnson                  University of Florida
Marilyn Lovett                       Livingstone College
Marion Perlmutter                    University of Michigan
Martin Weller                        Open University of the United Kingdom
Matthew J. Koehler                   Michigan State University
Matthew Schmidt                      University of Cincinnati
Michael Molenda                      Indiana University
Patricia Stitson                     California International Business University
Patrick R. Lowenthal                 Boise State University
Paul Salvador Inventado              Carnegie Mellon University
Peggy A. Ertmer                      Purdue University
Peter J. Rich                        Brigham Young University
Punya Mishra                         Arizona State University
Randall S. Davies                    Brigham Young University
Richard E. West                      Brigham Young University
Richard Halverson                    University of Wisconsin-Madison
Richard Osguthorpe                   Boise State University
Rose Spielman                        State of Connecticut Department of Developmental Services
Rosemary Sutton                      Cascadia College
Royce Kimmons                        Brigham Young University
Russell T. Osguthorpe                Brigham Young University
Ryan S. Baker                        University of Pennsylvania
Sang Joon Lee                       Mississippi State University
Seung Won Park                      Daejeon University
Sharon Smaldino                     Northern Illinois University
Stephen Ashton                      Thanksgiving Point
Sunnie Lee Watson                   Purdue University
Tadd Farmer                         Purdue University
Tanya Gheen                         Brigham Young University
Thomas Reeves                       University of Georgia
Tim Boileau                         University of West Florida
Timothy Newby                       Purdue University
Tonia A. Dousay                     University of Idaho
U.S. Office of Educational Technology   U.S. Department of Education
Vanessa Svihla                      University of New Mexico
Victor Lee                          Utah State University
William Jenkins                     Mercer University
William Sugar                       East Carolina University
Yvonne Earnshaw                     Independent Consultant
                   I. Definitions and History
The ritual is a common one, every fall semester. Students knock on my door, introduce
themselves as interested in studying Learning and Instructional Design Technology for a
graduate degree, and ask how they can prepare themselves. Should they study psychology for
their undergraduate degree? Education? Sociology? Media and technology? Research
methods? Design of some sort?
The answer would be, of course, yes! But this does not mean one must know everything to
be successful in LIDT. Rather, this means that there are many successful and "proper" paths
into our field. Lloyd Rieber explains this very well in his Peter Dean Lecture essay that is the
first chapter of this section and book. I find that this essay often puts students at ease,
explaining that whatever their path might have been, they belong in the field.
This section also includes several chapters on the history of the LIDT field. Because the field
of LIDT could be defined broadly, any aspect of the history of education and learning could
be considered a history of this field. However, there is general consensus that the field of
LIDT began in earnest with the development of digital technologies, programmed
instruction, and systemic thinking, and then grew to include newer developments such as
the learning sciences and evolving perspectives on teaching and learning. These points of
view are reflected in these chapters, but students are encouraged to think about the history
of the field more broadly as well. What perspectives are not included in these historical
chapters that should be? What other theories, ideas, and voices helped to form a foundation
for how we look at the field of LIDT?
                                               1
        The Proper Way to Become an Instructional Technologist
Lloyd Rieber
Editor's Note
    In 1998, Rieber was invited to give the Peter Dean Lecture for AECT and later
    published his remarks on his own website [https://edtechbooks.org/-An]. It is
    republished here by permission of the author.
Prologue
I wrote this essay to support my Peter Dean Lecture at the 1998 AECT convention. The
invitation to present this lecture came only several weeks before the scheduled presentation
at AECT. Consequently, there was little time to put these ideas into written form for the
ITFORUM discussion [http://it.coe.uga.edu/itforum] that traditionally follows this lecture.
Interestingly, though the lecture and discussion have long passed, I have not felt it necessary
to revise the essay. Despite the fact that it lacks the “scholarship polish” of a refined work, I
think it still captures well my thoughts and feelings that I initially struggled to organize and
convey. I presented my essay and conducted the ITFORUM discussion in the spirit of
sharing some ideas as works in progress. I think this is a style that takes full advantage of
electronic media—to offer a set of ideas that lead to more questions than answers and to
engage a group of thoughtful people in a discussion of the ideas to tease out what is and is
not important.
The purpose of the Peter Dean Lecture, as I understand it, is to choose someone who has
been around long enough to appreciate the struggles of the field and to give that person the
opportunity to give a critical analysis of where we are and where we might go. This presents
a nice opportunity, but a presumptuous one in my opinion, for the person chosen. Are my
experiences and points of view a valid cross section of the field? Obviously not.
Nevertheless, I used this opportunity to speak to some issues that interest and concern me,
in the hope that they might trigger some reflection and comment—I still hope that is the
case for those who now happen upon this essay.
Introduction
The inspiration for the topic of my AECT presentation and this essay comes from an article
published by Robert Heinich in 1984 called “The Proper Study of Educational Technology.”
At the time I first read the article (around 1986), its title rubbed me the wrong way. There
was something unduly pretentious about it—that there was, in fact, a proper study of
instructional technology (IT). When I first read the article, I must admit that I incorrectly
interpreted it. Heinich warned strongly against the “craft” of IT which I wrongly interpreted
as “art.” I have long been sensitive to our field disavowing the artistic side of IT and instead
overemphasizing, I felt, its scientific aspects. Having just reread the article, I am very
impressed with how forward-looking Heinich’s thinking was at the time, especially
regarding the role of research. The purpose of this essay, therefore, is not to take issue with
Heinich’s ideas, but to use them to motivate another question: What is the proper way to
become an instructional technologist?
their cultures. I took several paths from there, at one point actually completing the
paperwork to declare a major in anthropology. I came to the education field most
unexpectedly. I eventually became an elementary school teacher—trained in a large urban
university in the northeast of the USA, but got my first job in a very small rural school in the
American Southwest (New Mexico). This was 1980, which, coincidentally, was about the time
that desktop computers were introduced into mainstream education. I found myself thrust
into a position where technology, education, and different cultures were rapidly mixing.
In a lot of ways, this was a perfect position for a person like me. There were few formal ideas
in force about how to use computers in education (at least in my district) and the school
administration actually encouraged “early adopters” such as myself to explore different
ideas and take some risks. I later discovered, when I entered graduate school, that many of
the things I had learned on my own in those years about technology, instructional design,
and learning theory actually had formal names in the literature (one example is the concept
of rapid prototyping).
Elementary school teachers are, as a group, very sensitive to the student point of view
(though don’t take this as an insult to other groups). It’s just that the complexity of domains
(e.g., math, science, language arts) is not as demanding to the adult as it is to the
student. Consequently, the adult teacher is somewhat freed from the demands of the
content, but forced to consider what it must be like for a 10-year-old to learn something like
fractions. Most elementary school teachers are also faced with teaching a broad array of
subjects, so the concept of integrating subjects in meaningful ways is familiar to us. (Heck, I
also taught music—the elementary school was one of the few places where my accordion
was truly appreciated!) My education to become a teacher was heavily rooted in Piagetian
learning theory, so it is easy to see how I came to use LOGO with students and to
understand the facilitative role it demanded of teachers. In hindsight, I can’t think of a
better place than the elementary school classroom for me to have received my first
education as an Instructional Technologist. (“Holmes, my good man, what school did you
attend to become an Instructional Technologist?” “Elementary, my dear Watson,
elementary.”) I wonder how many of you have backgrounds exactly like mine. Few, I wager.
So, while studying engineering and culture, traveling, followed by being an elementary
classroom teacher in a context where technology was introduced with no training was the
proper way for me to become an instructional technologist, I know it is a path not to be
exactly duplicated by anyone.
the tools first and to assume that the knowledge of how to apply the tool in education will
come merely as a consequence. However, I like to point out that “A power saw does not a
carpenter make.” Owning a power saw coupled even with the knowledge of how to use it
safely to cut wood does not make one a carpenter. For example, consider the contrast
between two carpenters who appear on American public television shows—Roy Underhill
and Norm Abrams. For those of you not familiar with these two, Roy appears on The
Woodwright’s Shop, a show dedicated to preserving carpentry skills practiced before the
advent of electricity. In contrast, Norm Abrams, a self-professed power tool “junkie,”
appears on This Old House and The New Yankee Workshop. Despite their different
approaches and attitudes to the use of technology, I’m quite sure that both would
thoroughly enjoy the other’s company and wile away the hours discussing what they both
love best, namely, carpentry. However, despite my respect for Roy’s skills and philosophy,
when I try my hand at even mid-size woodworking projects, such as building a patio deck or
storage shed, you can bet that I prefer to use the power tools available to me. Likewise, in
education, I prefer to take advantage of the opportunities that the available “power tools”
afford, such as the computer. But underlying it all is a profound core of, and respect for, the
essential skills, strategies, and experiences akin to those possessed by the master carpenter.
The debate between objectivism and constructivism, though a healthy and necessary one, has
also tended to lead people to believe that there is a “right answer” to what their
philosophy “should be.” It’s almost as though they were taking some sort of test that they
need to pass. I suppose most just want to be associated with the dominant paradigm instead
of digging down deep to better understand their own values, beliefs, and biases. I’ve also
noticed it is in vogue to question others about their philosophical camp, not in order to enter
into a dialogue of how one’s philosophy informs one’s design, but more to sort people in a
convenient manner. (This resembles to me how Dorothy was questioned by Glinda: “Are you
a good witch or a bad witch?” The answer, of course is “Why, I’m not a witch at all.”)
Not being a philosopher, I have found it difficult to effectively raise and lead discussions on
philosophical issues in my classes. I had always joked about wanting some sort of simulation
that embedded these issues in a way that one could “experience” them rather than just talk
about them. You know, something like ‘SimCity’ or ‘SimLife.’ Wouldn’t it be great, I thought,
to also have a similar simulation to help one play with these complicated issues as well as
understand what the educational system would be like 50 or 100 years from now if a major
paradigm shift really took place today! Ha ha, it was a quaint inside joke. Well, one day I
decided to put a working prototype of ‘SimSchool’ together for my next class. I have
“shocked” SimSchool [https://edtechbooks.org/-KmE] and offer it here for you to try out (of
course, you will first need access to the web, have enough RAM, and be able to download
and install the right plug-in from Macromedia, etc.). If you do take a look, don’t take it too
seriously. This simulation has not been validated. It is just a little exercise to get my
students to “try out” the philosophical implications on education, from my point of view.
What is most useful is when people take issue with my interpretation and instead put
forward how THEY would design SimSchool. These are the discussions that really matter.
One physicist I have become fascinated with is the late Nobel laureate, Richard Feynman.
Some of you might know him due to his role on the committee investigating the Space
Shuttle Challenger disaster. (My daughter was in first grade at the time. The whole school
was gathered in the school’s cafeteria to watch the lift-off. I recall my daughter coming
home after school telling us that it was her job to go find the principal to tell her that the
“shuttle blew up.”) I have become interested in Feynman for lots of reasons, but of
relevance here was his apparent genuine concern about his teaching. While other physicists
and mathematicians-turned-educators often come across to me as thinking they know all the
answers to the problems of education—I’m not saying Papert and Bork are like this
;)—Feynman remained quite reflective (not to mention baffled) by the entire
teaching/learning process. In the preface to The Feynman Lectures, a well-known set of
introductory physics readings, Feynman expressed his frustration in not being able to
meet the needs of students known not to be the brightest or most motivated (in other words,
those like me). Rather than just blame the students, he publicly took his share of the
responsibility.
      think about; if you can’t think of a new thought, no harm done; what you thought
      about it before is good enough for the class. If you do think of something new,
      you’re rather pleased that you have a new way of looking at it. The questions of
      the students are often the source of new research. They often ask profound
      questions that I’ve thought about at times and then given up on, so to speak, for
      a while. It wouldn’t do me any harm to think about them again and see if I can
      go any further now. The students may not be able to see the thing I want to
      answer, or the subtleties I want to think about, but they remind me of a problem
      by asking questions in the neighborhood of that problem. It’s not so easy to
      remind yourself of these things. So I find that teaching and the students keep
      life going, and I would never accept any position in which somebody has
      invented a happy situation for me where I don’t have to teach. Never.
While I don’t know how much his students may have learned, his willingness to admit how
vital teaching was to his own professional development is refreshingly straightforward.
One of the most problematic relationships in our field is that which exists between theory,
research, and practice. The problem is shared by professors, researchers, students, and
practitioners alike. That is, a professor who is unable (or unwilling) to connect theory with
practice is just as guilty as a student who avoids confronting or demeans theoretical
implications of practice. The textbooks make the relationship seem so clear and
straightforward, yet my experience with actually doing instructional design has been messy
and very idiosyncratic. Michael Streibel (1991, p. 12) well articulated what I had felt as I
tried to reconcile instructional design as it was written and talked about versus how I had
actually done it:
My current interest in play theory is also an example of my struggle with how our field
characterizes the interplay between research, theory, and practice. On one hand, my
interest in play is derived from working with children and watching the intensity with which
they engage in activities they perceive to be worthwhile. However, I also wanted to explain
more thoroughly my own experiences of being so involved in activities that nothing else
seemed to matter. My curiosity led me to themes I had first encountered when I was a
“short-lived” student of anthropology, namely games and their cultural significance. I also
learned about flow theory and saw how well it described the phenomenon of play.
I have come to see the story of the Wright Brothers’ invention of the airplane as a good
metaphor for understanding the proper relationship between theory, research, and practice
in our field (it is even a good metaphor if you live in a country that disputes their being
credited as the first to invent the airplane!). That the Wright Brothers
were technologists, inventors, and tinkerers is not questioned, but people do not realize that
they were also scientists who asked the right questions about the theory of the day and
crafted ingenious experiments to get at the answers. Most of all, people forget that they
were also the world’s most experienced pilots at the time. They took their findings into the
field and practiced what they studied. These experiences likewise informed the scientific
side of their enterprise, culminating in a controllable aircraft. (Incidentally, it’s the
“controllable” part of the invention that is the real genius of the brothers.)
Many issues on this topic remain, such as the proper role of research. (I see at least two, by
the way. One is the traditional role of research contributing to the literature. A second role
for research, though less recognized, is how it informs the researcher. The act of doing
research becomes a source of ideas and invention, leading to a much deeper conceptual
understanding of the topic or problem being studied. Even if the research itself goes wrong
in some way, the researcher grows intellectually and emotionally from the experience. I’m
not sure how to characterize this research purpose since it does not fit any traditional
category (e.g. basic, applied, etc.), so perhaps we should just call it “constructive or
reflective research.”) Another topic worth pursuing is the way universities assess student
achievement. This is not an indictment of testing per se, but I admit I find it strange that we
still assign letter grades in most of our graduate courses. (That one should bring in the
mail.)
Closing
So, what is the proper way to become an instructional technologist? Obviously, my position
is that there is not one way and that we should value the diversity of the people who make
up our profession. I also challenge each faculty member and student to stand back from
their graduate curricula and question the purpose and relevance of the experiences that are
contained there. However, this is all too easy, so I end by offering two lists. The first is one I
posted on ITFORUM a while back. It’s my way of “reverse engineering” what I do in
language that people outside the field can understand (such as my parents):
Finally, here is a list of things I feel one needs to do to become, and remain, an instructional
technologist; it represents, I hope, the best of what we are doing in our graduate
programs:
I hope you have been able to follow this roughly written essay. Here are a few questions,
offered with the hope that a few of you will consider posting your thoughts to the list:
   1. What is your story about how you came to be an instructional technologist? What is
      unique about it? I am especially curious about individuals who do not hold graduate
      degrees in instructional technology.
   2. How well prepared were you to face the problems you now encounter in your jobs?
   3. For those of you who have a formal degree in IT, how satisfied are you with how well you
      were prepared to do the job you now have? How well aligned were issues surrounding
      theory, research, and practice? I know that many non-American programs are not so
      reliant on course-driven models (and this is part of our redesign), so I am anxious to
      hear more about them.
   4. What would you add to my list of things that characterize what Instructional
      Technologists actually do, as a profession?
   5. What would you add to my list of what one needs to do to adequately prepare to
      become an Instructional Technologist?
Acknowledgements
I’d like to give Steve Tripp and Ron Zellner credit for some of the ideas here, which were
derived from long, enjoyable (and independent) conversations with them over the years.
Application Exercises
          Reflect on your experiences and how they have brought you to the field of
          instructional design. How are they similar to the paths described in this
          chapter and how do they provide you with a unique perspective on
          instructional design?
          Based on your individual goals, and what you understand of the field today,
          create your own list (see Rieber’s in the “Closing” section) outlining how you
          envision your role as an Instructional Technologist/Instructional Designer.
References
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper
& Row.
Feynman, R., Leighton, R., & Sands, M. (1963). The Feynman Lectures on Physics. Reading,
MA: Addison-Wesley.
Rieber, L. P., Luke, N., & Smith, J. (1998). Project KID DESIGNER: Constructivism at work
through play. Meridian: Middle School Computer Technology Journal [On-line], 1(1).
Available: http://www.ncsu.edu/meridian/index.html
Rieber, L. P., Smith, L., & Noah, D. (in press). The value of play. Educational Technology.
Streibel, M. (1991). Instructional plans and situated learning: The challenge of Suchman’s
theory of situated action for instructional designers and instructional systems. In G.
Anglin (Ed.), Instructional technology: Past, present, and future (pp. 122). Englewood, CO:
Libraries Unlimited.
von Glasersfeld, E. (1993). Questions and answers about radical constructivism. In K. Tobin
(Ed.), The practice of constructivism in science education, (pp. 23–38). Washington, DC:
AAAS Press.
Suggested Citation
                             Lloyd Rieber
Lloyd Rieber is originally from Pittsburgh, Pennsylvania, where he was born and
raised. He is now a Professor of Learning, Design, and Technology at the
University of Georgia. He was once a public school teacher in New Mexico and
in 1987 earned his Ph.D. at Penn State. Before going to the University of
Georgia, Lloyd spent six years on the faculty at Texas A&M. Lloyd’s research
interests include simulations, games, accessibility, microworlds and
visualization.
                                                  2
Ellen Wagner
Editor's Note
    The following is an excerpt from Ellen Wagner’s article entitled “In Search of the
    Secret Handshakes of Instructional Design,” published in the Journal of Applied
    Instructional Design [http://www.jaidpub.org/]. The title for this chapter comes
    from a portion of Wagner’s essay to better represent the portion of her article that
    is republished here.
Practitioners and scholars working in the professions clustered near the intersection of
learning and technology have struggled to clearly and precisely define our practice for a
long time—almost as long as technologies have been used to facilitate the creation,
production, distribution, delivery and management of education and training experiences.
One of my favorite examples of this definitional challenge was described in a recent blog
post by Cammy Bean, vice-president of learning for Kineo, a multinational elearning
production company:
     You’re at a playground and you start talking to the mom sitting on the bench
     next to you. Eventually, she asks you what you do for work. What do you say?
     Are you met with comprehension or blank stares? This was me yesterday:
     I see that she really doesn’t see and I just don’t have the energy to go further.
     I’m sort of distracted by the naked boy who just ran by (not mine). We move on.
AECT has actively supported work on the definitions of big overarching constructs that offer
people working at the intersections of learning and technology a sense of identity,
purpose, and direction. Lowenthal and Wilson (2007) have noted that AECT has offered
definitions in 1963, 1972, 1977, 1994, and 2008 to serve as a conceptual foundation for
theory and practice guiding “The Field.” But they wryly observe that our definitional
boundaries can be a bit fluid. For example, after years of describing what we do as
“educational technology,” Seels and Richey (1994) made a case for using the term
“instructional technology” as the foundational, definitional descriptor. Januszewski and
Molenda (2008) returned us to the term “educational technology” as being broader and
more inclusive. All seemed to agree that the terms educational technology and instructional
technology are often used interchangeably. In discussing these implications for academic
programs, Persichitte (2008) suggested that labels—at least the label of educational
technology or instructional technology—do not seem to matter very much. And yet, I
wonder—without precision—do we not contribute to the confusion about what it is that
people like us actually do?
And what about this thing we do called instructional design? That seems to be an even
harder domain to adequately define and describe. A definition of instructional design offered
by the University of Michigan (Berger and Kaw, 1996) named instructional design as one of
two components (the other being instructional development) that together constitute the
domain of instructional technology. Instructional design was then further described in the
Instructional Design as Reality: Instructional design can start at any point in the design
process. Often a glimmer of an idea is developed to give the core of an instruction situation.
By the time the entire process is done the designer looks back and she or he checks to see
that all parts of the “science” have been taken into account. Then the entire process is
written up as if it occurred in a systematic fashion. https://edtechbooks.org/-Lj
Ten years later, Reiser and Dempsey (2007) defined instructional design as a “systematic
process that is employed to develop education and training programs in a consistent and
reliable fashion” (p. 11). They noted that instructional technology is creative and active, a
system of interrelated elements that depend on one another to be most effective. They
suggested that instructional design is dynamic and cybernetic, meaning that the elements
can be changed and communicate or work together easily. They posited that an effective
instructional design process must be interdependent, synergistic, dynamic, and cybernetic.
In their view, instructional design is centered on the learner, is oriented toward a central
goal, includes meaningful performance, includes a measurable outcome, is self-correcting
and empirical, and is a collaborative effort. They concluded that instructional design
includes the steps of analysis, design, development, implementation, and evaluation.
Application Exercises
Suggested Citation
                            Ellen Wagner
Dr. Ellen Wagner is a co-founder and chief research officer at PAR Framework.
She also serves as vice president of research at Hobsons. In the past, Dr.
Wagner has been a partner at Sage Roads Solutions, senior director of
worldwide e-learning at Adobe, and senior director of worldwide education
solutions for Macromedia. Prior to her work in the private sector, Dr. Wagner
earned her Ph.D. in learning psychology from the University of Colorado Boulder
and taught at the University of Northern Colorado.
                                             3
History of LIDT
Editor’s Note
Today, the field is fascinated with the instructional possibilities presented by the computer
as a medium of communication and as a tool for integrating a variety of media into a single
piece of instruction. Video has replaced the educational film, and television can be two-way
and interactive.
At the turn of this century a number of technological inventions and developments were
made that provided new, and in some cases, more efficient means of communication. In the
1920s, the motion picture passed from being a mere curiosity to being a serious
medium of expression, paralleling live theater. Its usefulness and influence on learning were
explored. This educational research continued into the 1930s, when new instructional
projects such as teaching by radio were implemented. Within 20 years both film and radio
became pervasive communication systems, providing both entertainment and information to
the average citizen.
The advent of World War II created many demands for a new skilled workforce. Media took
a prominent place in educational and training systems attempting to fill such needs, and
much research centered on the use of these media in a wide variety of teaching and learning
situations. Media were among the innovations that made possible the changes and growth in
the industrial complex that were so essential to the defense of the western world.
After the war, schools and industry alike attempted to settle back into the old, familiar
methods of operation. Within a few years, however, the increase in the birth rate and public
school enrollment forced a re-evaluation of the older and slower approaches to education.
Again, media were employed, this time to upgrade the curriculum of the public schools.
With the late 1940s and early 1950s came considerable experimentation with television as
an instructional tool. Industry was expanding and began to develop its own in-house
educational systems. Simultaneously, a search was begun for more efficient and effective
means by which such education could be accomplished.
Concurrent with the introduction and development of the study of instructional media, the
notion of a science of instruction was evolving. The educational psychologists provided a
theoretical foundation which focused on those variables which influenced learning and
instruction. The nature of the learner and the learning process took precedence over the
nature of the delivery media.
Some of the early audiovisual professionals referred to the work of Watson, Thorndike,
Guthrie, Tolman, and Hull. But it was not until the appearance of Skinner’s (1954) work
with teaching machines and programmed learning that professionals in the field felt that
they had a psychological base. Skinner’s work in behavioral psychology, popularized by
Mager (1961), brought a new and apparently more respectable rationale for the field.
Lumsdaine (1964) illustrated the relationship of behavioral psychology to the field, and
Wiman and Meierhenry (1969) edited the first major work that summarized the relationship
of learning psychology to the emerging field of instructional technology. Bruner (1966)
offered new insights that eventually led to broader participation of cognitive psychologists
like Glaser (1965) and Gagné (1985). Today, the field not only seems convinced of the
importance of the various aspects of cognitive processing of information, but is placing new
emphasis upon the role of instructional context, and the unique perceptions and views of the
individual learner.
Perhaps one of the most profound changes in instructional technology has come in the
expansion of the arenas in which it is typically practiced. From its beginnings in elementary
and secondary education, the field was later heavily influenced by military training, adult
education, and post-secondary education; much of today’s activity is in the area of private
sector employee training. Consequently, there is increased concentration on issues such as
organizational change, performance improvement, school reform, and cost benefits.
However, the disparate contexts also highlight a wide range of organizational, cultural, and
personal values and attitudes. Cultures vary among the different communities, creating new
issues and possibilities for new avenues of disciplinary growth and development.
The historical context which has surrounded the development of the field has implications
that reach beyond the actual events themselves. This is equally true of the development of
modern technology responsible for an increasing number of new media and new uses for
existing media. Such developments have redirected the energies of many people, causing
today’s society to be much broader and richer than was ever contemplated in the early
1900s.
Prior to the twentieth century, the only formal means of widespread communication was the
printing press. The technological developments since then have provided many different
modes of expression, enabling ideas, concepts, and information gained from experience to
be conveyed in ways and with contextual richness never before possible.
The unique means of expression that have expanded with each new medium have added
new dimensions through which creative talents can be applied. For example, the
photographic and cinematographic media have long been accepted as legitimate avenues for
creative work in the arts, and television has provided new avenues for expanding views of
society.
Still photography, motion picture photography, television, and the computer have proved to
be excellent tools for a variety of academic endeavors. Historians consider film coverage of
public events to be important primary documentation. Psychologists now use film,
computers, and interactive video to control experiences and to collect data on a wide variety
of problems in human behavior. Medical researchers employ both color photography and
color television in their studies. In fact, it would be difficult for modern scholars to maintain
a position of leadership in their fields of investigation without the assistance of the media
that present-day technology makes possible. Further, the future of humanity’s
understanding of the universe and the pursuit of greater self knowledge depends upon
increasingly sophisticated applications and utilizations of these technologies.
Alternative modes for teaching and learning are most important in today’s educational
environment. Opportunities for self-directed learning should be provided by institutions of
higher education. Other forms of alternative teaching and learning patterns which require
increased student involvement and higher levels of learning (application, synthesis,
evaluation) also rely upon media as an invaluable tool in the preparation of students.
Teaching and communication, though not synonymous, are related. Much of what the
teacher does involves communication. From the spoken word to the viewing of the real
world, directly or by means of some technological invention, communication permeates
instructional activities.
Media, materials, and interactive technologies, though not the exclusive ingredients in
learning, are an integral part of almost every learning experience. The raw materials for
scholarship increasingly reside in these means. The scholarly experiences for the student
can often be afforded only through these options. The young scholar, the college student, is
a deprived scholar without access to these learning tools.
The scholar must have available all that modern technology can provide. Media, materials,
and interactive technologies have a crucial role to play in any teacher education program if
that program hopes to meet the needs of our dynamic, sophisticated world.
Application Exercise
    Think about the technology you are surrounded by every day (e.g. smartphones,
    tablets, digital assistants, wearable technology, VR/AR, etc.). Discuss how one or
    two of these technologies can be used in the field of instructional design or how
    they could have a future impact in the field.
Suggested Citation
 Association for Educational Communications &
                   Technology
From http://aect.org:
"AECT has become a major organization for those actively involved in the design
of instruction and the development of a systematic approach to learning. It
provides an international forum for the dissemination and exchange of ideas
among its members and target audiences; it is the national and international
advocate for the improvement of instruction; and it is the most widely
recognized source of information concerning a wide range of instructional and
educational technology. AECT and its members have numerous state and
international affiliates, all of which are passionate about finding better ways to
help people learn. AECT is the oldest professional home for this topic and
continues to maintain a central position in the field, promoting high standards of
scholarship and practice. AECT has 10 divisions and a Graduate Student
Assembly that span the breadth and depth of the field. The association produces
two bimonthly print journals, Educational Technology Research and
Development and TechTrends, and three electronic journals, Journal of
Formative Design in Learning, The Journal of Applied Instructional Design,
and International Journal of Designs for Learning."
                                              4
Victor Lee
It is inevitable that someone studying learning and instructional design and technology
(LIDT) will come across the term Learning Sciences. Yet, for many, that moniker is
fundamentally ambiguous and misunderstood, and questions abound about this thing called
Learning Sciences. Are there multiple learning sciences or is there one dedicated and
official field referred to by the plural term Learning Sciences? Is one supposed to capitalize
both words when writing about it? Is it essentially classic educational psychology with a new
name? Does it involve things beyond the mental phenomenon of learning? Is it actually a
science? Are there points of convergence, divergence, or redundant overlap with other
fields, including those that would be seen in the field of instructional design and technology?
Are those who call themselves learning scientists best seen as friends, rivals, or innocuous
others to those who consider themselves instructional designers? There are so many
questions. There are also many answers. And a lack of a one-to-one correspondence
between questions and answers has persisted in the roughly 30 years (see Figure 1) since
the term began to see heavy use (assuming we are concerned with the capitalized L and
capitalized S version, which will be the default for this chapter).
   Figure 1. Use of the term Learning Sciences as depicted in Google’s Ngram viewer. A
                major continuous increase appears to occur around 1990.
No article, book, or chapter has been written that gives authoritative and definitive
answers to these questions. The current chapter is no exception. Others have made
noteworthy efforts, including contributors to a special issue of Educational Technology
(Hoadley, 2004; Kolodner, 2004), those who have edited handbooks of the Learning
Sciences (Fischer, Hmelo-Silver, Goldman, & Reimann, in press; Sawyer, 2006), and those
who have prepared edited volumes that gather and publish firsthand reports from a number
of seminal learning scientists (Evans, Packer, & Sawyer, 2016). In a sense, all of the above
are snapshots of a still-unfolding history, and I recommend them all for the interested
reader. This chapter exists as an effort to crudely present Learning Sciences to the LIDT
community as it exists at this point in time from one point of view. The current point of view
is presumably legitimized because the author of this chapter has the words Learning
Sciences on his diploma and serves professionally with Learning Sciences conferences,
journals, and academic societies. As the author, I do lead with the caveat that some of what
I have to say here is an approximation and inherently incomplete. However, I present the
following with confidence that it helps one make some progress in understanding what this
thing called Learning Sciences is.
If Figure 1 is any indication, the recent history of Learning Sciences goes back about 30
years, and it can be traced to some important locations and events[2]:
namely, the first International Conference of the Learning Sciences (ICLS), which took place
in 1991 and was connected to the Artificial Intelligence in Education (AIED) community. No
formal society nor publication venue for Learning Sciences existed at that time. The first
ICLS was hosted in Evanston, Illinois, in the United States, home of what was then the
Institute for the Learning Sciences and the first degree program in Learning Sciences, at
Northwestern University. The year 1991 was also when the first issue of the Journal of the
Learning Sciences was published.
The connection to the AIED community is central to the historic identity of Learning
Sciences. In the 1980s, cognitive science had emerged as an interdisciplinary field that,
along with segments of computer science, was concerned with the workings of the human
mind. The so-called “cognitive revolution” led to interdisciplinary work among researchers
to build new models of human knowledge. The models would enable advances in the
development of artificial intelligence technologies, meaning that problem solving, text
comprehension, and natural language processing figured prominently. The concern in the
artificial intelligence community was on the workings of the human mind, not immediately
on issues of training or education. The deep theoretical commitments were to knowledge
representations (rather than to human behaviors) and how computers could be used to
model knowledge and cognitive processes.
Of course, as work in the years leading up to the first ICLS progressed in how to model and
talk about (human) cognition, many had also become interested in using these new
understandings to support learning and training. Intelligent tutoring systems gained
prominence and became an important strand of work in Learning Sciences. That work
continues to this day, with much of the work having ties historically to institutions like
Carnegie Mellon University and the University of Pittsburgh. These tutoring systems were
informed by research on expertise and expert-novice differences along with studies of self-
explanation, worked examples, and human tutoring. Many of those who did original work in
those areas still remain in Pittsburgh, but their students, colleagues, postdoctoral fellows,
and others have since established their own careers in other institutions.
Papert was not the only one interested in how people learned to do computer
programming[3]. Relatedly, programming was a concern for the Pittsburgh
tutoring systems and also for others involved in the field, such as Elliot Soloway, who was
initially at Yale before relocating to the University of Michigan. Others influential in the
field were asking questions about what cognitive benefits result from learning to program.
One such person was Roy Pea, who had been doing work in new educational technology and
media with Jan Hawkins at the Bank Street College in New York. In Cambridge, educational
technology endeavors informed by recent cognitive science were being pursued at places
like Bolt, Beranek, and Newman (BBN) by the likes of John Seely Brown and Allan Collins,
among other talented social scientists and technologists. These early scholars represented a
part of the new educational media and computer programming sphere of research and
development.
Text comprehension was another important area of initial research in artificial intelligence,
with research on text and reading under way at numerous institutions, including Yale,
University of Illinois, and Vanderbilt to name a few. There are numerous scholars of major
influence who were involved at these different institutions, and any effort on my part to
name them all would certainly fail to be exhaustive. A few to note, however, include Roger
Schank, who relocated from Yale to Northwestern University, established the Institute for
the Learning Sciences, and amassed faculty who would subsequently establish what has
become the oldest academic program in the field; Janet Kolodner, who studied case-based
reasoning in AI text-comprehension systems at Yale, moved on to a successful
professorship at Georgia Tech, and was founding editor of the field’s first journal; John
Bransford at Vanderbilt University; and Ann Brown at University of Illinois, who then moved
with her husband, Joseph Campione, to University of California, Berkeley. Schank and
Bransford, with their respective teams at their institutions, were developing new ways to
integrate narrative story structures into technology-enhanced learning environments based
on the discoveries that were being made in text-comprehension and related cognition
research. Brown, with her student Annemarie Palincsar (who moved on to University of
Michigan), worked on extending seminal work on reciprocal teaching (Palincsar & Brown,
1984) to support improvement in text comprehension in real-world classroom
contexts. The desire to use the new tools and techniques that were being developed from
this cognitive research in actual learning settings rather than laboratories had been growing
at all the aforementioned locations and led to the development of a methodological staple in
Learning Sciences research: design-based research (Brown, 1992; Collins, 1992), to be
elaborated upon more below.
Thus far, what one should be able to see from this gloss of Learning Sciences history is the
major areas of research. For instance, cognitive science and artificial intelligence figured
prominently. Understanding how to best model knowledge and understanding in complex
domains continued to be a major strand of research. New technological media and a focus
on children expressing and exploring new ideas through computer programming played
prominently. There were also inclinations to look at story structure as it related to human
memory in order to improve the design of tools and technologies for learning. Finally, there
was a desire to take all these discoveries and findings and try to get them to work in actual
learning settings rather than laboratories. These were not unified positions but rather all
core areas of research and interest in the group that was coming together to establish the
field of Learning Sciences. With that list in mind, and knowing that academic conference
keynote lectures are usually given to high-profile or aspirational figures in the field, we have
some context for the invited keynote addresses at the first ICLS in 1991.
Among those keynote speakers, the Vanderbilt group was represented along with Collins and Soloway.
Andrea diSessa, a prominent and frequently cited scholar in Learning Sciences (Lee, Yuan,
Ye, & Recker, 2016) and in other fields, had completed his PhD at MIT in physics and
worked closely with Seymour Papert. diSessa’s areas of research included students learning
to program and how physics is learned. His academic career is largely associated with the
institution where he spent most of his time as a professor: the University of California,
Berkeley. Other important scholars at this point were Greeno and Scardamalia, who will be
covered in the sections below.
Cognitive science and artificial intelligence were major influences in Learning Sciences, but
contemporary work in the field is not exclusively intelligent tutoring systems, research on
students’ mental models, or how people learn to program or use new digital media. A major,
if not primary, strand of Learning Sciences research is based on a sociocultural perspective
on learning. At times, this maintains an ongoing tension with the cognitive- and AI-oriented
perspectives, and active dialogue continues (diSessa, Levin, & Brown, 2016).
John Seely Brown, mentioned previously as being a key figure in the New England area, was
later brought to the West Coast to work for Xerox PARC (Palo Alto Research Center) and
head the new Institute for Research on Learning (IRL). Part of the activities of the IRL team
at PARC involved studying how to support learning, including in the photocopying business
(Brown & Duguid, 1991). Importantly, the Bay Area location positioned PARC near the
University of California, Berkeley, where scholars like Alan Schoenfeld, Peter Pirolli, Marcia
Linn, Ann Brown, Andrea diSessa, and James Greeno had all been hired into a new program
focusing on education in mathematics, science, and technology.
Of great importance was the presence of Jean Lave, who was also on the faculty at Berkeley.
Lave, an anthropologist by training, had studied how mathematics was done in everyday life,
discovering that what mathematics looked like in practice was very different from how
mathematics understanding was conceptualized by the cognitive psychologists (e.g., Lave,
Murtaugh, & de la Rocha, 1984). Additionally, Lave and Wenger published a seminal
monograph, Situated Learning (1991), summarizing several cases of learning as it took
place in actual communities of practice. The learning involved much more than knowledge
acquisition and instead was better modeled as changes from peripheral to central
participation in a community. Adequately encapsulating the extensive work of Lave,
Wenger, and colleagues is well beyond what can be done in a chapter. However, they
earned the attention of Greeno (Greeno & Nokes-Malach, 2016) and others by suggesting
that entirely different units of analysis were necessary for people to study learning. These
perspectives were largely cultural and social in nature, taking talk and interaction and
material artifacts as they were taken up in practice as critical. At the time, there were also
groundbreaking works published, such as the translation of Lev Vygotsky’s work (1978),
Barbara Rogoff’s studies of real-world apprenticeship (Rogoff, 1990), and Edwin Hutchins’s
bold proposal that AI approaches to cognitive science were being far too restrictive in
recognizing and understanding cognition as it happened “in the wild” (Hutchins, 1995).
These ideas had a great deal of influence on the emerging community of learning scientists,
and the close proximity of the scholars and their ideas led to major public debates about
how learning could best be understood (Anderson, Reder, & Simon, 1996; Greeno, 1997).
The establishment and acceptance of cultural-historical activity theory and the work of
Michael Cole (an institutional colleague of Hutchins) and Yrjö Engeström also figured
prominently as CHAT found a place in education and other scholarly communities. Also
influential was James Wertsch, an anthropologically oriented, cultural historical educational
scholar.
Much of contemporary Learning Sciences research has extended these ideas. Rather than
focusing on knowledge, many learning scientists focus on social practices, whether they be
scientific or mathematical practices, classroom practices, or informal practices. Identity as a
socially constructed and continually mediated construct has become a major concern.
Seeking continuities between cultures (with cultures not necessarily geographic or ethnic
in nature) and discovering how to design activities, tools, or routines that are taken
up by a culture or give greater understanding of how cultures operate remain ongoing
quests. Other concerns include historicity, marginalization of communities, cultural assets
rather than cultural deficits, equity, social justice, and social and material influences on
spaces that are intended to support learning.
Helping people learn and using new technologies remain important themes, but rather than
focusing on computers solely as tutoring systems or spaces where simulations of complex
phenomena can be run, current learning sciences technologies with a sociocultural bent
allow for youth to collect data about their cities and critically examine equity and
opportunity; to become community documentarians and journalists so that local history is
valued and conserved in line with the individual interests of participating youth; to build
custom technologies of students’ own design that better the circumstances of their peers,
homes, and communities; and to obtain records of everyday family or museum or after-
school activities that have embedded within them germs of rich literary, mathematical,
historical, or scientific thought. Current technologies also act as data- and knowledge-
sharing tools that help make invisible practices and routines in schools more visible to
teachers and other educators.
In the early days of Learning Sciences, cognitive and sociocultural perspectives figured
prominently, in addition to the opportunity to look at and modify intact educational systems
rather than relegating research to strictly the laboratory. The relationships being built and
dialogues taking place were critically important, as was the proximity of research centers to
universities that were establishing associated degree programs. However, according to
Stahl (2016), some distance grew after the first ICLS conference. Some of this distance was
geographic, but it also had a great deal to do with what got spotlighted as internally
sanctioned Learning Sciences research. The community that participated in the first ICLS
that began to feel a rift was the Computer Supported Collaborative Learning (CSCL)
community. Many, but not all, scholars in this area were located in Europe.
CSCL, like the rest of the Learning Sciences community, was also seriously interested in
cognition, new technologies, and social contexts of learning. However, one distinguishing
feature of the CSCL community was its focus on technology-mediated group
cognition. Several topics were important for looking at how people
learned together online in designed spaces. Examining conceptual change as it became a
reciprocal and negotiated process between multiple parties using a technology was also part
of this group emphasis. Scripting that informed implicit expectations for how students
would interact and move through collaborative learning activities became a major focus.
Online knowledge building environments with asynchronous participation and online
discourse were also a big focus of CSCL. Ideas about collaborative learning from Naomi
Miyake (Chukyo University, then University of Tokyo, Japan), Jeremy Roschelle (SRI
International, USA), Stephanie Teasley (SRI International, now at University of Michigan,
USA), Claire O’Malley (University of Nottingham, UK), Frank Fischer (Ludwig-Maximilian
University of Munich, Germany), Pierre Dillenbourg (University of Geneva and later at École
Polytechnique Fédérale de Lausanne, Switzerland), Paul Kirschner (Open University,
Netherlands), Gerry Stahl (Drexel University, USA), Marlene Scardamalia and Carl Bereiter
(Ontario Institute for Studies in Education, Canada), and Timothy Koschmann (Southern
Illinois University, USA) were formative.[4] Sometimes classrooms were
the focus, but other learning settings, such as surgical rooms or online forums, became
important research sites as well.
CSCL became a distinct enough strand of research that its own workshop was held in 1992
and then its own conference in 1995. Analyses of networks of collaboration and conference
topics appear in Kienle and Wessner (2006). There were scholars who consistently appeared
at both ICLS and CSCL conferences. Activity in one conference was in no way mutually
exclusive from activity in the other. However, there were eventually contingents that were
more drawn to one community over the other. Ultimately, given deep overlaps and crossover
between CSCL and ICLS, a formal society that oversaw both conference series, the
International Society of the Learning Sciences (ISLS), was established in 2002. Many of the
aforementioned CSCL scholars were elected president of that society as the years
proceeded, and many early graduate students who participated in the formation of these
communities and the Learning Sciences field, who went on to become established scholars
themselves, were elected as well. In 2006, the International Journal of Computer-Supported
Collaborative Learning was established as a leading publication venue, with Gerry Stahl as
founding editor. This was officially sponsored by the ISLS, as was the society’s other
flagship journal that had been operating since 1991, Journal of the Learning Sciences, with
Janet Kolodner as the founding editor.
          Professional Organizations
               International Society of the Learning Sciences
               American Educational Research Association SIGs: Learning Sciences
               and Advanced Technologies for Learning
          Conference Venues
               International Conference of the Learning Sciences
               Computer-Supported Collaborative Learning
          Academic Journals
               Journal of the Learning Sciences
               International Journal of Computer-Supported Collaborative Learning
          Academic Programs and Online Resources
               Network of Academic Programs in the Learning Sciences (NAPLeS)
Design-based Research
The nature of design-based research has been described in many places elsewhere (Cobb,
Confrey, diSessa, Lehrer, & Schauble, 2003; The Design-Based Research Collective, 2003;
Sandoval & Bell, 2004), and new innovations to support that paradigm have been developed
in the more than two and a half decades since it first appeared in academic publications (e.g.,
Sandoval, 2013). The simplest articulation of design-based research is that it involves
researchers working with real educational settings and introducing new tools, practices, or
activities that embody a set of assumptions that exist based on prior research.
For example, one might know from the existing literature that metacognitive support can
improve learning outcomes during laboratory text-comprehension tasks. Rather than accept
that as a given and hope that this finding gets translated on its own into classroom practice,
the aspiring design-based researcher may then design and develop a new software tool that
helps students continually monitor their own understanding and reflect on their own
progress when reading science texts at school. The researcher would then test it informally
to make sure it is usable and make arrangements with a local school to have some of their
English classes use it. Upon bringing it into a school classroom, they discover that the
metacognitive supports are actually confusing and counterproductive in the classroom
because so much depends on whether students find the topic engaging and whether the
teacher can orchestrate a classroom activity to split instructional time such that students
begin by using the tool, participate in a reflective discussion with the teacher, and then
return to the tool. The design-based researcher may discover that, unlike the 15-minute
sessions reported in the existing literature when metacognitive training was done in the lab,
a week is actually required to smoothly implement the tool in the classroom. The teachers
need some help noticing what student comments to build upon in the reflection discussions.
Texts need to be modified to immediately connect more to topics students already know.
In this experience, a well-meaning researcher attempted to take the best of what was known
from prior research and ended up taking participants on a much more complicated journey
than intended. That journey began to reveal how metacognitive activity works in a real
education setting, how software tools should be designed and used in school settings, and
what sorts of things classroom teachers need to do with the software to make it maximally
effective. To verify that these new discoveries are actually valid ones, the researcher
implements some revisions and sees if the expected outcomes emerge. If not, the design-
based researcher repeats, or reiterates, the design work with that classroom.
That cycle is a very general summary of how design-based research unfolds. The researcher
may have varying levels of involvement in the educational setting, where they may provide
some initial professional development or training to a facilitator and then watch what
unfolds later, or they may directly lead the classroom activities themselves. Design-
based research can be a solo endeavor or a major team one. The benefit of this type of
research is that it puts theoretical assertions (e.g., metacognitive supports improve text
comprehension) in harm’s way by allowing for the complexity of the real world to be
introduced. This helps to refine (or even establish) stronger theory that speaks to
complexities of how learning works in different systems. The intact unit could be a single
student, a single classroom, a group of teachers, multiple classrooms, multiple grade bands
in a school, a museum exhibit, a museum floor, an after-school program, a university course,
or an online course. The outcomes of design-based research are articulated especially nicely
by Edelson (2002), who argues that design-based research ultimately produces new
knowledge about domain theories, design frameworks, and design methodologies. diSessa
and Cobb (2004) have also suggested that design-based research can be the locus for new
theoretical constructs to emerge.
As design-based research has matured, some have pushed to broaden its scope to speak to
larger educational systems. Rather than working with individual students or classrooms,
design-based implementation research (DBIR) promotes partnership with educational
institutions such as entire schools or school districts (Penuel, Fishman, Cheng, & Sabelli,
2011). Related design-based approaches also appear as improvement science (Lewis, 2015)
and in research-practice partnerships (Coburn & Penuel, 2016). As of late, these have been
receiving more attention. Optimistically, we could see this as the desire of funding agencies
and academic communities to scale important findings from the past decades of design-
based research and to understand what enables new and powerful tools and activities to
support learning and impact more learners.
Institutional barriers exist that discourage such cross talk. In some cases, strong academic departments have
split because faculty in them felt that LIDT and Learning Sciences were incompatible.
However, there have since been deliberate efforts to close perceived rifts. For example,
Pennsylvania State University made a deliberate effort to hire individuals trained in
Learning Sciences (Chris Hoadley, Brian K. Smith) into their already strong LIDT-oriented
department, and that promoted dialogue and relationship building, although the LS-oriented
faculty composition has since changed. Utah State University hired Mimi Recker, an early
student of the Berkeley program that emerged in the 1990s and subsequently took on a
blended departmental identity (USU ITLS Faculty, 2009). Members of the University of
Georgia Learning and Performance Systems Laboratory (Daniel Hickey and Kenneth Hay)
took positions in a new Learning Sciences program established at Indiana University. The
push for more relationship building is now there.
The future of the relationship between LIDT and Learning Sciences organizations and
programs is ultimately up to those who are currently training as students in those fields. As
someone who has been operating in both spaces, although I was explicitly trained in one, I
understand many barriers are actually illusory. There are different foci and theoretical
commitments and expectations in each field, but both communities deeply care about
learning and how we can build knowledge to improve the tools, practices, and environments
that support it. To gain traction in the other field, people simply start by reserving judgment
and then reading the other field’s core literatures. They start conversations with individuals
who are connected to the other field and initiate collaborations. They get excited about
ideas that other parties are also currently thinking about, and they have dialogue. In fact,
that’s the simplified version of how Learning Sciences began. It could be the beginning of
the history for a new multidisciplinary field in the future as well.
References
Anderson, J. R., Reder, L. M., & Simon, H. A. (1996). Situated learning and education.
Educational Researcher, 25(4), 5–11.
Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in
educational research. Educational Researcher, 32(1), 9–13.
Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.),
New directions in educational technology (pp. 15–22). Berlin: Springer-Verlag.
diSessa, A. A., & Cobb, P. (2004). Ontological innovation and the role of theory in design
experiments. Journal of the Learning Sciences, 13(1), 77–103.
diSessa, A. A., Levin, M., & Brown, N. J. S. (Eds.). (2016). Knowledge and Interaction: A
Synthetic Agenda for the Learning Sciences. New York, NY: Taylor & Francis.
Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of
the Learning Sciences, 11(1), 105–121.
Evans, M. A., Packer, M. J., & Sawyer, R. K. (Eds.). (2016). Reflections on the Learning
Sciences. Cambridge University Press.
Fischer, F., Hmelo-Silver, C. E., Goldman, S. R., & Reimann, P. (Eds.). (in press).
International Handbook of the Learning Sciences. New York: Routledge.
Greeno, J. G. (1997). On claims that answer the wrong questions. Educational Researcher,
26(1), 5–17.
Greeno, J. G., & Nokes-Malach, T. J. (2016). Some Early Contributions to the Situative
Perspective on Learning and Cognition. In M. A. Evans, M. J. Packer, & R. K. Sawyer (Eds.),
Reflections on the Learning Sciences (pp. 59–75). Cambridge: Cambridge University Press.
Hoadley, C. M. (2004). Learning and design: Why the learning sciences and instructional
systems need each other. Educational Technology, 44(3), 6–12.
USU ITLS Faculty. (2009). What’s in a name? An identity shift at Utah State University.
Educational Technology, 49(4), 38–41.
Kirby, J. A., Hoadley, C. M., & Carr-Chellman, A. A. (2005). Instructional systems design and
the learning sciences: A citation analysis. Educational Technology Research and
Development, 53(1), 37–47. doi:10.1007/BF02504856
Kienle, A., & Wessner, M. (2006). The CSCL community in its first decade: Development,
continuity, connectivity. International Journal of Computer-Supported Collaborative
Learning, 1(1), 9–33.
Kolodner, J. L. (2004). The learning sciences: Past, present, and future. Educational
Technology, 44(3), 37–42.
Lave, J., Murtaugh, M., & de la Rocha, O. (1984). The dialectic of arithmetic in grocery
shopping. In B. Rogoff & J. Lave (Eds.), Everyday cognition: Its development in social
context (pp. 67–94). Cambridge, MA: Harvard University Press.
Lave, J., & Wenger, E. (1991). Situated Learning: Legitimate peripheral participation.
Cambridge: Cambridge University Press.
Lee, V. R., Yuan, M., Ye, L., & Recker, M. (2016). Reconstructing the influences on and focus
of the Learning Sciences from the field’s published conference proceedings In M. A. Evans,
M. J. Packer, & R. K. Sawyer (Eds.), Reflections on the Learning Sciences (pp. 105–125).
New York, NY: Cambridge University Press.
Papert, S. (1980). Mindstorms : children, computers, and powerful ideas. New York, NY:
Basic Books.
Pea, R. D. (2016). The Prehistory of the Learning Sciences. In M. A. Evans, M. J. Packer, &
R. K. Sawyer (Eds.), Reflections on the Learning Sciences (pp. 32–59). Cambridge:
Cambridge University Press.
Penuel, W. R., Fishman, B. J., Cheng, B. H., & Sabelli, N. (2011). Organizing research and
development at the intersection of learning, implementation, and design. Educational
Researcher, 40(7), 331–337.
Sandoval, W. A., & Bell, P. (2004). Design-Based Research Methods for Studying Learning in
Context: Introduction. Educational Psychologist, 39(4), 199–201.
doi:10.1207/s15326985ep3904_1
Sawyer, R. K. (Ed.). (2006). The Cambridge Handbook of the Learning Sciences. Cambridge
University Press.
(Eds.), Reflections on the Learning Sciences (pp. 19–31). Cambridge: Cambridge University
Press.
Stahl, G. (2016). The Group as Paradigmatic Unit of Analysis: The Contested Relationship of
Computer-Supported Collaborative Learning to the Learning Sciences. In M. A. Evans, M. J.
Packer, & R. K. Sawyer (Eds.), Reflections on the Learning Sciences (pp. 76–104).
Cambridge: Cambridge University Press.
                               Victor Lee
LIDT Timeline
Editor’s Note
    The following timeline was created by students in the Instructional Psychology and
    Technology department at Brigham Young University.
[https://edtechbooks.org/-dYS]
http://bit.ly/2yk76If
Twenty Years of EdTech
Martin Weller
Editor’s Note
The following was originally published by Educause with the following citation:
    Weller, M. (2018, July 2). 20 Years of EdTech. EDUCAUSE Review 53(4). Available
    at https://er.educause.edu/articles/2018/7/twenty-years-of-edtech
    [https://edtechbooks.org/-HW]
An opinion often cited among educational technology (edtech) professionals is that theirs is
a fast-changing field. This statement is sometimes used as a motivation (or veiled threat) to
senior managers to embrace edtech because if they miss out now, it’ll be too late to catch
up. However, amid this breathless attempt to keep abreast of new developments, the edtech
field is remarkably poor at recording its own history or reflecting critically on its
development. When Audrey Watters recently put out a request for recommended books on
the history of educational technology,[1] I couldn’t come up with any beyond the handful
she already had listed. There are edtech books that often start with a historical chapter to
set the current work in context, and there are edtech books that are now part of history, but
there are very few edtech books dealing specifically with the field’s history. Maybe this
reflects a lack of interest, as there has always been something of a year-zero mentality in
the field. Edtech is also an area to which people come from other disciplines, so there is no
shared set of concepts or history. This can be liberating but also infuriating. I’m sure I was
not alone in emitting the occasional sigh when during the MOOC rush of 2012, so many
“new” discoveries about online learning were reported—discoveries that were already tired
concepts in the edtech field.
Revisiting this history is not merely an exercise in nostalgia
(although comparing horror stories about metadata fields is enjoyable); it also allows us to
examine what has changed, what remains the same, and what general patterns can be
discerned from this history. Although the selection is largely a personal one, it should
resonate here and there with most practitioners in the field. I have also been rather
arbitrary in allocating a specific year: the year is not when a particular technology was
invented but, rather, when it became—in my view—significant.
Looking back twenty years starts in 1998, when the web had reached a level of mainstream
awareness. It was accessed through dial-up modems, and there was a general sense of
puzzlement about what it would mean, both for society more generally and for higher
education in particular. Some academics considered it to be a fad. One colleague dismissed
my idea of a fully online course by declaring: “No one wants to study like that.” But the
potential of the web for higher education was clear, even if the direction this would take
over the next twenty years was unpredictable.
1998: Wikis
Perhaps more than any other technology, wikis embody the spirit of optimism and
philosophy of the open web. The wiki—a web page that could be jointly edited by
anyone—was a fundamental shift in how we related to the internet. The web democratized
publishing, and the wiki made the process a collaborative, shared enterprise. In 1998 wikis
were just breaking through. Ward Cunningham is credited with inventing them (and the
term) in 1994. Wikis had their own markup language, which made them a bit technical to
use, although later implementations such as Wikispaces made the process easier. Wikis
encapsulated the promise of a dynamic, shared, respectful space—the result partly of the
ethos behind them (after all, they were named after the Hawaiian word for quick) and partly
of their technical infrastructure. Users can track edits, roll back versions, and monitor
contributions. Accountability and transparency are built in.
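Wiki engines differ in their details, but the revision model sketched above — every edit appended to a visible history, with rollback restoring an earlier version rather than erasing anything — can be illustrated with a few lines of Python. This is a toy illustration only, not the implementation of any real wiki engine; all class and method names here are invented.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class Revision:
    author: str
    text: str
    timestamp: datetime

@dataclass
class WikiPage:
    title: str
    revisions: List[Revision] = field(default_factory=list)

    def edit(self, author: str, text: str) -> None:
        # Edits are appended, never overwritten, so who changed
        # what (and when) is always visible in the history.
        self.revisions.append(Revision(author, text, datetime.now(timezone.utc)))

    @property
    def current(self) -> str:
        return self.revisions[-1].text if self.revisions else ""

    def rollback(self, author: str, steps: int = 1) -> None:
        # A rollback is itself recorded as a new revision: the old
        # text is restored without rewriting the page's history.
        target = self.revisions[-(steps + 1)]
        self.edit(author, target.text)

page = WikiPage("Quick")
page.edit("ward", "Wikis are quick.")
page.edit("anon", "Wikis are slow.")
page.rollback("ward")  # restores ward's text as a third revision
```

The design choice worth noticing is that rollback appends rather than deletes: the vandalized revision stays in the record, which is exactly what makes the accountability and transparency described above possible.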
With Wikipedia now the default global knowledge source, offering over 5.5 million articles
in English alone, it would seem churlish to complain that wikis failed to fulfil
their potential. Nevertheless, that statement is probably true in terms of the use of wikis in
teaching. For instance, why aren’t MOOCs conducted in wikis? It’s not necessarily that wikis
as a technology have not fully realized their potential. Rather, the approach to edtech they
represent—cooperative and participatory—has been replaced by a broadcast, commercial
publisher model.
1999: E-learning
E-learning had been in use as a term for some time by 1999, but the rise of the web and the
prefix of “e” to everything saw it come to prominence. By 1999, e-learning was knocking on
the door of, if not already becoming part of, the mainstream. Conventional and distance
colleges and universities were adopting e-learning programs, often where the target audience was willing to learn this way. One of the interesting aspects of e-learning was
               Foundations of Learning and Instructional Design Technology
the consideration of costs. The belief was that e-learning would be cheaper than traditional
distance-education courses. It wasn’t, although e-learning did result in a shift in costs:
institutions could spend less in production (by not using physical resources and by reusing
material), but there was a consequent increase in presentation costs (from support costs
and a more rapid updating cycle). This cost argument continues to recur and was a
significant driver for MOOCs (see year 2012).
E-learning set the framework for the next decade in terms of technology, standards, and approaches—a period that represents, in some respects, the golden age of e-learning.
2000: Learning Objects
      There are thousands of colleges and universities, each of which teaches, for
      example, a course in introductory trigonometry. Each such trigonometry course
      in each of these institutions describes, for example, the sine wave function. . . .
      Now for the premise: the world does not need thousands of similar descriptions
      of sine wave functions available online. Rather, what the world needs is one, or
      maybe a dozen at most, descriptions of sine wave functions available online. The
      reasons are manifest. If some educational content, such as a description of sine
      wave functions, is available online, then it is available worldwide.2
This made a lot of sense then, and it still makes a lot of sense today. A learning object was roughly defined as “a digitized entity which can be used, reused or referenced during technology supported learning.”3 But learning objects never really took off, despite the compelling rationale for their existence. The failure to make them a reality is instructive for all in the edtech field. They failed to achieve wide-scale adoption for a number of reasons, including over-engineering, debates around definitions, the reusability paradox,4 and the fact that they were an alien concept for many educators who were already overloaded.
Nevertheless, the core idea of learning objects would resurface in different guises.
2001: E-learning Standards
E-learning standards are an interesting case study in edtech. Good standards retreat into the background and just help things work, as SCORM has done. But other standards have failed in some of their ambitions to create easily assembled, discoverable, plug-and-play content. So while the standards community continues to work, it has encountered problems with vendors5 and has been surpassed in popular usage by the less specific but more human description and sharing approach that underpinned the web 2.0 explosion (see year 2006).
2002: OER
Like learning objects, the software approach (in particular, open-source software) provides the roots for OER. The open-source movement can be seen as creating the context within which open education could flourish, partly by analogy and partly by establishing a precedent. But there is also a very direct link, via David Wiley, through the development of licenses.6 In 1998 Wiley became interested in developing an open license for educational content, and he directly contacted pioneers in the open-source world. Out of this came the Open Content License (OCL), which he developed with publishers to establish the Open Publication License (OPL) the next year.
The OPL proved to be one of the key components, along with the Free Software
Foundation’s GNU license, of the Creative Commons licenses,
[https://creativecommons.org/] developed by Larry Lessig and others in 2002. These went on
to become essential in the open-education movement. The simple licenses in Creative
Commons allowed users to easily share resources, and OER became a global movement.
Although OER have not transformed higher education in quite the way many envisaged in
2002 and many projects have floundered after funding ends, the OER idea continues to be
relevant, especially through open textbooks and open educational practice (OEP).
The general lesson from OER is that they succeeded where learning objects failed because OER tapped into existing practice (and open textbooks doubly so). The concept of using a
license to openly share educational content is alien enough, without all the accompanying
standards and concepts associated with learning objects. Patience is required: educational
transformation is a slow burn.
2003: Blogs
Blogging developed alongside the more education-specific developments and was then co-opted into edtech. In so doing, it foreshadowed many of the web 2.0 developments, with which it is often bundled.
Blogging was a very obvious extension of the web. Once people realized that anyone could
publish on the web, they inevitably started to publish diaries, journals, and regularly
updated resources. Blogging emerged from a simple version of “here’s my online journal”
when syndication became easy to implement. The advent of feeds, and particularly the
universal standard RSS, provided a means for readers to subscribe to anyone’s blog and
receive regular updates. This was as revolutionary as the liberation that web publishing
initially provided. If the web made everyone a publisher, RSS made everyone a distributor.
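To make that mechanism concrete, here is a small sketch of what an RSS 2.0 feed contains and how a reader extracts the items from it. The feed below is invented for illustration (the blog, URLs, and titles are not real), but the structure follows the standard channel-and-items shape, and the parsing uses only Python's standard library:

```python
import xml.etree.ElementTree as ET

# A minimal, made-up RSS 2.0 feed of the kind a blog publishes at a
# well-known URL; feed readers poll it and notify subscribers of new items.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>An EdTech Blog</title>
    <link>https://example.org/blog</link>
    <description>Posts about educational technology</description>
    <item>
      <title>Why wikis still matter</title>
      <link>https://example.org/blog/wikis</link>
      <pubDate>Mon, 01 Jan 2018 09:00:00 GMT</pubDate>
    </item>
    <item>
      <title>Twenty years of edtech</title>
      <link>https://example.org/blog/twenty-years</link>
      <pubDate>Tue, 02 Jan 2018 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>"""

channel = ET.fromstring(FEED).find("channel")
# Each <item> is one post; a feed reader compares these against what it
# has already seen and surfaces only the new ones to the subscriber.
posts = [(item.findtext("title"), item.findtext("link"))
         for item in channel.findall("item")]
for title, link in posts:
    print(title, "->", link)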
People swiftly moved beyond journals. After all, what area isn’t impacted by the ability to
create content freely, whenever you want, and have it immediately distributed to your
audience? Blogs and RSS-type distribution were akin to giving everyone superhero powers.
It’s not surprising that in 2018, we’re still wrestling with the implications. No other edtech
has continued to develop and solidify (as the proliferation of WordPress sites attests) and
also remain so full of potential. For almost every edtech that comes along—e-portfolios,
VLEs, MOOCs, OER, social media—I find myself thinking that a blog version would be
better. Nothing develops and anchors an online identity quite like a blog.
2004: LMS
As e-learning became more integral to both blended-learning and fully-online courses, this
variety and reliability became a more critical issue. The LMS offered a neat collection of the
most popular tools, any one of which might not be as good as the best-of-breed specific tool
but was good enough. The LMS allowed for a single, enterprise solution with the associated
training, technical support, and helpdesk. The advantage was that e-learning could be
implemented more quickly across an entire institution. However, over time this has come to
be seen more as a Faustian pact as institutions found themselves locked into contracts with vendors, most famously with providers (e.g., Blackboard) that attempted to file restrictive patents.7 More problematically, the LMS has become the only route for delivering e-learning in many institutions, with a consequent loss of expertise and innovation.8
2005: Video
YouTube was founded in 2005, which seems surprisingly recent, so much has it become a
part of the cultural landscape. As internet access improved and compression techniques along with it, streaming video had become viable for many users by 2005. YouTube and other video-sharing services flourished, and the realization that
anyone could make a video and share it easily was the next step in the broadcast
democratization that had begun with HTML. While the use of video in education was often
restricted to broadcast, this was a further development on the learning objects idea. As the
success of the Khan Academy [https://www.khanacademy.org/] illustrates, simple video
explanations of key concepts—explanations that can be shared and embedded easily—met a
great educational demand. However, colleges and universities for the most part still do not
assess students on their use of video. In some disciplines, such as the arts, this is more
common, but in 2018, text remains the dominant communication form in education.
Although courses such as DS106 have innovated in this area,9 many students will go
through their education without being required to produce a video as a form of assessment.
We need to fully develop the critical structures for video in order for it to fulfil its
educational potential, as we have already done for text.
2006: Web 2.0
Just as the fascination with e-learning had seen every possible term prefixed with “e,” so the addition of “2.0” to any educational term made it fashionable. But soon the boom was followed by the consequent bust (a business plan was needed after all), and problems with some of the core concepts meant that by 2009, web 2.0 was being declared dead.12 Inherent in much of the web 2.0 approach was a free service, which inevitably led to data being the key source for revenue and gave rise to the oft-quoted line “If you’re not paying for it, you’re the product being sold.”13 As web 2.0 morphed into social media, the
inherent issues around free speech and offensive behavior came to the fore. In educational
terms, this raises issues about duty of care for students, recognizing academic labor, and
marginalized groups. The utopia of web 2.0 turned out to be one with scant regard for
employment laws and largely reserved for “tech bros.”
Nevertheless, at the time, web 2.0 posed a fundamental question as to how education
conducts many of its cherished processes. Peer review, publishing, ascribing quality—all of
these were founded on what David Weinberger referred to as filtering on the way in rather than on the way out.14 While the quality of much online content was poor, there was
always an aspect of what was “good enough” for any learner. With the demise of the
optimism around web 2.0, many of the accompanying issues it raised for higher education
have largely been forgotten—before they were even addressed. For instance, while the open
repository for physics publications (arXiv [https://arxiv.org/]) and open-access methods for
publication became mainstream, the journal system is still dominant, largely based on
double-blind, anonymous peer review. Integrating into the mainstream the participatory
culture that web 2.0 brought to the fore remains both a challenge and an opportunity for
higher education.
2008: E-portfolios
Like learning objects, e-portfolios were backed by a sound idea. The e-portfolio was a place
to store all the evidence a learner gathered to exhibit learning, both formal and informal, in
order to support lifelong learning and career development. But like learning objects—and
despite academic interest and a lot of investment in technology and standards—e-portfolios
did not become the standard form of assessment as proposed. Many of their problems were
similar to those that beleaguered learning objects, including overcomplicated software, an
institutional rather than a user focus, and a lack of accompanying pedagogical change.
Although e-portfolio tools remain pertinent for many subjects, particularly vocational ones,
for many students owning their own domain and blog remains a better route to establishing
a lifelong digital identity. It is perhaps telling that although many practitioners in higher
education maintain blogs, asking to see a colleague’s e-portfolio is likely to be met with a
blank response.
2010: Connectivism
The early enthusiasm for e-learning saw a number of pedagogies resurrected or adopted to
meet the new potential of the digital, networked context. Constructivism, problem-based
learning, and resource-based learning all saw renewed interest as educators sought to
harness the possibility of abundant content and networked learners. Yet connectivism, as
proposed by George Siemens and Stephen Downes in 2004–2005, could lay claim to being
the first internet-native learning theory. Siemens defined connectivism as “the integration of
principles explored by chaos, network, and complexity and self-organization theories.
Learning is a process that occurs within nebulous environments of shifting core
elements—not entirely under the control of the individual.”15 Further investigating the possibility of networked learning led to the creation of the early MOOCs, including influential open courses by Downes and Siemens in 2008 and 2009.16 Pinning down
exactly what connectivism was could be difficult, but it represented an attempt to rethink
how learning is best realized given the new realities of a digital, networked, open
environment, as opposed to forcing technology into the service of existing practices. It also
provided the basis for MOOCs, although the approach they eventually adopted was far
removed from connectivism (see 2012).
2011: PLE
Personal Learning Environments (PLEs) were an outcome of the proliferation of services
that suddenly became available following the web 2.0 boom. Learners and educators began
to gather a set of tools to realize a number of functions. In edtech, the conversation turned
to whether these tools could be somehow “glued” together in terms of data. Instead of
talking about one LMS provided to all students, we were discussing how each learner had their own particular blend of tools. Yet beyond a plethora of spoke diagrams, each showing a different collection of icons, the PLE concept didn’t really develop after its peak in 2011. The problem was that passing along data was not a trivial task, and we soon became wary about applications that shared data (although perhaps not wary enough, given recent news regarding Cambridge Analytica17). Also, providing a uniform offering and
support for learners was difficult when they were all using different tools. The focus shifted
from a personalized set of tools to a personalized set of resources, and in recent years this
has become the goal of personalization.
2012: MOOCs
Inevitably, 2012 will be seen as the year of MOOCs.18 In many ways the MOOC phenomenon can be viewed as the combination of several preceding technologies: some of the open approach of OER, the application of video, the experimentation of connectivism, and the revolutionary hype of web 2.0. Clay Shirky mistakenly proclaimed that MOOCs were the internet happening to education.19 If he’d been paying attention, he would have seen that this had been happening for some time. Rather, MOOCs were Silicon Valley happening to education. Once Stanford Professor Sebastian Thrun’s course had attracted over 100,000 learners and almost as many headlines,20 the venture capitalist investment
flooded in.
Much has been written about MOOCs, more than I can do justice to here. They are a case
study still in the making. The raised profile of open education and online learning caused by
MOOCs may be beneficial in the long run, but the MOOC hype (only ten global providers of higher education by 2022?)21 may be equally detrimental. The edtech field needs to
learn how to balance these developments. Millions of learners accessing high-quality
material online is a positive, but the rush by colleges and universities to enter into
prohibitive contracts, outsource expertise, and undermine their own staff has long-term
consequences as well.
2013: Open Textbooks
open textbooks at the center.22 However, the possible drawback is that like LMSs, open
textbooks may not become a stepping-stone on the way to a more innovative, varied
teaching approach but, rather, may become an end point in themselves.
2015: Digital Badges
Of these challenges, only the first relates directly to technology; the more substantial ones relate to awareness and legitimacy. For example, if employers or institutions come to widely accept and value digital badges, then they will gain credence with learners, creating a virtuous circle. There is some movement in this area, particularly with regard to staff development within organizations and often linked with MOOCs.24 Perhaps more
interesting is what happens when educators design for badges, breaking courses down into
smaller chunks with associated recognition, and when communities of practice give badges
value. Currently, their use is at an indeterminate stage—neither a failed enterprise nor the
mainstream adoption once envisaged.
2016: The Return of AI
with the possible development of intelligent tutoring systems. The initial enthusiasm for these systems has waned somewhat, mainly because they worked for only very limited, tightly specified domains. The designer needed to predict the types of errors people would make in order to provide advice on how to rectify those errors. And in many subjects (the humanities in particular), people are very creative in the errors they make, and more significantly, what constitutes the right answer is less well defined.
Interest in AI faded as interest in the web and related technologies increased, but it has
resurfaced in the past five years or so. What has changed over this intervening period is the
power of computation. This helps address some of the complexity because multiple
possibilities and probabilities can be accommodated. Here we see a recurring theme in
edtech: nothing changes while, simultaneously, everything changes. AI has definitely
improved since the 1990s, but some of its fundamental problems remain. It always seems to
be a technology that is just about to break out of the box.
More significant than the technological issues are the ethical ones. As Audrey Watters
contends, AI is ideological.25 The concern about AI is not that it won’t deliver on the
promise held forth by its advocates but, rather, that someday it will. And then the
assumptions embedded in code will shape how education is realized, and if learners don’t fit
that conceptual model, they will find themselves outside of the area in which compassion
will allow a human to alter or intervene. Perhaps the greatest contribution of AI will be to
make us realize how important people truly are in the education system.
2017: Blockchain
Of all the technologies listed here, blockchain is perhaps the most perplexing, both in how it
works and in why it is even in this list. In 2016 several people independently approached me
about blockchain—the distributed, secure ledger for keeping the records that underpin
Bitcoin. The question was always the same: “Could we apply this in education somehow?”
The imperative seemed to be that blockchain was a cool technology, and therefore there
must be an educational application. It could provide a means of recording achievements and
bringing together large and small, formal and informal, outputs and recognition.26
Viewed in this way, blockchain is attempting to bring together several issues and
technologies: e-portfolios, with the aim to provide an individual, portable record of
educational achievement; digital badges, with the intention to recognize informal learning;
MOOCs and OER, with the desire to offer varied informal learning opportunities; PLEs and
personalized learning, with the idea to focus more on the individual than on an institution. A
personal, secure, permanent, and portable ledger may well be the ring to bind all these
together. However, the history of these technologies should also be a warning for
blockchain enthusiasts. With e-portfolios, for instance, even when there is a clear
connection to educational practice, adoption can be slow, requiring many other components
to fall into place. In 2018 even the relatively conservative and familiar edtech of open
textbooks is far from being broadly accepted. Attempting to convince educators that a
complex technology might solve a problem they don’t think they have is therefore unlikely to
meet with widespread support.
If blockchain is to realize any success, it will need to work almost unnoticed; it will succeed
only if people don’t know they’re using blockchain. Nevertheless, many who propose
blockchain display a definite evangelist’s zeal. They desire its adoption as an end goal in
itself, rather than as an appropriate solution to a specific problem.
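Stripped of the hype, the underlying ledger mechanism is simple enough to sketch. The toy example below (a deliberately minimal illustration, not Bitcoin's actual protocol, and with invented achievement records) chains each entry to the hash of the one before it, so tampering with any earlier record is detectable by anyone who re-verifies the chain:

```python
import hashlib
import json


def block_hash(record, prev_hash):
    """Hash a record together with its predecessor's hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()


def append(ledger, record):
    # Each new block commits to the hash of the block before it.
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    ledger.append({"record": record, "prev": prev,
                   "hash": block_hash(record, prev)})


def verify(ledger):
    """A ledger is valid only if every link in the chain still holds."""
    prev = "0" * 64
    for block in ledger:
        if block["prev"] != prev or block["hash"] != block_hash(block["record"], prev):
            return False
        prev = block["hash"]
    return True


ledger = []
append(ledger, "Completed: Introduction to Open Education (MOOC)")
append(ledger, "Badge: Peer Mentoring, awarded by study group")
assert verify(ledger)

# Tampering with an earlier record breaks the chain for everyone.
ledger[0]["record"] = "Completed: PhD"
assert not verify(ledger)
```

Real blockchains add distribution and consensus on top of this hash chain, but the educational-record proposals rest on exactly this property: a permanent, tamper-evident log of achievements.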
2018: TBD
We’re only halfway through 2018, so it would be premature to select a technology, theory,
or concept for the year. But one aspect worth considering is what might be termed the dark
side of edtech. Given the use of social media for extremism, data scares such as the
Facebook breach by Cambridge Analytica, anxieties about Russian bots, concerted online
abuse, and increased data surveillance, the unbridled optimism that technology will create
an educational utopia now seems naïve. It is not just informed critics such as Michael
Caulfield27 who are warning of the dangers of overreliance on and trust in edtech; the
implicit problems are now apparent to most everyone in the field. In 2018, edtech stands on
the brink of a new era, one that has a substantial underpinning of technology but that needs
to build on the ethical, practical, and conceptual frameworks that combat the nefarious
applications of technology.
Conclusion
Obviously, one or two paragraphs cannot do justice to technologies that require several
books each, and my list has undoubtedly omitted several important developments (e.g.,
gaming, edupunk, automatic assessment, virtual reality, and Google might all be
contenders). However, from this brief overview, a number of themes can be extracted to
help inform the next twenty years.
The first of these is that in edtech, the tech part of the phrase walks taller. In my list, most
of the innovations are technologies. Sometimes these come with strong accompanying
educational frameworks, but other times they are a technology seeking an application. This
is undoubtedly a function of my having lived through the first flush of the digital revolution.
A future list may be better balanced with conceptual frameworks, pedagogies, and social
movements.
Second, several ideas recur, with increasing success in their adoption. Learning objects
were the first attempt at making teaching content reusable, and even though they weren’t
successful, the ideas they generated led to OER, which begat open textbooks. So, those who
have been in the edtech field for a while should be wary of dismissing an idea by saying:
“We tried that; it didn’t work.” Similarly, those proposing a new idea need to understand
why previous attempts failed.
Third, technology outside of education has consistently been co-opted for educational
purposes. This has met with varying degrees of success. Blogs, for instance, are an ideal
educational technology, whereas Second Life didn’t achieve sustainable adoption. The
popularity of—or the number of Wired headlines about—a technology does not automatically
make it a contender as a useful technology for education.
This leads into the last point: education is a complex, highly interdependent system. It is not
like the banking, record, or media industries. The simple transfer of technology from other
sectors often fails to appreciate the sociocultural context in which education operates.
Generally, only those technologies that directly offer an improved, or alternative, means of
addressing the core functions of education get adopted. These core functions can be
summarized as content, delivery, and recognition.28 OER, LMS, and online assessment
all directly map onto these functions. Yet even when there is a clear link, such as between e-
portfolios and recognition, the required cultural shifts can be more significant. Equally,
edtech has frequently failed to address the social impact of advocating for or implementing
a technology beyond the higher education sector. MOOCs, learning analytics, AI, social
media—the widespread adoption of these technologies leads to social implications that
higher education has been guilty of ignoring. The next phase of edtech should be framed
more as a conversation about the specific needs of higher education and the responsibilities
of technology adoption.
When we look back twenty years, the picture is mixed. Clearly, a rapid and fundamental
shift in higher education practice has taken place, driven by technology adoption. Yet at the
same time, nothing much has changed, and many edtech developments have failed to have
significant impact. Perhaps the overall conclusion, then, is that edtech is not a game for the
impatient.
Notes
   1. Audrey Watters, “What Are the Best Books about the History of Education Technology? [https://edtechbooks.org/-SIc]” Hack Education (blog), April 5, 2008.
   2. Stephen Downes, “Learning Objects: Resources for Distance Education Worldwide [https://edtechbooks.org/-mi],” International Review of Research in Open and Distributed Learning 2, no. 1 (2001).
   3. Robin Mason and D. Rehak, “Keeping the Learning in Learning Objects,” in Allison Littlejohn, ed., Reusing Online Resources: A Sustainable Approach to E-Learning (London: Kogan Page, 2003).
   4. David Wiley, “The Reusability Paradox [https://edtechbooks.org/-xZi]” (August 2002).
   5. See, e.g., Michael Feldstein, “How and Why the IMS Failed with LTI 2.0 [https://edtechbooks.org/-Ibn],” e-Literate (blog), November 6, 2017.
   6. David Wiley, keynote address [https://edtechbooks.org/-BNN], OER18, Bristol, UK, April 19, 2018.
Suggested Citation
                            Martin Weller
He is Director of the OER Research Hub and the ICDE Chair in OER. He regularly provides workshops and keynote talks on the use of social media by academics, open education, and online learning. He blogs
at http://blog.edtechie.net/about/.
                II. Learning and Instruction
Many of the activities that LIDT professionals engage in are also completed by other
professionals, such as web designers, curriculum writers, multimedia developers, and
teachers. A powerful difference for LIDT professionals is our understanding of learning and
instructional theory, and our efforts to apply these theories to our LIDT practice. For this
reason, understanding what psychology and science can teach us about how people learn,
and how good instruction is provided, is critical to any effective LIDT professional. The
chapters in this section serve only as a basic starting point for your pursuit of understanding in this area. You will learn about how the mind works and remembers information, and about emotional factors in learning such as motivation and self-efficacy. I have included a classic article by Peg Ertmer and Tim Newby on the "Big 3" learning theories of behaviorism, cognitivism, and constructivism, and a new chapter on sociocultural learning theories, which extend beyond the Big 3. Included are a few chapters on more recent theoretical developments in the areas of informal learning, internet-based learning (connectivism), learning communities, and creative learning. Finally, two chapters are included on instructional theory: one from Charles Reigeluth, who edited several editions of the book Instructional-Design Theories and Models, and one from David Merrill, whose First Principles of Instruction is perhaps the most well-known instructional framework in our field.
Additional Reading
https://edtechbooks.org/-iT
Memory
Editor's Notes
    Spielman, R. M., Dumper, K., Jenkins, W., Lacombe, A., Lovett, M., & Perlmutter,
    M. (n.d.). How memory functions. In Psychology. Retrieved from
    https://edtechbooks.org/-vG
Encoding
We get information into our brains through a process called encoding, which is the input of
information into the memory system. Once we receive sensory information from the
environment, our brains label or code it. We organize the information with other similar
information and connect new concepts to existing concepts. Encoding information occurs
through automatic processing and effortful processing.
If someone asks you what you ate for lunch today, more than likely you could recall this
information quite easily. This is known as automatic processing, or the encoding of details
like time, space, frequency, and the meaning of words. Automatic processing is usually done
without any conscious awareness. Recalling the last time you studied for a test is another
example of automatic processing. But what about the actual test material you studied? It
probably required a lot of work and attention on your part in order to encode that
information. This is known as effortful processing.
What are the most effective ways to ensure that important memories are well encoded?
Even a simple sentence is easier to recall when it is meaningful (Anderson, 1984). Read the
following sentences (Bransford & McCarrell, 1974), then look away and count backwards
from 30 by threes to zero, and then try to write down the sentences (no peeking!).
How well did you do? By themselves, the statements that you wrote down were most likely
confusing and difficult for you to recall. Now, try writing them again, using the following
prompts: bagpipe, ship christening, and parachutist. Next count backwards from 40 by
fours, then check yourself to see how well you recalled the sentences this time. You can see
that the sentences are now much more memorable because each of the sentences was
placed in context. Material is far better encoded when you make it meaningful.
There are three types of encoding. The encoding of words and their meaning is known as
semantic encoding. It was first demonstrated by William Bousfield (1953) in an experiment
in which he asked people to memorize words. The 60 words were actually divided into 4
categories of meaning, although the participants did not know this because the words were
randomly presented. When they were asked to remember the words, they tended to recall
them in categories, showing that they paid attention to the meanings of the words as they
learned them.
Visual encoding is the encoding of images, and acoustic encoding is the encoding of sounds,
words in particular. To see how visual encoding works, read over this list of words: car,
level, dog, truth, book, value. If you were asked later to recall the words from this list, which
ones do you think you’d most likely remember? You would probably have an easier time
recalling the words car, dog, and book, and a more difficult time recalling the words level,
truth, and value. Why is this? Because you can recall images (mental pictures) more easily
than words alone. When you read the words car, dog, and book you created images of these
things in your mind. These are concrete, high-imagery words. On the other hand, abstract
words like level, truth, and value are low-imagery words. High-imagery words are encoded
both visually and semantically (Paivio, 1986), thus building a stronger memory.
Now let’s turn our attention to acoustic encoding. You are driving in your car and a song
comes on the radio that you haven’t heard in at least 10 years, but you sing along, recalling
every word. In the United States, children often learn the alphabet through song, and they
learn the number of days in each month through rhyme: “Thirty days hath September, /
April, June, and November; / All the rest have thirty-one, / Save February, with twenty-eight
days clear, / And twenty-nine each leap year.” These lessons are easy to remember because
of acoustic encoding. We encode the sounds the words make. This is one of the reasons why
much of what we teach young children is done through song, rhyme, and rhythm.
Which of the three types of encoding do you think would give you the best memory of verbal
information? Some years ago, psychologists Fergus Craik and Endel Tulving (1975)
conducted a series of experiments to find out. Participants were given words along with
questions about them. The questions required the participants to process the words at one
of the three levels. The visual processing questions included such things as asking the
participants about the font of the letters. The acoustic processing questions asked the
participants about the sound or rhyming of the words, and the semantic processing
questions asked the participants about the meaning of the words. After participants were
presented with the words and questions, they were given an unexpected recall or
recognition task.
Words that had been encoded semantically were better remembered than those encoded
visually or acoustically. Semantic encoding involves a deeper level of processing than the
shallower visual or acoustic encoding. Craik and Tulving concluded that we process verbal
information best through semantic encoding, especially if we apply what is called the self-
reference effect. The self-reference effect is the tendency for an individual to have better
memory for information that relates to oneself in comparison to material that has less
personal relevance (Rogers, Kuiper & Kirker, 1977).
Storage
Once the information has been encoded, we somehow have to retain it. Our brains take the
encoded information and place it in storage. Storage is the creation of a permanent record
of information.
In order for a memory to go into storage (i.e., long-term memory), it has to pass through
three distinct stages: Sensory Memory, Short-Term Memory, and finally Long-Term
Memory. These stages were first proposed by Richard Atkinson and Richard Shiffrin (1968).
Their model of human memory (Figure 1), called Atkinson-Shiffrin (A-S), is based on the
belief that we process memories in the same way that a computer processes information.
    Figure 1. Atkinson & Shiffrin Memory Model. Created by Dkahng and available on
    Wikimedia Commons under a CC-BY, Share Alike license.
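The computer analogy behind the A-S model can be made concrete with a toy sketch. Everything below (the class name, the attention flag, the use of a fixed-size queue for short-term memory) is our own simplification for illustration, not part of the model itself:

```python
from collections import deque

class AtkinsonShiffrinSketch:
    """Toy three-store pipeline: sensory input -> short-term -> long-term."""

    def __init__(self, stm_capacity=7):
        # A bounded queue loosely mirrors short-term memory's limited
        # capacity (Miller's "seven, plus or minus two").
        self.short_term = deque(maxlen=stm_capacity)
        self.long_term = set()

    def sense(self, stimulus, attended):
        # Sensory memory holds input only briefly; stimuli we do not
        # attend to are discarded before reaching short-term memory.
        if attended:
            self.short_term.append(stimulus)

    def rehearse(self, stimulus):
        # Rehearsal consolidates an item from short-term storage into
        # (effectively permanent) long-term memory.
        if stimulus in self.short_term:
            self.long_term.add(stimulus)
```

In this sketch, unattended stimuli never reach short-term memory, older items are displaced once the queue is full, and only rehearsed items survive into long-term storage, which parallels the flow the model describes.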
But A-S is just one model of memory. Others, such as Baddeley and Hitch (1974), have
proposed a model where short-term memory itself has different forms. In this model, storing
memories in short-term memory is like opening different files on a computer and adding
information. The type of short-term memory (or computer file) depends on the type of
information received. There are memories in visual spatial form, as well as memories of
spoken or written material, and they are stored in three short-term systems: a visuospatial
sketchpad, an episodic buffer, and a phonological loop. According to Baddeley and Hitch, a
central executive part of memory supervises or controls the flow of information to and from
the three short-term systems.
Sensory Memory
In the Atkinson-Shiffrin model, stimuli from the environment are processed first in sensory
memory: storage of brief sensory events, such as sights, sounds, and tastes. It is very brief
storage—up to a couple of seconds. We are constantly bombarded with sensory information.
We cannot absorb all of it, or even most of it. And most of it has no impact on our lives. For
example, what was your professor wearing the last class period? As long as the professor
was dressed appropriately, it does not really matter what she was wearing. Sensory
information about sights, sounds, smells, and even textures, which we do not view as
valuable information, we discard. If we view something as valuable, the information will
move into our short-term memory system.
One study related to sensory memory examined how the value we place on information affects short-term memory storage. J. R. Stroop discovered a memory phenomenon in the 1930s: you will name a color more easily if it appears printed in that color; this is called the Stroop effect. In other words, you can read the word “red” more quickly, regardless of the ink it appears in, than you can name the ink color of another word printed in red.
words you are given in Figure 2. Do not read the words, but say the color the word is
printed in. For example, upon seeing the word “yellow” in green print, you should say,
“Green,” not “Yellow.” This experiment is fun, but it’s not as easy as it seems.
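You can also generate Stroop-style trials yourself with a short script. This sketch assumes an ANSI-color-capable terminal, and the four colors and their escape codes are our own choices:

```python
import random

# Basic ANSI foreground color codes (terminal-dependent).
ANSI = {"red": "31", "green": "32", "yellow": "33", "blue": "34"}

def make_trial(congruent, rng=random):
    """Return (word, ink). Congruent: the word names its own ink color;
    incongruent: the word names a different color, producing the
    interference that makes the naming task harder."""
    word = rng.choice(sorted(ANSI))
    ink = word if congruent else rng.choice([c for c in sorted(ANSI) if c != word])
    return word, ink

def render(word, ink):
    """Color the word with ANSI escapes; the task is to name the ink."""
    return f"\033[{ANSI[ink]}m{word}\033[0m"
```

Printing a run of incongruent trials and timing yourself naming the ink colors aloud gives a rough home version of the classic demonstration.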
Short-term Memory
Short-term memory (STM) is a temporary storage system that processes incoming sensory
memory; sometimes it is called working memory. Short-term memory takes information
from sensory memory and sometimes connects that memory to something already in long-
term memory. Short-term memory storage lasts about 20 seconds. George Miller (1956), in
his research on the capacity of memory, found that most people can retain about 7 items in
STM. Some remember 5, some 9, so he called the capacity of STM 7 plus or minus 2.
Think of short-term memory as the information you have displayed on your computer
screen—a document, a spreadsheet, or a web page. Then, information in short-term memory
goes to long-term memory (you save it to your hard drive), or it is discarded (you delete a
document or close a web browser). This process of moving information from STM into long-term memory through rehearsal, the conscious repetition of information to be remembered, is called memory consolidation.
You may find yourself asking, “How much information can our memory handle at once?” To
explore the capacity and duration of your short-term memory, have a partner read the
strings of random numbers (Figure 3) out loud to you, beginning each string by saying,
“Ready?” and ending each by saying, “Recall,” at which point you should try to write down
the string of numbers from memory.
    Figure 3. Work through this series of numbers using the recall exercise explained
    above to determine the longest string of digits that you can store. Image available
    in original OpenStax chapter.
Note the longest string at which you got the series correct. For most people, this will be
close to 7, Miller’s famous 7 plus or minus 2. Recall is somewhat better for random numbers
than for random letters (Jacobs, 1887), and also often slightly better for information we hear
(acoustic encoding) rather than see (visual encoding) (Anderson, 1969).
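If you have no partner handy, the digit-span exercise can be sketched in code; the helper functions below are a hypothetical sketch for generating trial strings and scoring your results:

```python
import random

def digit_string(length, rng=random):
    """A random string of digits for one trial, e.g. '4819305'."""
    return "".join(str(rng.randint(0, 9)) for _ in range(length))

def digit_span(results):
    """Given {string_length: recalled_correctly} from a run of trials,
    return the longest length recalled correctly. For most people this
    lands near Miller's seven, plus or minus two."""
    correct = [length for length, ok in results.items() if ok]
    return max(correct, default=0)
```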
Long-term Memory
Long-term memory is divided into two types: explicit and implicit (Figure 4). Understanding
the different types is important because a person’s age or particular types of brain trauma
or disorders can leave certain types of LTM intact while having disastrous consequences for
other types. Explicit memories are those we consciously try to remember and recall. For
example, if you are studying for your chemistry exam, the material you are learning will be
part of your explicit memory. (Note: Sometimes, but not always, the terms explicit memory
and declarative memory are used interchangeably.)
Implicit memories are memories that are not part of our consciousness. They are memories
formed from behaviors. Implicit memory is also called non-declarative memory.
Declarative memory has to do with the storage of facts and events we personally
experienced. Explicit (declarative) memory has two parts: semantic memory and episodic
memory. Semantic means having to do with language and knowledge about language. An
example would be the question “What does argumentative mean?” Stored in our semantic
memory is knowledge about words, concepts, and language-based knowledge and facts. For
example, answers to the following questions are stored in your semantic memory:
Episodic memory is information about events we have personally experienced. The concept
of episodic memory was first proposed about 40 years ago (Tulving, 1972). Since then,
Tulving and others have looked at scientific evidence and reformulated the theory.
Currently, scientists believe that episodic memory is memory about happenings in particular
places at particular times, the what, where, and when of an event (Tulving, 2002). It
involves recollection of visual imagery as well as the feeling of familiarity (Hassabis &
Maguire, 2007).
Retrieval
So you have worked hard to encode (via effortful processing) and store some important
information for your upcoming final exam. How do you get that information back out of
storage when you need it? The act of getting information out of memory storage and back
into conscious awareness is known as retrieval. This would be similar to finding and opening
a paper you had previously saved on your computer’s hard drive. Now it’s back on your
desktop, and you can work with it again. Our ability to retrieve information from long-term
memory is vital to our everyday functioning. You must be able to retrieve information from
memory in order to do everything from knowing how to brush your hair and teeth, to driving
to work, to knowing how to perform your job once you get there.
There are three ways you can retrieve information out of your long-term memory storage
system: recall, recognition, and relearning. Recall is what we most often think about when
we talk about memory retrieval: it means you can access information without cues. For
example, you would use recall for an essay test. Recognition happens when you identify
information that you have previously learned after encountering it again. It involves a
process of comparison. When you take a multiple-choice test, you are relying on recognition
to help you choose the correct answer. Here is another example. Let’s say you graduated
from high school 10 years ago, and you have returned to your hometown for your 10-year
reunion. You may not be able to recall all of your classmates, but you recognize many of
them based on their yearbook photos.
The third form of retrieval is relearning, and it’s just what it sounds like. It involves learning
information that you previously learned. Whitney took Spanish in high school, but after high
school she did not have the opportunity to speak Spanish. Whitney is now 31, and her
company has offered her an opportunity to work in their Mexico City office. In order to
prepare herself, she enrolls in a Spanish course at the local community center. She’s
surprised at how quickly she’s able to pick up the language after not speaking it for 13
years; this is an example of relearning.
Summary
Memory is a system or process that stores what we learn for future use.
Our memory has three basic functions: encoding, storing, and retrieving information.
Encoding is the act of getting information into our memory system through automatic or
effortful processing. Storage is retention of the information, and retrieval is the act of
getting information out of storage and into conscious awareness through recall, recognition,
and relearning. The idea that information is processed through three memory systems is
called the Atkinson-Shiffrin (A-S) model of memory. First, environmental stimuli enter our
sensory memory for a period of less than a second to a few seconds. Those stimuli that we
notice and pay attention to then move into short-term memory (also called working
memory). According to the A-S model, if we rehearse this information, then it moves into
long-term memory for permanent storage. Other models like that of Baddeley and Hitch
suggest there is more of a feedback loop between short-term memory and long-term
memory. Long-term memory has a practically limitless storage capacity and is divided into
implicit and explicit memory. Finally, retrieval is the act of getting memories out of storage
and back into conscious awareness. This is done through recall, recognition, and relearning.
Suggested Citation
    Spielman, R., Dumper, K., Jenkins, W., Lacombe, A., Lovett, M., & Perlmutter, M.
    (2018). Memory. In R. E. West, Foundations of Learning and Instructional Design
    Technology: The Past, Present, and Future of Learning and Instructional Design
    Technology. EdTech Books. Retrieved from
    https://edtechbooks.org/lidtfoundations/memory
                                            10
                                 Intelligence
                                What is Intelligence?
Editor’s Note
    Spielman, R. M., Dumper, K., Jenkins, W., Lacombe, A., Lovett, M., & Perlmutter,
    M. (n.d.). What are intelligence and creativity? In Psychology. Retrieved from
    https://edtechbooks.org/-Is
The way that researchers have defined the concept of intelligence has been modified many
times since the birth of psychology. British psychologist Charles Spearman believed
intelligence consisted of one general factor, called g, which could be measured and
compared among individuals. Spearman focused on the commonalities among various
intellectual abilities and de-emphasized what made each unique. Long before modern
psychology developed, however, ancient philosophers, such as Aristotle, held a similar view
(Cianciolo & Sternberg, 2004).
Navigating your way home after being detoured onto an unfamiliar route because of road
construction would draw upon your fluid intelligence. Fluid intelligence helps you tackle
complex, abstract challenges in your daily life, whereas crystallized intelligence helps you
overcome concrete, straightforward problems (Cattell, 1963).
Other theorists and psychologists believe that intelligence should be defined in more
practical terms. For example, what types of behaviors help you get ahead in life? Which
skills promote success? Think about this for a moment. Being able to recite all 44 presidents
of the United States in order is an excellent party trick, but will knowing this make you a
better person?
Robert Sternberg developed another theory of intelligence, which he titled the triarchic
theory of intelligence because it sees intelligence as comprised of three parts (Sternberg,
1988): practical, creative, and analytical intelligence (Figure 1).
This story about the 2007 Virginia Tech shootings illustrates both high and low practical
intelligences. During the incident, one student left her class to go get a soda in an adjacent
building. She planned to return to class, but when she returned to her building after getting
her soda, she saw that the door she used to leave was now chained shut from the inside.
Instead of thinking about why there was a chain around the door handles, she went to her
class’s window and crawled back into the room. She thus potentially exposed herself to the
gunman. Thankfully, she was not shot. On the other hand, a pair of students was walking on
campus when they heard gunshots nearby. One friend said, “Let’s go check it out and see
what is going on.” The other student said, “No way, we need to run away from the
gunshots.” They did just that. As a result, both avoided harm. The student who crawled
through the window demonstrated some creative intelligence but did not use common
sense. She would have low practical intelligence. The student who encouraged his friend to
run away from the sound of gunshots would have much higher practical intelligence.
Analytical intelligence is closely aligned with academic problem solving and computations.
Sternberg says that analytical intelligence is demonstrated by an ability to analyze,
evaluate, judge, compare, and contrast. When reading a classic novel for literature class, for
example, it is usually necessary to compare the motives of the main characters of the book
or analyze the historical context of the story. In a science course such as anatomy, you must
study the processes by which the body uses various minerals in different human systems. In
developing an understanding of this topic, you are using analytical intelligence. When
solving a challenging math problem, you would apply analytical intelligence to analyze
different aspects of the problem and then solve it section by section.
Gardner’s theory is relatively new and needs additional research to better establish
empirical support. At the same time, his ideas challenge the traditional idea of intelligence
to include a wider variety of abilities, although it has been suggested that Gardner simply
relabeled what other theorists called “cognitive styles” as “intelligences” (Morgan, 1996).
Furthermore, developing traditional measures of Gardner’s intelligences is extremely
difficult (Furnham, 2009; Gardner & Moran, 2006; Klein, 1997).
Gardner’s inter- and intrapersonal intelligences are often combined into a single type:
emotional intelligence. Emotional intelligence encompasses the ability to understand the
emotions of yourself and others, show empathy, understand social relationships and cues,
and regulate your own emotions and respond in culturally appropriate ways (Parker,
Saklofske, & Stough, 2009). People with high emotional intelligence typically have well-
developed social skills. Some researchers, including Daniel Goleman, the author of
Emotional Intelligence: Why It Can Matter More than IQ, argue that emotional intelligence
is a better predictor of success than traditional intelligence (Goleman, 1995). However,
emotional intelligence has been widely debated, with researchers pointing out
inconsistencies in how it is defined and described, as well as questioning results of studies
on a subject that is difficult to measure and study empirically (Locke, 2005; Mayer, Salovey,
& Caruso, 2004).
Intelligence can also have different meanings and values in different cultures. If you live on
a small island, where most people get their food by fishing from boats, it would be important
to know how to fish and how to repair a boat. If you were an exceptional angler, your peers
would probably consider you intelligent. If you were also skilled at repairing boats, your
intelligence might be known across the whole island. Think about your own family’s culture.
What values are important for Latino families? Italian families? In Irish families, hospitality
and telling an entertaining story are marks of the culture. If you are a skilled storyteller,
other members of Irish culture are likely to consider you intelligent.
Some cultures place a high value on working together as a collective. In these cultures, the
importance of the group supersedes the importance of individual achievement. When you
visit such a culture, how well you relate to the values of that culture exemplifies your
cultural intelligence, sometimes referred to as cultural competence.
Suggested Citation
Spielman, R., Dumper, K., Jenkins, W., Lacombe, A., Lovett, M., & Perlmutter, M.
(2018). Intelligence: What is Intelligence? In R. E. West, Foundations of Learning
and Instructional Design Technology: The Past, Present, and Future of Learning
and Instructional Design Technology. EdTech Books. Retrieved from
https://edtechbooks.org/lidtfoundations/intelligence
                                               11
                   Behaviorism, Cognitivism, Constructivism
Editor's Note
    This article was originally published in 1993 and then republished in 2013 by
    Performance Improvement Quarterly. © 2013 International Society for
    Performance Improvement. Published online in Wiley Online Library
    (wileyonlinelibrary.com). DOI: 10.1002/piq.21143. The original citation is below:
The need for a bridge between basic learning research and educational practice has long
been discussed. To ensure a strong connection between these two areas, Dewey (cited in
Reigeluth, 1983) called for the creation and development of a “linking science”; Tyler (1978)
a “middleman position”; and Lynch (1945) for employing an “engineering analogy” as an aid
for translating theory into practice. In each case, the respective author highlighted the
information and potential contributions of available learning theories, the pressing problems
faced by those dealing with practical learning issues, and a general lack of using the former
to facilitate solutions for the latter. The value of such a bridging function would be its ability
to translate relevant aspects of the learning theories into optimal instructional actions. As
described by Reigeluth (1983, p. 5), the field of Instructional Design performs this role.
Instructional designers have been charged with “translating principles of learning and
instruction into specifications for instructional materials and activities” (Smith & Ragan,
1993, p. 12). To achieve this goal, two sets of skills and knowledge are needed. First, the
designer must understand the position of the practitioner. In this regard, the following
questions would be relevant: What are the situational and contextual constraints of the
application? What is the degree of individual differences among the learners? What form of
solutions will or will not be accepted by the learners as well as by those actually teaching
the materials? The designer must have the ability to diagnose and analyze practical learning
problems. Just as a doctor cannot prescribe an effective remedy without a proper diagnosis,
the instructional designer cannot properly recommend an effective prescriptive solution
without an accurate analysis of the instructional problem.
In addition to understanding and analyzing the problem, a second core of knowledge and
skills is needed to “bridge” or “link” application with research–that of understanding the
potential sources of solutions (i.e., the theories of human learning). Through this
understanding, a proper prescriptive solution can be matched with a given diagnosed
problem. The critical link, therefore, is not between the design of instruction and an
autonomous body of knowledge about instructional phenomena, but between instructional
design issues and the theories of human learning.
Why this emphasis on learning theory and research? First, learning theories are a source of
verified instructional strategies, tactics, and techniques. Knowledge of a variety of such
strategies is critical when attempting to select an effective prescription for overcoming a
given instructional problem. Second, learning theories provide the foundation for intelligent
and reasoned strategy selection. Designers must have an adequate repertoire of strategies
available, and possess the knowledge of when and why to employ each. This knowledge
depends on the designer’s ability to match the demands of the task with an instructional
strategy that helps the learner. Third, integration of the selected strategy within the
instructional context is of critical importance. Learning theories and research often provide
information about relationships among instructional components and the design of
instruction, indicating how specific techniques/strategies might best fit within a given
context and with specific learners (Keller, 1979). Finally, the ultimate role of a theory is to
allow for reliable prediction (Richey, 1986). Effective solutions to practical instructional
problems are often constrained by limited time and resources. It is paramount that those
strategies selected and implemented have the highest chance for success. As suggested by
Warries (1990), a selection based on strong research is much more reliable than one based
on “instructional phenomena.”
The task of translating learning theory into practical applications would be greatly
simplified if the learning process were relatively simple and straightforward. Unfortunately,
this is not the case. Learning is a complex process that has generated numerous
interpretations and theories of how it is effectively accomplished. Of these many theories,
which should receive the attention of the instructional designer? Is it better to choose one
theory when designing instruction or to draw ideas from different theories? This article
presents three distinct perspectives of the learning process (behavioral, cognitive, and
constructivist) and although each has many unique features, it is our belief that each still
describes the same phenomena (learning). In selecting the theory whose associated
instructional strategies offers the optimal means for achieving desired outcomes, the degree
of cognitive processing required of the learner by the specific task appears to be a critical
factor. Therefore, as emphasized by Snelbecker (1983), individuals addressing practical
learning problems cannot afford the “luxury of restricting themselves to only one theoretical
position… [They] are urged to examine each of the basic science theories which have been
developed by psychologists in the study of learning and to select those principles and
conceptions which seem to be of value for one’s particular educational situation” (p. 8).
It is expected that after reading this article, instructional designers and educational
practitioners should be better informed “consumers” of the strategies suggested by each
viewpoint. The concise information presented here can serve as an initial base of knowledge
for making important decisions regarding instructional objectives and strategies.
Learning Defined
Learning has been defined in numerous ways by many different theorists, researchers and
educational practitioners. Although universal agreement on any single definition is
nonexistent, many definitions employ common elements. The following definition by Shuell
(as interpreted by Schunk, 1991) incorporates these main ideas: “Learning is an enduring
change in behavior, or in the capacity to behave in a given fashion, which results from
practice or other forms of experience” (p. 2).
Undoubtedly, some learning theorists will disagree on the definition of learning presented
here. However, it is not the definition itself that separates a given theory from the rest. The
major differences among theories lie more in interpretation than they do in definition. These
differences revolve around a number of key issues that ultimately delineate the instructional
prescriptions that flow from each theoretical perspective. Schunk (1991) lists five definitive
questions that serve to distinguish each learning theory from the others:
Expanding on this original list, we have included two additional questions important to the
instructional designer:
In this article, each of these questions is answered from three distinct viewpoints:
behaviorism, cognitivism, and constructivism. Although learning theories typically are
divided into two categories–behavioral and cognitive–a third category, constructive, is added
here because of its recent emphasis in the instructional design literature (e.g., Bednar,
Cunningham, Duffy, & Perry, 1991; Duffy & Jonassen, 1991; Jonassen, 1991b; Winn, 1991).
In many ways these viewpoints overlap; yet they are distinctive enough to be treated as
separate approaches to understanding and describing learning. These three particular
positions were chosen because of their importance, both historically and currently, to the
field of instructional design. It is hoped that the answers to the first five questions will
provide the reader with a basic understanding of how these viewpoints differ. The answers
to the last two questions will translate these differences into practical suggestions and
recommendations for the application of these principles in the design of instruction.
These seven questions provide the basis for the article’s structure. For each of the three
theoretical positions, the questions are addressed and an example is given to illustrate the
application of that perspective. It is expected that this approach will enable the reader to
compare and contrast the different viewpoints on each of the seven issues.
As is common in any attempt to compare and contrast similar products, processes, or ideas,
differences are emphasized in order to make distinctions clear. This is not to suggest that
there are no similarities among these viewpoints or that there are no overlapping features.
In fact, different learning theories will often prescribe the same instructional methods for
the same situations (only with different terminology and possibly with different intentions).
This article outlines the major differences between the three positions in an attempt to
facilitate comparison. It is our hope that the reader will gain greater insight into what each
viewpoint offers in terms of the design and presentation of materials, as well as the types of
learning activities that might be prescribed.
Historical Foundations
Current learning theories have roots that extend far into the past. The problems with which
today’s theorists and researchers grapple are not new but simply variations on
a timeless theme: Where does knowledge come from and how do people come to know? Two
opposing positions on the origins of knowledge, empiricism and rationalism, have existed for
centuries and are still evident, to varying degrees, in the learning theories of today. A brief
description of these views is included here as a background for comparing the “modern”
learning viewpoints of behaviorism, cognitivism, and constructivism.
Empiricism is the view that experience is the primary source of knowledge (Schunk, 1991).
That is, organisms are born with basically no knowledge and anything learned is gained
through interactions and associations with the environment. Beginning with Aristotle
(384-322 B.C.), empiricists have espoused the view that knowledge is derived from sensory
impressions. Those impressions, when associated contiguously in time and/or space, can be
hooked together to form complex ideas. For example, the complex idea of a tree, as
illustrated by Hulse, Egeth, and Deese (1980), can be built from the less complex ideas of
branches and leaves, which in turn are built from the ideas of wood and fiber, which are
built from basic sensations such as greenness, woody odor, and so forth. From this
perspective, critical instructional design issues focus on how to manipulate the environment
in order to improve and ensure the occurrence of proper associations.
Rationalism is the view that knowledge derives from reason without the aid of the senses
(Schunk, 1991). This fundamental belief in the distinction between mind and matter
originated with Plato (c. 427-347 B.C.), and is reflected in the viewpoint that humans learn
by recalling or “discovering” what already exists in the mind. For example, the direct
experience with a tree during one’s lifetime simply serves to reveal that which is already in
the mind. The “real” nature of the tree (greenness, woodiness, and other characteristics)
becomes known, not through the experience, but through a reflection on one’s idea about
the given instance of a tree. Although later rationalists differed on some of Plato’s other
ideas, the central belief remained the same: that knowledge arises through the mind. From
this perspective, instructional design issues focus on how best to structure new information
in order to facilitate (1) the learners’ encoding of this new information, as well as (2) the
recalling of that which is already known.
The empiricist, or associationist, mindset provided the framework for many learning
theories during the first half of the twentieth century, and it was against this background that
behaviorism became the leading psychological viewpoint (Schunk, 1991). Because
behaviorism was dominant when instructional theory was initiated (around 1950), the
instructional design (ID) technology that arose alongside it was naturally influenced by
many of its basic assumptions and characteristics. Since ID has its roots in behavioral
theory, it seems appropriate that we turn our attention to behaviorism first.
Behaviorism
Behaviorism equates learning with changes in either the form or frequency of observable
performance. Learning is accomplished when a proper response is demonstrated following
the presentation of a specific environmental stimulus. For example, when presented with a
math flashcard showing the equation “2 + 4 = ?” the learner replies with the answer of “6.”
The equation is the stimulus and the proper answer is the associated response. The key
elements are the stimulus, the response, and the association between the two. Of primary
concern is how the association between the stimulus and response is made, strengthened,
and maintained.
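The stimulus-response mechanism described above can be caricatured in a few lines of code. This is purely an illustrative sketch, not anything drawn from the chapter or the behaviorist literature: the `StimulusResponseLearner` class and its numeric "strengths" are invented here for illustration. Associations are modeled as numbers, reinforcement simply increments the strength of a stimulus-response bond, and the "learned" response to a stimulus is whichever association is strongest.

```python
class StimulusResponseLearner:
    """Illustrative caricature of behaviorist learning as S-R association."""

    def __init__(self):
        # association strengths: (stimulus, response) -> strength
        self.strengths = {}

    def reinforce(self, stimulus, response, amount=1.0):
        """Strengthen the stimulus-response bond (reinforcement)."""
        key = (stimulus, response)
        self.strengths[key] = self.strengths.get(key, 0.0) + amount

    def respond(self, stimulus):
        """Return the most strongly associated response, if any."""
        candidates = {r: s for (st, r), s in self.strengths.items()
                      if st == stimulus}
        if not candidates:
            return None
        return max(candidates, key=candidates.get)


learner = StimulusResponseLearner()
for _ in range(3):                                   # repeated reinforced practice
    learner.reinforce("2 + 4 = ?", "6")
learner.reinforce("2 + 4 = ?", "8", amount=0.5)      # a weakly reinforced error

print(learner.respond("2 + 4 = ?"))                  # prints "6"
```

The sketch mirrors the flashcard example: after repeated reinforcement, the equation (stimulus) reliably elicits "6" (response) because that association has been strengthened more than any competitor.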
Although both learner and environmental factors are considered important by behaviorists,
environmental conditions receive the greatest emphasis. Behaviorists assess the learners to
determine at what point to begin instruction as well as to determine which reinforcers are
most effective for a particular student. The most critical factor, however, is the arrangement
of stimuli and consequences within the environment.
Transfer refers to the application of learned knowledge in new ways or situations, as well as
to how prior learning affects new learning. In behavioral learning theories, transfer is a
result of generalization. Situations involving identical or similar features allow behaviors to
transfer across common elements. For example, the student who has learned to recognize
and classify elm trees demonstrates transfer when (s)he classifies maple trees using the
same process. The similarities between the elm and maple trees allow the learner to apply
the previous elm tree classification learning experience to the maple tree classification task.
Behaviorists attempt to prescribe strategies that are most useful for building and
strengthening stimulus-response associations (Winn, 1990), including the use of
instructional cues, practice, and reinforcement. These prescriptions have generally been
proven reliable and effective in facilitating learning that involves discriminations (recalling
facts), generalizations (defining and illustrating concepts), associations (applying
explanations), and chaining (automatically performing a specified procedure). However, it is
generally agreed that behavioral principles cannot adequately explain the acquisition of
higher level skills or those that require a greater depth of processing (e.g., language
development, problem solving, inference generating, critical thinking) (Schunk, 1991).
Many of the basic assumptions and characteristics of behaviorism are embedded in current
instructional design practices. Behaviorism was used as the basis for designing many of the
early audio-visual materials and gave rise to many related teaching strategies, such as
Skinner’s teaching machines and programmed texts. More recent examples include
principles utilized within computer-assisted instruction (CAI) and mastery learning.
Specific assumptions or principles that have direct relevance to instructional design include
the following (possible current ID applications are listed in italics and brackets following the
listed principle):
   1. An emphasis on producing observable and measurable outcomes in students
      [behavioral objectives, task analysis, criterion-referenced assessment]
   2. Pre-assessment of students to determine where instruction should begin [learner
      analysis]
   3. Emphasis on mastering early steps before progressing to more complex levels of
      performance [sequencing of instructional presentation, mastery learning]
   4. Use of reinforcement to impact performance [tangible rewards, informative
      feedback]
   5. Use of cues, shaping, and practice to ensure a strong stimulus-response
      association [simple to complex sequencing of practice, use of prompts]
The goal of instruction for the behaviorist is to elicit the desired response from the learner
who is presented with a target stimulus. To accomplish this, the learner must know how to
execute the proper response, as well as the conditions under which that response should be
made. Therefore, instruction is structured around the presentation of the target stimulus
and the provision of opportunities for the learner to practice making the proper response.
To facilitate the linking of stimulus-response pairs, instruction frequently uses cues (to
initially prompt the delivery of the response) and reinforcement (to strengthen correct responses).
Behavioral theories imply that the job of the teacher/designer is to (1) determine which cues
can elicit the desired responses; (2) arrange practice situations in which prompts are paired
with the target stimuli that initially have no eliciting power but which will be expected to
elicit the responses in the “natural” (performance) setting; and (3) arrange environmental
conditions so that students can make the correct responses in the presence of those target
stimuli and receive reinforcement for those responses (Gropper, 1987).
Cognitivism
In the late 1950s, learning theory began to make a shift away from the use of behavioral
models to an approach that relied on learning theories and models from the cognitive
sciences. Psychologists and educators began to de-emphasize a concern with overt,
observable behavior and stressed instead more complex cognitive processes such as
thinking, problem solving, language, concept formation and information processing
(Snelbecker, 1983). Within the past decade, a number of authors in the field of instructional
design have openly and consciously rejected many of ID’s traditional behavioristic
assumptions in favor of a new set of psychological assumptions about learning drawn from
the cognitive sciences. Whether viewed as an open revolution or simply a gradual
evolutionary process, there seems to be the general acknowledgment that cognitive theory
has moved to the forefront of current learning theories (Bednar et al., 1991). This shift from
a behavioral orientation (where the emphasis is on promoting a student’s overt performance
by the manipulation of stimulus material) to a cognitive orientation (where the emphasis is
on promoting mental processing) has created a similar shift from procedures for
manipulating the materials to be presented by an instructional system to procedures for
directing student processing and interaction with the instructional design system (Merrill,
Kowalis, & Wilson, 1981).
Cognitive theories stress the acquisition of knowledge and internal mental structures and,
as such, are closer to the rationalist end of the epistemology continuum (Bower & Hilgard,
1981). Learning is equated with discrete changes between states of knowledge rather than
with changes in the probability of response. Cognitive theories focus on the
conceptualization of students’ learning processes and address the issues of how information
is received, organized, stored, and retrieved by the mind. Learning is concerned not so
much with what learners do but with what they know and how they come to acquire it
(Jonassen, 1991b). Knowledge acquisition is described as a mental activity that entails
internal coding and structuring by the learner. The learner is viewed as a very active
participant in the learning process.
Cognitivism, like behaviorism, emphasizes the role that environmental conditions play in
facilitating learning. Instructional explanations, demonstrations, illustrative examples and
matched non-examples are all considered to be instrumental in guiding student learning.
Similarly, emphasis is placed on the role of practice with corrective feedback. Up to this
point, little difference can be detected between these two theories. However, the “active”
nature of the learner is perceived quite differently. The cognitive approach focuses on the
mental activities of the learner that lead up to a response and acknowledges the processes
of mental planning, goal-setting, and organizational strategies (Shuell, 1986). Cognitive
theories contend that environmental “cues” and instructional components alone cannot
account for all the learning that results from an instructional situation. Additional key
elements include the way that learners attend to, code, transform, rehearse, store and
retrieve information. Learners’ thoughts, beliefs, attitudes, and values are also considered
to be influential in the learning process (Winne, 1985). The real focus of the cognitive
approach is on changing the learner by encouraging him/her to use appropriate learning
strategies.
As indicated above, memory is given a prominent role in the learning process. Learning
results when information is stored in memory in an organized, meaningful manner.
Teachers/designers are responsible for assisting learners in organizing that information in
some optimal way. Designers use techniques such as advance organizers, analogies,
hierarchical relationships, and matrices to help learners relate new information to prior
knowledge. Forgetting is the inability to retrieve information from memory because of
interference, memory loss, or missing or inadequate cues needed to access information.
Prior knowledge is used to establish boundary constraints for identifying the
similarities and differences of novel information. Not only must the knowledge itself be
stored in memory but the uses of that knowledge as well. Specific instructional or real-world
events will trigger particular responses, but the learner must believe that the knowledge is
useful in a given situation before he will activate it.
Because of the emphasis on mental structures, cognitive theories are usually considered
more appropriate for explaining complex forms of learning (reasoning, problem-solving,
information-processing) than are those of a more behavioral perspective (Schunk, 1991).
However, it is important to indicate at this point that the actual goal of instruction for both
of these viewpoints is often the same: to communicate or transfer knowledge to the students
in the most efficient, effective manner possible (Bednar et al., 1991). Two techniques used
by both camps in achieving this effectiveness and efficiency of knowledge transfer are
simplification and standardization. That is, knowledge can be analyzed, decomposed, and
simplified into basic building blocks. Knowledge transfer is expedited if irrelevant
information is eliminated. For example, trainees attending a workshop on effective
management skills would be presented with information that is “sized” and “chunked” in
such a way that they can assimilate and/or accommodate the new information as quickly and
as easily as possible. Behaviorists would focus on the design of the environment to optimize
that transfer, while cognitivists would stress efficient processing strategies.
Many of the instructional strategies advocated and utilized by cognitivists are also
emphasized by behaviorists, yet usually for different reasons. An obvious commonality is the
use of feedback. A behaviorist uses feedback (reinforcement) to modify behavior in the
desired direction, while cognitivists make use of feedback (knowledge of results) to guide
and support accurate mental connections (Thompson, Simonson, & Hargrave, 1992).
Learner and task analyses are also critical to both cognitivists and behaviorists, but once
again, for different reasons. Cognitivists look at the learner to determine his/her
predisposition to learning (i.e., How does the learner activate, maintain, and direct his/her
learning?) (Thompson et al., 1992). Additionally, cognitivists examine the learner to
determine how to design instruction so that it can be readily assimilated (i.e., What are the
learner’s existing mental structures?). In contrast, the behaviorists look at learners to
determine where the lesson should begin (i.e., At what level are they currently performing
successfully?) and which reinforcers should be most effective (i.e., What consequences are
most desired by the learner?).
Specific assumptions or principles that have direct relevance to instructional design include
the following (possible current ID applications are listed in italics and brackets following the
listed principle):
   1. Emphasis on the active involvement of the learner in the learning process [learner
      control, metacognitive training (e.g., self-planning, monitoring, and revising
      techniques)]
   2. Use of hierarchical analyses to identify and illustrate prerequisite relationships
      [cognitive task analysis procedures]
   3. Emphasis on structuring, organizing, and sequencing information to facilitate optimal
      processing [use of cognitive strategies such as outlining, summaries, synthesizers,
      advance organizers, etc.]
   4. Creation of learning environments that allow and encourage students to make
      connections with previously learned material [recall of prerequisite skills; use of
      relevant examples, analogies]
Behavioral theories imply that teachers ought to arrange environmental conditions so that
students respond properly to presented stimuli. Cognitive theories emphasize making
knowledge meaningful and helping learners organize and relate new information to existing
knowledge in memory. Instruction must be based on a student’s existing mental structures,
or schema, to be effective. It should organize information in such a manner that learners are
able to connect new information with existing knowledge in some meaningful way.
Analogies and metaphors are examples of this type of cognitive strategy. For example,
instructional design textbooks frequently draw an analogy between the familiar architect’s
profession and the unfamiliar instructional design profession to help the novice learner
conceptualize, organize and retain the major duties and functions of an instructional
designer (e.g., Reigeluth, 1983, p. 7). Other cognitive strategies may include the use of
framing, outlining, mnemonics, concept mapping, advance organizers and so forth (West,
Farmer, & Wolff, 1991).
Such cognitive emphases imply that major tasks of the teacher/designer include (1)
understanding that individuals bring various learning experiences to the learning situation
which can impact learning outcomes; (2) determining the most effective manner in which to
organize and structure new information to tap the learners’ previously acquired knowledge,
abilities, and experiences; and (3) arranging practice with feedback so that the new
information is effectively and efficiently assimilated and/or accommodated within the
learner’s cognitive structure (Stepich & Newby, 1988).
For example, the unfamiliar concept of a cost-benefit analysis might be related to the familiar
process by which an individual allocates a monthly paycheck, how (s)he makes a buy/no-buy
decision regarding the purchase of a luxury item, or even how one’s weekend spending
activities might be determined and prioritized. The procedures for such activities may not
exactly match those of the cost-benefit analysis, but the similarity between the activities
allows for the unfamiliar information to be put within a familiar context. Thus processing
requirements are reduced and the potential effectiveness of recall cues is increased.
Constructivism
The philosophical assumptions underlying both the behavioral and cognitive theories are
primarily objectivistic; that is, the world is real and external to the learner. The goal of
instruction is to map the structure of the world onto the learner (Jonassen, 1991b). A
number of contemporary cognitive theorists have begun to question this basic objectivistic
assumption and are starting to adopt a more constructivist approach to learning and
understanding: knowledge “is a function of how the individual creates meaning from his or
her own experiences” (p. 10). Constructivism is not a totally new approach to learning. Like
most other learning theories, constructivism has multiple roots in the philosophical and
psychological viewpoints of the twentieth century, specifically in the works of Piaget, Bruner, and
Goodman (Perkins, 1991). In recent years, however, constructivism has become a “hot”
issue as it has begun to receive increased attention in a number of different disciplines,
including instructional design (Bednar et al., 1991).
Constructivism is a theory that equates learning with creating meaning from experience
(Bednar et al., 1991). Even though constructivism is considered to be a branch of
cognitivism (both conceive of learning as a mental activity), it distinguishes itself from
traditional cognitive theories in a number of ways. Most cognitive psychologists think of the
mind as a reference tool to the real world; constructivists believe that the mind filters input
from the world to produce its own unique reality (Jonassen, 1991a). Like the
rationalists of Plato’s time, the mind is believed to be the source of all meaning, yet like the
empiricists, individual, direct experiences with the environment are considered critical.
Constructivism crosses both categories by emphasizing the interaction between these two
variables.
Constructivists do not share with cognitivists and behaviorists the belief that knowledge is
mind-independent and can be “mapped” onto a learner. Constructivists do not deny the
existence of the real world but contend that what we know of the world stems from our own
interpretations of our experiences. Humans create meaning as opposed to acquiring it.
Since there are many possible meanings to glean from any experience, we cannot achieve a
predetermined, “correct” meaning. Learners do not transfer knowledge from the external
world into their memories; rather they build personal interpretations of the world based on
individual experiences and interactions. Thus, the internal representation of knowledge is
constantly open to change; there is not an objective reality that learners strive to know.
Both learner and environmental factors are critical to the constructivist, as it is the specific
interaction between these two variables that creates knowledge. Constructivists argue that
behavior is situationally determined (Jonassen, 1991a). Just as the learning of new
vocabulary words is enhanced by exposure and subsequent interaction with those words in
context (as opposed to learning their meanings from a dictionary), likewise it is essential
that content knowledge be embedded in the situation in which it is used. Brown, Collins, and
Duguid (1989) suggest that situations actually co-produce knowledge (along with cognition)
through activity. Every action is viewed as “an interpretation of the current situation based
on an entire history of previous interactions” (Clancey, 1986). Just as shades of meanings of
given words are constantly changing a learner’s “current” understanding of a word, so too
will concepts continually evolve with each new use. For this reason, it is critical that
learning occur in realistic settings and that the selected learning tasks be relevant to the
students’ lived experience.
The goal of instruction is not to ensure that individuals know particular facts but rather that
they elaborate on and interpret information. “Understanding is developed through
continued, situated use … and does not crystallize into a categorical definition” that can be
called up from memory (Brown et al., 1989, p. 33). As mentioned earlier, a concept will
continue to evolve with each new use as new situations, negotiations, and activities recast it
in a different, more densely textured form. Therefore, “memory” is always under
construction as a cumulative history of interactions. Representations of experiences are not
formalized or structured into a single piece of declarative knowledge and then stored in the
head. The emphasis is not on retrieving intact knowledge structures, but on providing
learners with the means to create novel and situation-specific understandings by
“assembling” prior knowledge from diverse sources appropriate to the problem at hand. For
example, the knowledge of “design” activities has to be used by a practitioner in too many
different ways for them all to be anticipated in advance. Constructivists emphasize the
flexible use of pre-existing knowledge rather than the recall of prepackaged schemas (Spiro,
Feltovich, Jacobson, & Coulson, 1991). Mental representations developed through task-
engagement are likely to increase the efficiency with which subsequent tasks are performed
to the extent that parts of the environment remain the same: “Recurring features of the
environment may thus afford recurring sequences of actions” (Brown et al., 1989, p. 37). Memory
is not a context-independent process.
Clearly the focus of constructivism is on creating cognitive tools which reflect the wisdom of
the culture in which they are used as well as the insights and experiences of individuals.
There is no need for the mere acquisition of fixed, abstract, self-contained concepts or
details. To be successful, meaningful, and lasting, learning must include all three of these
crucial factors: activity (practice), concept (knowledge), and culture (context) (Brown et al.,
1989).
The constructivist view does not accept the assumption that types of learning can be
identified independent of the content and the context of learning (Bednar et al., 1991).
Constructivists believe that it is impossible to isolate units of information or divide up
knowledge domains according to a hierarchical analysis of relationships. Although the
emphasis on performance and instruction has proven effective in teaching basic skills in
relatively structured knowledge domains, much of what needs to be learned involves
advanced knowledge in ill-structured domains. Jonassen (1991a) has described three stages
of knowledge acquisition (introductory, advanced, and expert) and argues that constructive
learning environments are most effective for the stage of advanced knowledge acquisition,
where initial misconceptions and biases acquired during the introductory stage can be
discovered, negotiated, and if necessary, modified and/or removed. Jonassen agrees that
introductory knowledge acquisition is better supported by more objectivistic approaches
(behavioral and/or cognitive) but suggests a transition to constructivistic approaches as
learners acquire more knowledge which provides them with the conceptual power needed to
deal with complex and ill-structured problems.
The constructivist designer specifies instructional methods and strategies that will assist
learners in actively exploring complex topics/environments and that will move them into
thinking in a given content area as an expert user of that domain might think. Knowledge is
not abstract but is linked to the context under study and to the experiences that the
participants bring to the context. As such, learners are encouraged to construct their own
understandings and then to validate, through social negotiation, these new perspectives.
Content is not prespecified; information from many sources is essential. For example, a
typical constructivist’s goal would not be to teach novice ID students straight facts about
instructional design, but to prepare students to use ID facts as an instructional designer
might use them. As such, performance objectives are not related so much to the content as
they are to the processes of construction.
Some of the specific strategies utilized by constructivists include situating tasks in real-
world contexts, use of cognitive apprenticeships (modeling and coaching a student toward
expert performance), presentation of multiple perspectives (collaborative learning to
develop and share alternative views), social negotiation (debate, discussion,
evidence-giving), use of examples as real “slices of life,” reflective awareness, and providing
considerable guidance on the use of constructive processes.
The following are several specific assumptions or principles from the constructivist position
that have direct relevance for the instructional designer (possible ID applications are listed
in italics and brackets following the listed principle):
   1. An emphasis on the identification of the context in which the skills will be learned and
      subsequently applied [anchoring learning in meaningful contexts].
   2. An emphasis on learner control and the capability of the learner to manipulate
      information [actively using what is learned].
   3. The need for information to be presented in a variety of different ways [revisiting
      content at different times, in rearranged contexts, for different purposes, and from
      different conceptual perspectives].
   4. Supporting the use of problem-solving skills that allow learners to go “beyond the
      information given” [developing pattern-recognition skills, presenting alternative ways
      of representing problems].
   5. Assessment focused on transfer of knowledge and skills [presenting new problems and
      situations that differ from the conditions of the initial instruction].
Constructivist instruction thus encourages learners to bring multiple perspectives to
bear on a particular problem and to arrive at self-chosen positions to which they can
commit themselves, while realizing the basis of other views with which they may disagree
(Cunningham, 1991, p. 14).
Even though the emphasis is on learner construction, the instructional designer/teacher’s
role is still critical (Reigeluth, 1989). Here the tasks of the designer are two-fold: (1) to
instruct the student on how to construct meaning, as well as how to effectively monitor,
evaluate, and update those constructions; and (2) to align and design experiences for the
learner so that authentic, relevant contexts can be experienced.
Although constructivist approaches are used quite frequently in the preparation of lawyers,
doctors, architects, and businessmen through the use of apprenticeships and on-the-job
training, they are typically not applied in the educational arena (Resnick, 1987). If they
were, however, a student placed in the hands of a constructivist would likely be immersed in
an “apprenticeship” experience. For example, a novice instructional design student who
desires to learn about needs assessment would be placed in a situation that requires such an
assessment to be completed. Through the modeling and coaching of experts involved in
authentic cases, the novice designer would experience the process embedded in the true
context of an actual problem situation. Over time, several additional situations would be
experienced by the student, all requiring similar needs assessment abilities. Each
experience would serve to build on and adapt that which has been previously experienced
and constructed. As the student gained more confidence and experience, (s)he would move
into a collaborative phase of learning where discussion becomes crucial. By talking with
others (peers, advanced students, professors, and designers), students become better able
to articulate their own understandings of the needs assessment process. As they uncover
their naive theories, they begin to see such activities in a new light, which guides them
towards conceptual reframing (learning). Students gain familiarity with analysis and action
in complex situations and consequently begin to expand their horizons: they encounter
relevant books, attend conferences and seminars, discuss issues with other students, and
use their knowledge to interpret numerous situations around them (not only related to
specific design issues). Not only have the learners been involved in different types of
learning as they moved from being novices to “budding experts,” but the nature of the
learning process has changed as well.
General Discussion
It is apparent that students exposed to the three instructional approaches described in the
examples above would gain different competencies. This leads instructors and designers to
ask two significant questions: Is there a single “best” approach? And is one approach more
efficient than the others? Given that learning is a complex, drawn-out process that seems to
be strongly influenced by one’s prior knowledge, perhaps the best answer to these questions
is “it depends.” Because learning is influenced by many factors from many sources, the
learning process itself is constantly changing, both in nature and diversity, as it progresses
(Shuell, 1990). What might be most effective for novice learners encountering a complex
               Foundations of Learning and Instructional Design Technology
body of knowledge for the first time would not be effective, efficient, or stimulating for a
learner who is more familiar with the content. Typically, one does not teach facts the same
way that concepts or problem-solving are taught; likewise, one teaches differently
depending on the proficiency level of the learners involved. Both the instructional strategies
employed and the content addressed (in both depth and breadth) would vary based on the
level of the learners.
So how does a designer facilitate a proper match between learner, content, and strategies?
Consider, first of all, how learners’ knowledge changes as they become more familiar with a
given content. As people acquire more experience with a given content, they progress along
a low-to-high knowledge continuum from 1) being able to recognize and apply the standard
rules, facts, and operations of a profession (knowing what), to 2) thinking like a professional
to extrapolate from these general rules to particular, problematic cases (knowing how), to 3)
developing and testing new forms of understanding and actions when familiar categories
and ways of thinking fail (reflection-in-action) (Schon, 1987). In a sense, the points along
this continuum mirror the points of the learning theory continuum described earlier.
Depending on where the learners “sit” on the continuum in terms of the development of
their professional knowledge (knowing what vs. knowing how vs. reflection-in-action), the
most appropriate instructional approach for advancing the learners’ knowledge at that
particular level would be the one advocated by the theory that corresponds to that point on
the continuum. That is, a behavioral approach can effectively facilitate mastery of the
content of a profession (knowing what); cognitive strategies are useful in teaching problem-
solving tactics where defined facts and rules are applied in unfamiliar situations (knowing
how); and constructivist strategies are especially suited to dealing with ill-defined problems
through reflection-in-action.
A second consideration depends upon the requirements of the task to be learned. Based on
the level of cognitive processing required, strategies from different theoretical perspectives
may be needed. For example, tasks requiring a low degree of processing (e.g., basic paired
associations, discriminations, rote memorization) seem to be facilitated by strategies most
frequently associated with a behavioral outlook (e.g., stimulus-response, contiguity of
feedback/reinforcement). Tasks requiring an increased level of processing (e.g.,
classifications, rule or procedural executions) are primarily associated with strategies
having a stronger cognitive emphasis (e.g., schematic organization, analogical reasoning,
algorithmic problem solving). Tasks demanding high levels of processing (e.g., heuristic
problem solving, personal selection and monitoring of cognitive strategies) are frequently
best learned with strategies advanced by the constructivist perspective (e.g., situated
learning, cognitive apprenticeships, social negotiation).
We believe that the critical question instructional designers must ask is not “Which is the
best theory?” but “Which theory is the most effective in fostering mastery of specific tasks
by specific learners?” Before selecting strategies, designers must consider both the
learners and the task. Figure 1 attempts to depict these two continua
(learners’ level of knowledge and cognitive processing demands) and to illustrate the degree
to which strategies offered by each of the theoretical perspectives appear applicable. The
figure is useful in demonstrating: (a) that the strategies promoted by the different
perspectives overlap in certain instances (i.e., one strategy may be relevant for each of the
different perspectives, given the proper amount of prior knowledge and the corresponding
amount of cognitive processing), and (b) that strategies are concentrated along different
points of the continua due to the unique focus of each of the learning theories. This means
that when integrating any strategies into the instructional design process, the nature of the
learning task (i.e., the level of cognitive processing required) and the proficiency level of the
learners involved must both be considered before selecting one approach over another.
Depending on the demands of the task and where the learners are in terms of the content to
be delivered/discovered, different strategies based on different theories appear to be
necessary. Powerful frameworks for instruction have been developed by designers inspired
by each of these perspectives. In fact, successful instructional practices have features that
are supported by virtually all three perspectives (e.g., active participation and interaction,
practice and feedback).
For this reason, we have consciously chosen not to advocate one theory over the others, but
to stress instead the usefulness of being well versed in each. This is not to suggest that one
should work without a theory, but rather that one must be able to intelligently choose, on
the basis of information gathered about the learners’ present level of competence and the
type of learning task, the appropriate methods for achieving optimal instructional outcomes
in that situation.
As stated by Smith and Ragan (1993, p. viii): “Reasoned and validated theoretical
eclecticism has been a key strength of our field because no single theoretical base provides
complete prescriptive principles for the entire design process.” Some of the most crucial
design tasks involve being able to decide which strategy to use, for what content, for which
students, and at what point during the instruction. Knowledge of this sort is an example of
conditional knowledge, where “thinking like” a designer becomes a necessary competency.
It should be noted however, that to be an eclectic, one must know a lot, not a little, about
the theories being combined. A thorough understanding of the learning theories presented
above seems to be essential for professional designers who must constantly make decisions
for which no design model provides precise rules. Being knowledgeable about each of these
theories provides designers with the flexibility needed to be spontaneous and creative when
a first attempt doesn’t work or when they find themselves limited by time, budget, and/or
personnel constraints. The practitioner cannot afford to ignore any theories that might
provide practical implications. Given the myriad of potential design situations, the
designer’s “best” approach may not ever be identical to any previous approach, but will
truly “depend upon the context.” This type of instructional “cherry-picking” has been termed
“systematic eclecticism” and has had a great deal of support in the instructional design
literature (Snelbecker, 1989).
And to this we would add that we also need adaptive learners who are able to function well
when optimal conditions do not exist, when situations are unpredictable and task demands
change, when the problems are messy and ill-formed and the solutions depend on
inventiveness, improvisation, discussion, and social negotiation.
References
Bednar, A. K., Cunningham, D., Duffy, T. M., & Perry, J. D. (1991). Theory into practice: How
do we link? In G. J. Anglin (Ed.), Instructional technology: Past, present, and future.
Englewood, CO: Libraries Unlimited.
Bower, G. H., & Hilgard, E. R. (1981). Theories of learning (5th ed.). Englewood Cliffs, NJ:
Prentice-Hall.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning.
Educational Researcher, 18(1), 32-42.
Bruner, J. S. (1971). The process of education revisited. Phi Delta Kappan, 53, 18-21.
Duffy, T. M., & Jonassen, D. (1991). Constructivism: New implications for instructional
technology? Educational Technology, 31(5), 3-12.
Hulse, S. H., Egeth, H., & Deese, J. (1980). The psychology of learning (5th ed.). New York:
McGraw-Hill.
Merrill, M. D., Kowalis, T., & Wilson, B. G. (1981). Instructional design in transition. In F. H.
Farley, & N. J. Gordon (Eds.), Psychology and education: The state of the union (pp.
298-348). Berkeley: McCutchan.
Reigeluth (Ed.), Instructional theories in action (pp. 3-36). Hillsdale, NJ: Lawrence Erlbaum
Associates.
Reigeluth, C. M. (1989). Educational technology at the crossroads: New mindsets and new
directions. Educational Technology Research and Development, 37(1), 67-80.
Resnick, L. B. (1987). Learning in school and out. Educational Researcher, 16(9), 13-20.
Richey, R. C. (1986). The theoretical and conceptual bases of instructional design. New
York: Nichols.
Smith, P. L., & Ragan, T. J. (1993). Instructional design. New York: Macmillan.
Spiro, R. J., Feltovich, P. J., Jacobson, M. J., & Coulson, R. L. (1991). Cognitive flexibility,
constructivism, and hypertext: Random access instruction for advanced knowledge
acquisition in ill-structured domains. Educational Technology, 31(5), 24-33.
Stepich, D. A., & Newby, T. J. (1988). Analogical instruction within the information
processing paradigm: Effective means to facilitate learning. Instructional Science, 17,
129-144.
Tyler, R. W. (1978). How schools utilize educational research and development. In R. Glaser
(Ed.), Research and development and school change. Hillsdale, NJ: Lawrence Erlbaum.
Warries, E. (1990). Theory and the systematic design of instruction. In S. Dijkstra, B. van
Hout Wolters, & P. C. van der Sijde, (Eds.), Research on instruction: Design and effects (pp.
1-19). Englewood Cliffs, NJ: Educational Technology.
West, C. K., Farmer, J. A., & Wolff, P. M. (1991). Instructional design: Implications from
cognitive science. Englewood Cliffs, NJ: Prentice Hall.
                          Peggy A. Ertmer
                           Timothy Newby
When considering theories of learning, LIDT professionals should also consider sociocultural
perspectives and the role that culture, interaction, and collaboration have on quality
learning. Modern social learning theories stem from the work of Russian psychologist
Vygotsky, who produced his ideas between 1924 and 1934 as a reaction to existing
conflicting approaches in psychology (Kozulin, 1990). Vygotsky’s ideas are most recognized
for identifying the role social interactions and culture play in the development of
higher-order thinking skills, and his theory is especially valuable for the insights it provides about the
dynamic “interdependence between individual and social processes in the construction of
knowledge” (John-Steiner & Mahn, 1996, p. 192). Vygotsky’s views are often considered
primarily developmental theories, focusing on qualitative changes in behavior over time
in an attempt to explain unseen processes of the development of thought, language, and
higher-order thinking skills. Although Vygotsky’s intent was mainly to understand higher
psychological processes in children, his ideas have many implications and practical
applications for learners of all ages.
Interpretations of Vygotsky’s and other sociocultural scholars’ work have led to diverse
perspectives and a variety of new approaches to education. Today, sociocultural theory and
related approaches are widely recognized and accepted in psychology and education and
are especially valued in the field of applied linguistics because of its underlying notion that
language and thought are connected. Sociocultural theory is also becoming increasingly
influential in the field of instructional design. In this chapter, we first review some of the
fundamental principles of sociocultural theory of learning. We then suggest design
implications for learning, teaching, and education in general. Finally, we consider how
sociocultural theories of learning should influence instructional design.
Three themes are often identified with Vygotsky’s ideas of sociocultural learning: (1) human
development and learning originate in social, historical, and cultural interactions; (2) the use
of psychological tools, particularly language, mediates the development of higher mental
functions; and (3) learning occurs within the Zone of Proximal Development. While we
discuss these themes separately, they are closely interrelated.
Rogoff (1990) refers to this process as guided participation, where a learner actively
acquires new culturally valuable skills and capabilities through a meaningful, collaborative
activity with an assisting, more experienced other. It is critical to note that these culturally
mediated functions are viewed as being embedded in sociocultural activities rather than
being self-contained. Development is a “transformation of participation in a sociocultural
activity” not a transmission of discrete cultural knowledge or skills (Matusov, 2015, p. 315).
The processes of guided participation reveal the Vygotskian view of cognitive development
“as the transformation of socially shared activities into internalized processes,” or an act of
enculturation, thus rejecting the Cartesian dichotomy between the internal and the external
(John-Steiner & Mahn, 1996, p. 192).
This Vygotskian notion of social learning stands in contrast to the more widely known ideas
of Piaget, whose theory of cognitive development assumes that development proceeds
through biologically determined stages, originates in the individual, and precedes learning.
This difference in assumptions has significant implications for the design and development of
learning experiences. If we believe as Piaget did that development precedes learning, then
we will make sure that new concepts and problems are not introduced until learners have
developed innate capabilities to understand them. On the other hand, if we believe as
Vygotsky did that learning drives development and that development occurs as we learn a
variety of concepts and principles, recognizing their applicability to new tasks and new
situations, then our instructional design will look very different. We will ensure that
instructional activities are structured in ways that promote individual student learning and
development. We will know that it is the process of learning that enables achievement of
higher levels of development, which in turn affects “readiness to learn a new concept”
(Miller, 2011, p. 197).
Another implication based on Vygotskian views of learning is recognizing that there are
individual differences as well as cross-cultural differences in learning and development. As
instructional designers, we should be more sensitive to diversity in learners and recognize
that a large amount of research has been done on white, middle-class individuals associated
with Western tradition, and the resulting understanding of development and learning often
incorrectly assumes universality. Recognizing that “ideal thinking and behavior may differ
for different cultures” and that “different historical and cultural circumstances may
encourage different developmental routes to any given developmental endpoint” may
prevent incorrect universalist views of all individuals and allow for environments that value
diversity as a resource (Miller, 2011, p. 198).
Vygotsky viewed language as the ultimate collection of symbols and tools that emerge
within a culture. It is potentially the greatest tool at our disposal, a form of a symbolic
mediation that plays two critical roles in development: to communicate with others and to
construct meaning.
Learning occurs within the zone of proximal development. Probably the most widely
applied sociocultural concept in the design of learning experiences is the concept of the
Zone of Proximal Development (ZPD). Vygotsky (1978) defined ZPD as “the distance
between the actual developmental level as determined by independent problem solving and
the level of potential development as determined through problem solving under adult
guidance or in collaboration with more capable peers” (p. 86). He believed that learning
should be matched with an individual’s developmental level and that in order to understand
the connection between development and learning, it is necessary to distinguish the actual
and the potential levels of development. Learning and development are best understood
when the focus is on processes rather than their products. He considered the ZPD to be a
better and more dynamic indicator of cognitive development since it reflects what the
learner is in the process of learning as compared to merely measuring what the learner can
accomplish independently, which reflects what has already been learned (Vygotsky, 1978).
Vygotsky (1978) argued that productive interactions align instruction toward the ZPD, and
that providing instruction and guidance within the ZPD allows a learner to develop skills and
strategies they will eventually apply on their own in other situations. This highlights
the importance of instructional decisions related to types and quality of interactions in
designing effective learning experiences. Whether these interactions occur with a more
experienced other or another learner with similar skills, there should always be a degree of
common understanding about the task, described as intersubjectivity. The partners should
have a sense of shared authority over the process, and they should actively collaborate to
co-construct understanding. It is important to note that the ZPD should be viewed broadly as
“any situation in which some activity is leading individuals beyond their current level of
functioning,” applicable not only to instructional activities but to play, work, and many other
activities (Miller, 2011, p. 178).
The notion of instructional scaffolding is closely related to the idea of ZPD. Scaffolding is the
set of tools or actions that help a learner successfully complete a task within ZPD.
Scaffolding typically involves a mutual and dynamic interaction in which both the
learner and the one providing the scaffold influence each other and adjust their behavior as
they collaborate. The types and extent of supports provided in a learning experience are
based on performance, and the scaffold is gradually phased out (Miller, 2011). The expert
motivates and guides the learner by providing just enough assistance, modeling, and
highlighting critical features of the task as well as continually evaluating and adjusting
supports as needed. Additionally, providing opportunities for reflection as part of the
learning experience further promotes more complex, meaningful, and lasting learning
experiences. In the case of digital learning experiences, scaffolds are not necessarily
provided by individuals, but may be embedded into the experience.
Ideas such as ZPD and scaffolding bring to light a fundamentally different view of an
instructor who serves more as a facilitator of learning rather than a fount of knowledge.
Likewise, the learner takes on more responsibilities such as determining their learning
goals, becoming a resource of knowledge for peers, and actively collaborating in the
learning process (Grabinger, Aplin, & Ponnappa-Brenner, 2007). This shift in roles promotes
individualized, differentiated, and learner-centered types of instruction, and when
accompanied by effective pedagogical practices, it has the potential to become a powerful
alternative for reforming current educational systems and creating environments where
many different individuals develop deep understanding of important subjects (Watson &
Reigeluth, 2016).
Sociocultural theory has several widely recognized strengths. First, it emphasizes the
broader social, cultural, and historical context of any human activity. It does not view
individuals as isolated entities; rather, it provides a richer perspective, focusing on the fluid
boundary between self and others. It portrays the dynamic of a learner acquiring knowledge
and skills from the society and then in turn shaping their environment (Miller, 2011).
Second, sociocultural theory is sensitive to individual and cross-cultural diversity. In
contrast to many other universalist theories, sociocultural theory acknowledges both
differences in individuals within a culture and differences in individuals across cultures. It
recognizes that “different historical and cultural circumstances may encourage different
developmental routes to any given developmental endpoint” depending on particular social
or physical circumstances and tools available (Miller, 2011, p. 198). Finally, sociocultural
theory greatly contributes to our theoretical understanding of cognitive development by
integrating the notion of learning and development. The idea of learning driving
development rather than being determined by a developmental level of the learner
fundamentally changes our understanding of the learning process and has significant
instructional and educational implications (Miller, 2011).
There are also limitations to the sociocultural perspective. The first limitation is related to
Vygotsky’s premature death, as many of his theories remained incomplete. Furthermore, his
work was largely unknown until fairly recently due to political reasons and issues with
translation. The second major limitation is associated with the vagueness of the ZPD.
Individuals may have wide or narrow zones, which may be both desirable and undesirable,
depending on the circumstances. Knowing only the width of the zone “does not provide an
accurate picture of [the learner’s] learning, ability, style of learning, and current level of
development compared to other children of the same age and degree of motivation” (Miller,
2011, p. 198). Additionally, little is known about whether a child’s zone is comparable
across different learning domains and with different individuals, or whether the size of the
zone changes over time. There is also no common metric scale for measuring the ZPD. Finally,
Rogoff (1990) pointed out that Vygotsky’s theories may not be as relevant to all cultures as
originally thought. She provides an example of scaffolding being heavily dependent on
verbal instruction and thus not equally effective in all cultures for all types of learning.
The notion of social origins of learning, the interrelationship of language and thought, and
the Zone of Proximal Development are Vygotsky’s most important contributions. The
practical applications of sociocultural theory are also significant: they emphasize creating
learner-centered instructional environments where learning by discovery, inquiry, active
problem solving, and critical thinking are fostered through collaboration with experts and
peers in communities of learners, and they encourage self-directed lifelong learning habits.
Presenting authentic and cognitively challenging tasks within a context of collaborative
activities, scaffolding learner’s efforts by providing a structure and support to accomplish
complex tasks, and providing opportunities for authentic and dynamic assessment are all
important aspects of this approach. Sociocultural principles can be applied in effective and
meaningful ways to design instruction across the curriculum for learners of different ages
and skill levels, and they can be effectively integrated using a wide range of
technologies and learning environments. The challenge remains for educators and
instructional designers to elevate our practices from efficient, systemic approaches for
teaching and instructional design to a focus on individual learners and effective pedagogical
practices to develop empowered learners ready to successfully negotiate the rapidly
changing era of information. Technology is at our fingertips, and it is up to us to
competently implement its unique affordances to promote new ways to educate and support
deep, meaningful, and self-directed learning. Grounding our practices in sociocultural
theory can significantly aid our efforts.
Sociocultural theory and related ideas provide a valuable contribution to a focus on the
learner within their social, cultural, and historical context and also offer sound pedagogical
solutions and strategies that facilitate development of critical thinking and lifelong learning
(Grabinger, Aplin, & Ponnappa-Brenner, 2007). The American Psychological Association’s
Learner-Centered Principles (APA Work Group, 1997, p. 6) stated the following about the
influence of social interactions on individual learners: “In interactive and collaborative instructional contexts,
individuals have an opportunity for perspective taking and reflective thinking that may lead
to higher levels of cognitive, social, and moral development, as well as self-esteem.”
Most instructional design models take into consideration a common or isolated concept of
the learner, but recently, a strong call has been issued for a complete shift in our education
and instructional design approaches to reflect our society’s changing educational needs
(Watson & Reigeluth, 2016). More contemporary design approaches, such as Universal
Design for Learning, recognize that every learner is unique and influenced by his or her
embedded context. These approaches strive to provide challenging and engaging curricula
for diverse learners while also designing for the social influences that surround them.
discourse, which is more than cooperative learning. This is visible, for example, in the ideas
of situated cognition (situated learning) and cognitive apprenticeships.
Brown, Collins, and Duguid (1989), seminal authors on situated cognition, contended that
“activity and situations are integral to cognition and learning” (p. 32). By socially interacting
with others in real life contexts, learning occurs on deeper levels. They explained that
“people who use tools actively rather than just acquire them, by contrast, build an
increasingly rich implicit understanding of the world in which they use the tools and of the
tools themselves” (Brown, Collins, & Duguid, 1989, p. 33).
This implicit understanding of the world around them influences how learners understand
and respond to instruction. In one study, Carraher, Carraher, and Schliemann (1985) studied
Brazilian children solving mathematics problems while selling produce. In that setting, the
context and artifacts positively influenced the children’s ability to work through
mathematics problems, use appropriate strategies, and find correct solutions.
However, these children failed to solve the same problems when they were presented out of
context in conventional mathematical form. Lave (1988) studied tailors in Liberia and found
that while the tailors were adept at solving mathematics problems embedded in their daily
work, they could not apply those same skills to novel contexts. In addition, Brill (2001)
synthesized the work of Collins (1988) and identified four benefits of using situated
cognition as a theory guiding teaching and instructional design: (1) learners develop the
ability to apply knowledge; (2) learners become effective problem solvers after learning in
novel and diverse settings; (3) learners are able to see the implications of knowledge; and
(4) learners receive support in organizing knowledge in ways to use later.
     Shared Repertoire: Finally, as part of its practice, the community produces a set of
     communal resources, which is termed their shared repertoire.
Collaborative environments that encourage learners to think critically and apply knowledge
and skills are a central component of social learning theories. As educators strive to create
cooperative learning experiences for students, authentic activities and anchored instruction
promote sociocultural perspectives of learning by encouraging the contextualization of
learning in the simulation of practical problems, the development of cultural skills through
guided participation in collaborative groups, and the use of language to both communicate
and internalize learning. The implementation of collaborative, authentic activities in
learning experiences typically involves learners collaborating to solve problems embedded
in real-life situations (Reeves et al., 2002), reflecting learning through situated cognition.
Teachers, trainers, and facilitators guide and support these collaborative efforts by
scaffolding learning with tools and resources, asking questions that support learners’
understanding, and helping learners to make sense of the problems.
Authentic activities contextualize learning and allow for a diverse application of skills and
knowledge within real-world scenarios. In the literature these authentic activities have
sometimes been referred to as anchors or the process of anchored instruction, which
focuses learners on developing knowledge and skills through collaborative problem solving
experiences (Bransford, Sherwood, Hasselbring, Kinzer, & Williams, 1990). This type of
learning allows students to engage in problem solving within learning contexts that provide
for connection-building across the curriculum in order to develop meaning (Bransford et al.,
1990). Typically presented in a narrative format, anchored learning begins with the
“anchor,” or story in which the problem is set, and uses multimedia outlets to allow students
to explore the problem and develop multiple solutions (Bransford et al., 1990). As students
collaborate and engage with the material, the teacher becomes a coach and guides students
along the process. Through both authentic activities and anchored instruction, learning
takes place in a social setting, encouraging students to develop, share, and implement
creative solutions to complex problems as collaborative teams.
Teachers supported this learning by posing questions and facilitating discussions of the
information in the adventure as well as the mathematics concepts embedded in the situation.
Research from the project indicated that learners showed greater understanding of how to
solve mathematics problems than their peers who had not participated (Hickey, Moore, &
Pellegrino, 2001).
Project-based Learning
Project-based learning engages learners in collaborative situations where they must address
a complex problem or real-world challenge. According to Vygotsky’s ideas, this collaborative
learning style naturally fosters students’ development of higher-order thinking skills.
Project-based learning environments have been empirically linked to K-12 students gaining
a deeper understanding of content and greater learner engagement compared to
more traditional instruction (Condliffe, Visher, Bangser, Drohojowska, & Saco, 2016;
Fogleman, McNeill, & Krajcik, 2011).
This instructional method derived from problem-based learning, which was first introduced
at McMaster University in Ontario, Canada in 1969 (O’Grady, Yew, Goh, & Schmidt, 2012,
p. 21). Although the two approaches are alike, they traditionally differ in scope and size:
unlike problem-based learning, project-based learning requires students to work together to
master several learning objectives concurrently, applying newly acquired skills and
knowledge to a series of embedded problems (Capraro, Capraro, & Morgan, 2013).
Due to the complexity of these situations, most enactments of project-based learning involve
learners working in teams on these tasks (Condliffe et al., 2016). Collaborative project-based
work also teaches students how to prioritize and apportion tasks within the project (Garcia,
2017), and it promotes student-initiated inquiry, scaffolding, and soft-skill development in
areas such as collaboration and communication.
In theory, the flipped classroom model is an excellent way to maximize social learning under
the facilitation of a teacher. In practice, however, it does have some drawbacks, including
the additional amount of time teachers must invest in preparing the video assignments,
ensuring all students have access to the videos outside of school, and making sure all
students complete their video-lecture assignments prior to class. The research literature
indicates that there are evidence-based solutions to several of these drawbacks such as
offering student incentives, giving quizzes and student feedback during the videos, and
devoting some in-class time to check for student understanding (Educause, 2012; Brame,
2013).
Research evidence has indicated significant student learning gains in the flipped classroom
model (Brame, 2013), emphasizing the value of learning in a social context (e.g., discussion,
project collaboration, debate, student-led inquiry, etc.). Not only is social learning
maximized in a flipped classroom, but the levels of learning are also reversed in comparison
to a traditional classroom: students engage in higher levels of cognitive work (in terms of
Bloom’s revised taxonomy) among their peers in class while engaging in lower levels of
learning on their own outside of class (Brame, 2013).
By attending to students’ interests and problem-solving skills, creating opportunities for
collaboration and reflection, and adapting to individual and cultural needs, educators can
facilitate authentic experiences and learning communities for their students in these online
spaces (Bonk & Cunningham, 1998).
With an array of online resources available, there are a variety of avenues through which
students can virtually collaborate. Deal (2009) proposed a process through which learning
occurs in an online collaborative space: communication, team definition and participants,
project management, resource management, co-creation and ideation, consensus building,
and presenting and archiving. Initially, students must communicate and organize roles to
complete an objective, which can be accomplished through online resources such as email,
instant messaging, virtual conferencing (such as Skype or Google Hangouts), or discussion
boards (Deal, 2009). In an online collaborative environment, students must also find ways to
share and establish ideas through project management, resource management, and co-
creation programs, such as Google Drive, Google Docs, wikis, and virtual whiteboards (Deal,
2009). Finally, once the project has been organized and reaches its final stage, students can use
online resources to create a final product, such as a webinar, video, or slideshow.
Throughout all components of the online collaboration process, teachers have opportunities
for assessment, including evaluating the process, final product, or specific outcomes.
Ultimately, as students make use of the variety of online resources to navigate a meaningful
learning activity as prescribed by an instructor, social learning provides for the refinement
of both content knowledge and critical thinking skills (Stahl, Koschmann, & Suthers, 2006;
Scardamalia & Bereiter, 1994).
For learners of all ages, establishing roles provides support to students to facilitate the
completion of learning activities (Antil, Jenkins, & Watkins, 1998). Kagan (1999) developed the
acronym PIES to represent elements of collaborative learning: positive interdependence,
individual accountability, equal participation, and simultaneous interaction. Positive
interdependence refers to the idea that the potential work that can be done by the group is
greater than if each individual in the group worked alone. Individual accountability means
that learners are each responsible for some aspects of the work. Equal participation refers
to relatively fair shares of the work required. Simultaneous interaction refers to the idea
that learners are working together at the same time on the project instead of a jigsaw
approach where learners work on their own on separate pieces that are compiled at the end
of the work.
For instructional designers who are creating social learning experiences for adults, the
tasks must be complex enough to foster positive interdependence and hold individuals
accountable. This may include grouping individuals from different backgrounds. If
employees of a bank were participating in training on new financial guidelines, an
instructional designer may design learning activities in which teams encompassed a
mortgage consultant, a retirement consultant, a manager, and a teller. The scenarios
included in the training would vary so as to require the expertise and background of each to
be used in discussing and solving the problem.
K-12 teachers should intentionally establish collaborative learning experiences for students
that involve projects, authentic tasks, and other activities embedded in contexts. In order to
facilitate collaboration, creating learning teams or groups in which students have specific
roles is suggested. For example, in an elementary school classroom, a teacher may put
learners in groups and assign the following roles:
      Leader/facilitator: Individual in charge of organizing the group and keeping the group
      on task.
      Recorder: Individual who records and organizes notes, information, and data.
      Timekeeper: Individual who keeps time and makes sure things are completed in a
      timely manner.
      Spokesperson: Individual in charge of finalizing the project and leading the
      presentation.
The intentional establishment of learning teams is fundamental for both K-12 teachers and
instructional designers in facilitating social learning experiences.
As stated previously, social perspectives of learning embrace the idea of situated cognition
that learning is embedded within specific contexts. For K-12 teachers, the challenge is
identifying authentic contexts for learners. Culture, geography, and students’ backgrounds
clearly must be taken into consideration when identifying contexts for social learning
experiences. Students on the coast of Florida have authentic contexts that are different from
those in a rural town in the midwestern United States. As a result, the development of
curriculum, instructional materials, and resources for these types of experiences cannot be
a one-size-fits-all approach, and should provide opportunities for teachers to modify the
activities to ensure that they are authentic to their students.
Further, it is critical to make sure that learning experiences provide opportunities for
learners to work within an authentic context but also provide generalizations or
opportunities to apply their knowledge and skills in other settings. For example, after high
school students study economic concepts of supply and demand in the context of
researching the prices of brands of clothes popular in that area, students should have
opportunities to apply those concepts in a new context.
                                             136
               Foundations of Learning and Instructional Design Technology
For an instructional designer, an authentic setting is a realistic scenario the learners may
experience. Instructional designers typically design training for individuals that is directly
related to their work. For instance, creating training for lifeguards about CPR and first aid
certification could include cases and scenarios that require multiple individuals to
participate and collaboratively problem solve. This could include scenarios that require one
individual to role-play someone who is choking while a group identifies how to remove the
object causing the choking. During the learning segment, individuals take turns role-playing
and collaborating to identify and solve various problems.
Scaffolding Learners
In social learning experiences both K-12 teachers and instructional designers must create
learning activities that include scaffolds and supports for learners. Social learning
experiences are guided by teachers or learning facilitators without significant direct
teaching and presentation. This does not mean that the teacher is absent or off in the
corner; rather, they should leverage strategies such as posing questions, providing
examples, or supporting students’ collaboration to support these learning experiences.
Scaffolding can occur in a few ways. First, teachers can serve as a scaffold by providing
initial guidance or questions to help students launch into the activity. As the activity
continues, teachers can decrease or remove the amount of support that they provide or limit
their support to specific instances, such as when learners are stuck and unable to continue
with the task. An instructional designer may design training for salesmen in which learners
collaborate to learn about new strategies and receive ongoing feedback from the facilitator
and other employees. However, after time, the amount of feedback and support decreases.
Similar types of support can occur in K-12 classrooms when teachers provide feedback and
guidance early on and then withdraw the scaffolds over time. For example, in an elementary
school mathematics classroom a teacher may provide a conversion table between units of
measurement for a group project at first, and then after students have had time to work
with the measurement units take the conversion table away.
Second, teachers and facilitators can provide external scaffolds or learning tools. An
instructional designer who is training salesmen about new procedures may provide a
document and visual to help learners become familiar with the new procedures at the
beginning of their learning experience, but after collaborative activities and feedback, the
supporting documents may be removed, requiring learners to rely on each other or their
memory. Likewise for K-12 teachers in a middle school science classroom, students studying
landforms may be given access to an anchor chart or visual of different types of landforms
initially to help them identify and classify landforms that they are learning about. After time,
however, the teacher may remove the scaffold so that learners must rely on knowledge and
each other as they lean on skills they have developed together. The amount of scaffolding
that teachers provide requires a fine balance between over-guiding learners on the one
hand and letting them falter unproductively on the other (CTGV, 1997).
There are a variety of ways in which technology can support the use of social learning
theories in the classroom. Through current and emerging online collaborative spaces, such
as Google, Skype, wikis, and more, as well as hands-on collaborative technology in the
classroom, such as SMART Tables and iPads, students have robust opportunities to
experience meaningful collaborative learning in both physical and virtual settings that
embody the tenets of sociocultural learning. Different technological and online tools can
assist with greater communication strategies, more realistic simulations of real-world
problem scenarios, and even greater flexibility when seeking to scaffold instruction within
students’ ZPD. Embracing the use of technology within collaborative learning can also foster
a more equal distribution of voices as compared to in-person groupings (Deal, 2009),
potentially providing greater opportunity to ensure active participation among all
students. Through using technology to support the implementation of social learning
theories in the classroom, students experience collaboration while refining 21st century
skills.
While the array of technology available to support social learning is beneficial, the volume of
resources available for online and in-person technology-based collaboration may be
overwhelming to some groups of students. Teachers should therefore consider how much
scaffolding a particular class needs to ensure technology is being used productively. By
providing students with useful resources in an online environment or being explicit about
technology use within a physical classroom, teachers can help students focus on the actual
problem-solving task rather than filtering through different platforms.
References
Antil, L., Jenkins, J., & Watkins, S. (1998). Cooperative learning: Prevalence,
conceptualizations, and the relation between research and practice. American Educational
Research Journal, 35(3), 419-454.
Bonk, C. J., & Cunningham, D. J. (1998). Searching for learner-centered, constructivist, and
sociocultural components of collaborative educational learning tools. In C. J. Bonk, & K. S.
King (Eds.), Electronic collaborators: Learner-centered technologies for literacy,
apprenticeship, and discourse (pp. 25-50). Mahwah, NJ: Erlbaum.
Brame, C. (2013). Flipping the classroom. Vanderbilt University Center for Teaching.
Retrieved from https://edtechbooks.org/-ZV
Bransford, J. D., Sherwood, R. D., Hasselbring, T. S., Kinzer, C. K., & Williams, S. M. (1990).
Anchored instruction: Why we need it and how technology can help. In D. Nix & R. Spiro
(Eds.), Cognition, education and multimedia (pp. 115-141). Hillsdale, NJ: Erlbaum
Associates.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning.
Educational Researcher, 18(1), 32-42.
Deal, A. (2009). Collaboration tools (Teaching with Technology White Paper). Carnegie
Mellon University. Retrieved from
https://www.cmu.edu/teaching/technology/whitepapers/CollaborationTools_Jan09.pdf
[https://edtechbooks.org/-GS]
EDUCAUSE Learning Initiative. (2012). 7 things you should know about…flipped classroom.
Retrieved from https://net.educause.edu/ir/library/pdf/eli7081.pdf
[https://edtechbooks.org/-yC]
Fogleman, J., McNeill, K. L., & Krajcik, J. (2011). Examining the effect of teachers’
adaptations of a middle school science inquiry-oriented curriculum unit on student learning.
Journal of Research in Science Teaching, 48(2), 149-169.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based
environment: Computer conferencing in higher education [https://edtechbooks.org/-ZVv].
The Internet and Higher Education, 2(2-3), 87-105.
Garrison, D. R., & Akyol, Z. (2012). The community of inquiry theoretical framework. In M.
G. Moore (Ed.), Handbook of distance education (3rd ed., pp. 104-120).
Grabinger, S., Aplin, C., & Ponnappa-Brenner, G. (2007). Instructional design for
sociocultural learning environments. E-Journal of Instructional Science and Technology,
10(1).
Hickey, D. T., Moore, A. L., & Pellegrino, J. (2001). The motivation and academic
consequences of elementary mathematics environments: Do constructivist innovations and
reforms make a difference? American Educational Research Journal, 38(3), 611-652.
John-Steiner, V., & Mahn, H. (1996). Sociocultural approaches to learning and development:
A Vygotskian framework. Educational Psychologist, 31(3/4), 191-206.
Lave, J. (1988). Cognition in practice: Mind, mathematics and culture in everyday life.
Cambridge: Cambridge University Press.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation.
Cambridge: Cambridge University Press.
Lou, N. & Peek, K. (2016, February 23). By the numbers: The rise of the makerspace.
Popular Science, March/April 2016. Retrieved from https://edtechbooks.org/-mU
Miller, P. (2011). Theories of developmental psychology (5th ed.). New York, NY: Worth
Publishers.
O’Grady, G., Yew, E., Goh, K. P., & Schmidt, H. (Eds.). (2012). One-day, one-problem: An
approach to problem-based learning. Springer Science & Business Media.
Reeves, T. C., Herrington, J., & Oliver, R. (2002). Authentic activities and online learning. In A.
Goody, J. Herrington, & M. Northcote (Eds.), Quality conversations: Research and
development in higher education, Volume 25 (pp. 562- 567). Jamison, ACT: HERDSA.
Stahl, G., Koschmann, T., & Suthers, D. (2006). Computer-supported collaborative learning:
An historical perspective. In R. K. Sawyer (Ed.), Cambridge handbook of the learning
sciences (pp. 409-426). Cambridge, UK: Cambridge University Press.
Vygotsky, L. S. (1986). Thought and language (A. Kozulin, Trans.). Cambridge, MA: The MIT
Press.
Suggested Citation
    Polly, D., Allman, B., Casto, A., & Norwood, J. (2018). Sociocultural Perspectives of
    Learning. In R. E. West, Foundations of Learning and Instructional Design
    Technology: The Past, Present, and Future of Learning and Instructional Design
    Technology. EdTech Books. Retrieved from
    https://edtechbooks.org/lidtfoundations/sociocultural_perspectives_of_learning
                               Drew Polly
                            Bohdana Allman
                          Amanda R. Casto
                            Jessica Norwood
                                            13
                     Learning Communities
                       How Do You Define a Community?
Editor’s note
    The following article was first published under an open license in Educational
    Technology Research and Development with the following citation:
    West, R. E. & Williams, G. (2018). I don’t think that word means what you think it
    means: A proposed framework for defining learning communities. Educational
    Technology Research and Development. Available online at
    https://edtechbooks.org/-hA.
A strong learning community “sets the ambience for life-giving and uplifting experiences
necessary to advance an individual and a whole society” (Lenning and Ebbers 1999
[https://edtechbooks.org/-MYC]); thus the learning community has been called “a key
feature of 21st century schools” (Watkins 2005 [https://edtechbooks.org/-BTu]) and a
“powerful educational practice” (Zhao and Kuh 2004 [https://edtechbooks.org/-fE]).
Lichtenstein (2005 [https://edtechbooks.org/-gk]) documented positive outcomes of student
participation in learning communities such as higher retention rates, higher grade point
averages, lower risk of academic withdrawal, increased cognitive skills and abilities, and
improved ability to adjust to college. Watkins (2005 [https://edtechbooks.org/-BTu]) pointed
to a variety of positive outcomes from emphasizing the development of community in
schools and classes, including higher student engagement, greater respect for diversity of
all students, higher intrinsic motivation, and increased learning in the areas that are most
important. In addition, Zhao and Kuh (2004 [https://edtechbooks.org/-fE]) found learning
communities associated with enhanced academic performance; integration of academic and
social experiences; gains in multiple areas of skill, competence, and knowledge; and overall
satisfaction with the college experience.
Because of the substantial learning advantages that research has found for strong learning
communities, teachers, administrators, researchers, and instructional designers must
understand how to create learning communities that provide these benefits. Researchers
and practitioners have overloaded the literature with accounts, studies, models, and
theories about how to effectively design learning communities. However, synthesizing and
interpreting this scholarship can be difficult because researchers and practitioners use
different terminology and frameworks for conceptualizing the nature of learning
communities. Consequently, many become confused about what a learning community is or
how to measure it.
In this chapter we address ways learning communities can be operationalized more clearly
so research is more effective, based on a thorough review of the literature described in our
other article (West & Williams, 2017).
In the literature, learning communities can mean a variety of things, which are certainly not
limited to face-to-face settings. Some researchers use this term to describe something very
narrow and specific, while others use it for broader groups of people interacting in diverse
ways, even though they might be dispersed through time and space. Learning communities
can be as large as a whole school, or as small as a classroom (Busher 2005
[https://edtechbooks.org/-vba]) or even a subgroup of learners from a larger cohort who
work together with a common goal to provide support and collaboration (Davies et al. 2005
[https://edtechbooks.org/-Syw]). The concept of community emerges as an ambiguous term
in many social science fields.
and confusion” (p. 217). When a concept or image is particularly fuzzy, some find it helpful
to focus on the edges (boundaries) to identify where “it” begins and where “it” ends, and
then work inward to describe the thing more explicitly. We will apply this strategy to
learning communities and seek to define a community by its boundaries.
However, researchers have different ideas about what those boundaries are (Glynn 1981
[https://edtechbooks.org/-hag]; Lenning and Ebbers 1999 [https://edtechbooks.org/-MYC];
McMillan and Chavis 1986 [https://edtechbooks.org/-EP]; Royal and Rossi 1996
[https://edtechbooks.org/-mw]) and which boundaries are most critical for defining a
learning community. In our review of the literature, we found learning community
boundaries often defined in terms of participants’ sense that they share access,
relationships, vision, or function (see Fig. 1 [https://edtechbooks.org/-yL]). Each of these
boundaries contributes in various ways to different theoretical understandings of a learning
community.
Access might have been at one point the easiest way to define a community. If people lived
close together, they were a community. If the children attended the same school or
classroom, then they were a school or class community. Some researchers and teachers
continue to believe that defining a community is that simple (for example, Kay et al., 2011
[https://edtechbooks.org/-uL]).
Being engaged in a learning community often requires more than being present either
physically or virtually. Often researchers define learning communities by their relational or
emotional boundaries: the emotional ties that bind and unify members of the community
(Blanchard et al. 2011 [https://edtechbooks.org/-Ra]). Frequently a learning community is
identified by how close or connected the members feel to each other emotionally and
whether they feel they can trust, depend on, share knowledge with, rely on, have fun with,
and enjoy high quality relationships with each other (Kensler et al. 2009
[https://edtechbooks.org/-WA]). In this way, affect is an important aspect of determining a
learning community. Often administrators or policymakers attempt to force the formation of
a community by having the members associate with each other, but the sense of community
is not discernible if the members do not build the necessary relational ties. In virtual
communities, students may feel present and feel that others are likewise discernibly
involved in the community, but still perceive a lack of emotional trust or connection.
In our review of the literature, we found what seem to be common relational characteristics
of learning communities: (1) sense of belonging, (2) interdependence or reliance among the
members, (3) trust among members, and (4) faith or trust in the shared purpose of the
community.
Belonging
Members of a community need to feel that they belong in the community, which includes
feeling like one is similar enough or somehow shares a connection to the others. Sarason
(1974 [https://edtechbooks.org/-MvQ]) gave an early argument for the psychological sense
of community, which he defined in part as the absence of a feeling of loneliness. Other
researchers have agreed that an essential characteristic of learning communities is that
students feel “connected” to each other (Baker and Pomerantz 2000
[https://edtechbooks.org/-qjV]) and that a characteristic of ineffective learning communities
is that this sense of community is not present (Lichtenstein 2005
[https://edtechbooks.org/-gk]).
Interdependence
Trust
Some researchers have listed trust as a major characteristic of learning communities (Chen
et al. 2007 [https://edtechbooks.org/-LZ]; Mayer et al. 1995 [https://edtechbooks.org/-uhV];
Rovai et al. 2004 [https://edtechbooks.org/-CR]). Booth’s (2012
[https://edtechbooks.org/-tDg]) focus on online learning communities is one example of how
trust is instrumental to the emotional strength of the learning group. “Research has
established that trust is among the key enablers for knowledge sharing in online
communities” (Booth 2012 [https://edtechbooks.org/-tDg], p. 5). Related to trust is the
feeling of being respected and valued within a community, which is often described as
essential to a successful learning community (Lichtenstein 2005
[https://edtechbooks.org/-gk]). Other authors describe this feeling of trust or respect as
feeling “safe” within the community (Baker and Pomerantz 2000
[https://edtechbooks.org/-qjV]). For example, negative or ineffective learning communities
have been characterized by conflicts or instructors who were “detached or critical of
students and unable or unwilling to help them” (Lichtenstein 2005
[https://edtechbooks.org/-gk], p. 348).
Shared Faith
These emotional boundaries not only define face-to-face learning communities, but they
define virtual communities as well—perhaps more so. Because virtual communities do not
have face-to-face interaction, the emotional bond that members feel with the persons
beyond the computer screen may be even more important, and the emergence of video
technologies is one method for increasing these bonds (Borup et al. 2014
[https://edtechbooks.org/-XQ]).
Communities defined by shared vision or sense of purpose are not as frequently discussed
as boundaries based on relationships, but the ways members of a community think about their
group are important. Rather than feeling like a member of a community—with a sense of
belonging, shared faith, trust, and interdependence—people can define community by
thinking they are a community. They conceptualize the same vision for what the community
is about, share the same mission statements and goals, and believe they are progressing as
a community towards the same end. In short, in terms many researchers use, they have a
shared purpose based on concepts that define the boundaries of the community. Sharing a
purpose is slightly different from the affective concept of sharing faith in the existence of
the community and its ability to meet members’ needs. Community members may
conceptualize a vision for their community and yet not have any faith that the community is
useful (e.g., a member of a math community who hates math). Members may also disagree
on whether the community is capable of reaching the goal even though they may agree on
what the goal is (“my well-intentioned study group is dysfunctional”). Thus conceptual
boundaries of a community of learners are distinct from relational ties; they simply define
ways members perceive the community’s vision. Occasionally the shared conception is the
most salient or distinguishing characteristic of a particular learning community.
Perhaps the most basic way to define the boundaries of a learning community is by what the
members do. For example, a community of practice in a business would include business
participants engaged in that work. This type of definition is often used in education, which
considers students members of communities simply because they are doing the same
assignments: Participants’ associations are merely functional, and like the work of research
teams organized to achieve a particular goal, they hold together as long as the work is held
in common. When the project is completed, these communities often disappear unless ties
related to relationships, conceptions, or physical or virtual presence [access] continue to
bind the members together.
The importance of functional cohesion in a learning community is one reason why freshman
learning communities at universities usually place cohorts of students in the same classes so
they are working on the same projects. Considering work settings, Hakkarainen et al. (2004
[https://edtechbooks.org/-vmc]) argued that the new information age in our society requires
workers to be capable of quickly forming collaborative teams (or networked communities of
expertise) to achieve a particular functional purpose and then be able to disband when the
project is over and form new teams. They argued that these networked communities are
increasingly necessary to accomplish work in the 21st century.
Conclusion
Many scholars and practitioners agree that learning communities “set the ambience for life-
giving and uplifting experiences necessary to advance an individual and a whole society”
(Lenning & Ebbers, 1999 [https://edtechbooks.org/-MYC]). Because learning communities
are so important to student learning and satisfaction, clear definitions that enable sharing of
best practices are essential. By clarifying our understanding and expectations about what
we hope students will be able to do, learn, and become in a learning community, we can
more precisely identify what our ideal learning community would be like and distinguish this
ideal from the less effective/efficient communities existing in everyday life and learning.
In this chapter we have discussed definitions for four potential boundaries of a learning
community. Two of these can be observed externally: access (Who is present physically or
virtually?) and function (Who has been organized specifically to achieve some goal?). Two of
these potential boundaries are internal to the individuals involved and can only be
researched by helping participants describe their feelings and thoughts about the
community: relationships (Who feels connected and accepted?) and vision (Who shares the
same mission or purpose?).
Application Exercises
          Evaluate your current learning community. How can you strengthen your
          personal learning community? Make one commitment to accomplish this
          goal.
          Analyze an online group (Facebook users, Twitter users, NPR readers,
          Pinners on Pinterest, etc.) that you are part of to determine if it would fit
          within the four proposed boundaries of a community. Do you feel like an
          active member of this community? Why or why not?
References
Baker, S., & Pomerantz, N. (2000). Impact of learning communities on retention at a metropolitan university. Journal of College Student Retention, 2(2), 115–126.

Blanchard, A. L., Welbourne, J. L., & Boughton, M. D. (2011). A model of online trust. Information, Communication & Society, 14(1), 76–106.

Booth, S. E. (2012). Cultivating knowledge sharing and trust in online communities for educators. Journal of Educational Computing Research, 47(1), 1–31.

Borup, J., West, R., Thomas, R., & Graham, C. (2014). Examining the impact of video feedback on instructor social presence in blended courses. The International Review of Research in Open and Distributed Learning, 15(3). Retrieved from https://edtechbooks.org/-xR

Busher, H. (2005). The project of the other: Developing inclusive learning communities in schools. Oxford Review of Education, 31(4), 459–477. doi:10.1080/03054980500222221

Cavanagh, M. S., & Garvey, T. (2012). A professional experience learning community for pre-service secondary mathematics teachers. Australian Journal of Teacher Education, 37(12), 57–75.

Chen, Y., Yang, J., Lin, Y., & Huang, J. (2007). Enhancing virtual learning communities by finding quality learning content and trustworthy collaborators. Educational Technology and Society, 10(2), 1465–1471. Retrieved from https://edtechbooks.org/-Ss

Davies, A., Ramsay, J., Lindfield, H., & Couperthwaite, J. (2005). Building learning communities: Foundations for good practice. British Journal of Educational Technology, 36(4), 615–628. doi:10.1111/j.1467-8535.2005.00539.x

Day, G., & Murdoch, J. (1993). Locality and community: Coming to terms with place. The Sociological Review, 41, 82–111.

Hakkarainen, K., Palonen, T., Paavola, S., & Lehtinen, E. (2004). Communities of networked expertise: Professional and educational perspectives. San Diego, CA: Elsevier.

Kay, D., Summers, J., & Svinicki, M. (2011). Conceptualizations of classroom community in higher education: Insights from award winning professors. Journal of Ethnographic & Qualitative Research, 5(4), 230–245.

Kensler, L. A. W., Caskie, G. I. L., Barber, M. E., & White, G. P. (2009). The ecology of democratic learning communities: Faculty trust and continuous learning in public middle schools. Journal of School Leadership, 19, 697–735.

Lenning, O. T., & Ebbers, L. H. (1999). The powerful potential of learning communities: Improving education for the future (ASHE-ERIC Higher Education Report, Vol. 16, No. 6).

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

McMillan, D. W., & Chavis, D. M. (1986). Sense of community: A definition and theory. Journal of Community Psychology, 14(1), 6–23.

Rovai, A. P., Wighting, M. J., & Lucking, R. (2004). The classroom and school community inventory: Development, refinement, and validation of a self-report measure for educational research. Internet & Higher Education, 7(4), 263–280.

Schrum, L., Burbank, M. D., Engle, J., Chambers, J. A., & Glassett, K. F. (2005). Post-secondary educators’ professional development: Investigation of an online approach to enhancing teaching and learning. Internet and Higher Education, 8, 279–289. doi:10.1016/j.iheduc.2005.08.001

Shen, D., Nuankhieo, P., Huang, X., Amelung, C., & Laffey, J. (2008). Using social network analysis to understand sense of community in an online learning environment. Journal of Educational Computing Research, 39(1), 17–36. doi:10.2190/EC.39.1.b

Strike, K. A. (2004). Community, the missing element of school reform: Why schools should be more like congregations than banks. American Journal of Education, 110(3), 215–232. doi:10.1086/383072

Weissman, E., Butcher, K. F., Schneider, E., Teres, J., Collado, H., & Greenberg, D. (2011). Learning communities for students in developmental math: Impact studies at Queensborough and Houston Community Colleges. National Center for Postsecondary Research. Retrieved from https://edtechbooks.org/-xF

Zhao, C.-M., & Kuh, G. D. (2004). Adding value: Learning communities and student engagement. Research in Higher Education, 45(2), 115–138. doi:10.1023/B:RIHE.0000015692.88534.de
Richard E. West

He tweets @richardewest, and his research can be found on Google Scholar and his website: http://richardewest.com.

Gregory S. Williams
                                        14
            Communities of Innovation
Individual, Group, and Organizational Characteristics Leading to
                Greater Potential for Innovation
Richard E. West
Editor’s Note
Video Abstract
Introduction
In 1950, in a memorable presidential address to the American Psychological Association,
Guilford chided his colleagues for the period’s lack of research on creativity, noting that only 0.2% of published articles in Psychological Abstracts had discussed creativity. He then made a prescient prediction about the future development of computers, which he called “thinking machines.”
The time that Guilford envisioned is quickly becoming the present, when the combination of
powerful computers and the ability to network these computers through the Internet has
created a different kind of employment marketplace, one where employees are being
expected to produce innovations, where knowledge is not managed but created (Howkins,
2002; Sawyer, 2006a; Tepper, 2010). As a sign of the times, patents granted in the United
States have risen from about 49,000 in 1963 to over 276,000 in 2012 (U.S. Patent and
Trademark Office, 2012). Patent filings are, of course, not a perfect measure of innovation, but they reflect the current emphasis on innovation in business and industry.
Creativity in Education
Responding to this market need, educational organizations find it increasingly critical to
develop creativity in their students. For example, the Partnership for 21st Century Skills has
designated innovation as one of the skills students need (see https://edtechbooks.org/-nt).
Livingston (2010) argued, “Higher education needs to use its natural resources in ways that
develop content knowledge and skills in a culture infused at new levels by investigation,
cooperation, connection, integration, and synthesis. Creativity is necessary to accomplish
this goal” (p. 59).
How are we doing at teaching this critical capability? Not as well as we perhaps should be.
Berland (2012) surveyed 1,000 adult working college graduates in the United States and
found that 78% felt creativity to be important to their current career, and 82% wished they
had been more exposed to creative thinking in school. In addition, 88% felt creativity should
be integrated into university curricula, with 71% thinking it should be a class in itself.
Particularly interesting is the work done by Kyung Hee Kim, who in 2011 published an
influential article on the “Creativity Crisis” in the prestigious Creativity Research Journal.
Kim reported that results from the Torrance Test of Creative Thinking (TTCT), widely used
to measure creative and gifted abilities in children, had dropped significantly since 1990 on
nearly all of its subscales, which represent the qualities of creative thinking defined by
Torrance in his extensive work on the topic.
of practice (Lave & Wenger, 1991; Wenger, 1998). Since publishing my 2009 paper, I have
been seeking to research and develop this framework. I am still in this process, but the
purpose of this paper is to update the framework with currently expanded knowledge and
experience.
In this paper I will explain what I see as some of the core attributes of COIs at each level,
including what we know from research about each attribute. The following section will
consider characteristics of Communities of Innovation in the categories of general
characteristics influenced by social creativity and learning, characteristics significant on the
level of individual groups, and characteristics necessary on the organizational level.
The term hacker has typically been used to describe “illicit computer intruders” (Jordan & Taylor, 1998, p. 757), but more recently the word has been expanded beyond computer
Computer programmers have responded to this type of deep, intrinsic motivation by developing open projects like Linux, Apache, and Wikipedia and giving them away without charge, motivated not by money but by the challenge and the opportunity to
produce something that improves their lives and society. Even though the motivation is not
financial, people exhibiting the hacker ethic can produce amazingly creative products. As
Raymond (2003) said:
      To do the Unix philosophy right, you have to be loyal to excellence. You have to
      believe that software design is a craft worth all the intelligence and passion you
      can muster. . . . You need to care. You need to play. You need to be willing to
      explore. (p. 27)
One application of hacker motivation to creativity has been involving users in producing innovative consumer products. Jeppesen and Frederiksen (2006) reported that in various
industries producing everything from electronics to computers to chemical
processes/equipment, 11-76% of the innovation in the field came from actual users, not
professionals, and that often products developed by collaborating lead users have been
many times better than products generated in house (Lilien, Morrison, Searls, Sonnack, &
von Hippel, 2003). Many companies have realized the power of hacker motivation and have
tried to foster it with their employees by granting autonomy, resources, and access to
collaborators for employees working on intrinsically motivating projects. Often these
projects become some of the most creative products in the company. For example, Google
has allowed its employees to work one day each week on their own intrinsically motivating
projects, and from this hacker time have come AdSense, Gmail, Google Talk, Google News,
and Google Reader.
Dynamic Expertise
Dynamic expertise, a term coined by Hakkarainen, Palonen, Paavola, & Lehtinen (2004),
contrasts with traditional views of expertise as an accumulation of skills and knowledge in a
particular domain. Dynamic expertise designates the ability to continually learn and surpass
earlier achievements by “living on the edge” (Marianno & West, 2013) of one’s competence,
pushing for new expertise in ever-evolving new ways and domains. Thus expertise is a
dynamic, progressive ability to gain new skills and knowledge. In developing and validating
a survey to measure dynamic expertise in creative groups, Marianno and West (2013) found
three main relevant factors: awareness and understanding of the problems facing the group,
motivation to pursue these challenging problems, and ability to gain new competencies in
the process. In this study, groups in which the individual members exhibited more dynamic
expertise were significantly more innovative than their peers.
Developing and using dynamic expertise requires that members of a community have a
certain amount of entrepreneurship and autonomy. Gagne and Deci (2005) explained
autonomy as acting with choice and purpose and engaging in an activity because one finds it
enjoyable. McLean (2005) explained that freedom and autonomy within an organization will
likely promote intrinsic motivation and, consequently, innovation (see also Oldham &
Cummings, 1996). Similarly, scholars have found that promoting autonomy and self-directed
activity can substantially improve student morale, motivation, learning, and performance
(Gagne & Deci, 2005; Gelderen, 2010; Ryan & Deci, 2000). On the other hand, Amabile
(1996) found that perception of organizational control over its members impedes creativity.
This relationship is especially important when critiquing or evaluating the work within a
COI, as evaluation is critical to improving the product (West, Williams, & Williams, 2013),
but feedback must be given without the perception of limiting autonomy (Egan, 2005).
While members of a COI need to feel autonomy over how they accomplish their work, this
does not mean constraints should not be given or particular tasks assigned. In fact,
constraints are widely recognized for improving creativity to a degree (Dyer, Gregersen, &
Christensen, 2009; Moreau & Dahl, 2005). However, creativity flourishes when COI
members feel they have high autonomy and ownership over the everyday work, ideas, and
manner of discovering how to accomplish their tasks (Amabile, 1998; Amabile, Conti, Coon,
Lazenby, & Herron, 1996; Egan, 2005; Kurtzberg & Amabile, 2001). Supporting autonomy
can lead to the likelihood of group members internalizing and adopting the values and goals
of the group (Gagne & Deci, 2005).
Keith Sawyer, whose graduate adviser was Mihaly Csikszentmihalyi, adapted his mentor’s
conception of flow (Csikszentmihalyi, 1990) to group collaboration. Sawyer (2008) explained
that group flow was more likely to occur based on 10 important elements of effective group
collaboration: a shared goal, close listening, complete concentration, the ability to be in
control (related to what I call autonomy), blended egos, equal participation, familiarity,
communication, effort to move ideas forward (often through improvisation, building on
previous ideas), and risk that comes from the potential for failure. Sawyer (2006b) argued
that when groups achieve flow, innovation is at its peak: “Performers are in interactional
synchrony,” and “each of the group members can even feel as if they are able to anticipate
what their fellow performers will do before they do it” (p. 158).
Research into group flow is still in the early stages, and few use the term besides Sawyer,
but evidence has shown that Sawyer’s theory is solid. For example, Byrne, MacDonald, &
Carlton (2003; see also MacDonald, Byrne, & Carlton, 2006) studied how group flow
impacted creative output in musical compositions of 45 university students who were rated
for their creativity. The authors found a significant correlation between the levels of flow the
student groups experienced and the creativity of their group compositions.
The biggest challenge with group flow is how “fragile” (Armstrong, 2008) it is and how
difficult to foster. It is also “hard to predict in advance” (Sawyer, 2006b, p. 158), which
makes it difficult to research. Of particular interest to me is what happens when group
collaboration moves online. Sawyer (2013) has argued that the Internet cannot support
group flow at all, but more research is needed, including studies into whether group flow
might emerge online but require circumstances entirely different from those Sawyer
articulated for group flow in face-to-face settings.
Idea Prototyping
Design industries have long acknowledged the value of rapidly prototyping group ideas so
that collaboration can continue by improvising (Tripp & Bichelmeyer, 1990) on the design.
This significant application of the design thinking approach to group creativity is growing in
popularity in both industry and education because of its perceived ability to “change how
people learn and solve problems” (Razzouk & Shute, 2012, p. 331). Sutton and Kelley (1997)
noted that IDEO prototypes not only its products but also its spaces, organizational structures, and size, making prototyping a core feature of its successful approach to innovation.
Prototyping benefits collaborative innovation in at least three ways. First, Brown (2008) explained, “[T]he goal of prototyping isn’t to finish. It is to learn about the
strengths and weaknesses of the idea and to identify new directions that further prototypes
might take” (p. 87). Thus group members are able to learn through the process of creation,
which has been shown to be a powerful way to promote constructivist learning (Kafai &
Resnick, 1996).
Second, prototyping can facilitate group reflection by putting a concept into tangible form
for discussion. We have seen this in research into collaborative innovation at Brigham Young
University’s Center for Animation, as much of the innovation in this highly successful studio
emerges from group criticisms of designed prototypes in biweekly student-run meetings
(see West, Williams & Williams, 2013). Third, Sawyer (2003b) has argued that improvisation
is key to collaborative innovation, and prototyping can facilitate improvisation by providing
an initial concept to begin experimentation.
Individuals with diverse perspectives in a group must freely share these diverse viewpoints
and ideas. Diversity can be inhibited by social constraints like hierarchies of power or even
personal constraints like shyness; efforts must be made to bring out the diversity of the
group. For example, research has found that traditional brainstorming does not produce
better creativity (Paulus et al., 1993; Taylor, Berry, & Block, 1958) because groupthink can
emerge if a few individuals share opinions and the rest of the group is hesitant to challenge
or offer their own. More effective are methods such as the nominal group technique (Mullen et al., 1991; Putman & Paulus, 2009), which ask individuals first to do the hard work of
developing their ideas and positions individually or in smaller teams before sharing them in
an open, but critical and evaluative, collaboration where the ideas can be merged and
improvised upon.
An important quality of innovative communities is the ability of members to give and receive
criticism in productive ways. This capacity is due in large measure to organizational-level
efforts to support exploration and allow for failure with recoverability, as long as quality
reflection enables learning from the failure, thus making it actually “productive” (Kapur &
Rummel, 2012). When an organization creates a culture in which failure is no longer devastating to the team, groups have a greater opportunity to develop skills in
critique, reflection, evaluation, and team learning.
One example of the role of critical evaluation and reflection in collaborative innovation was
the Center for Animation that we studied (West, Williams & Williams, 2013). In that setting,
evaluation was a top priority, and the design community met twice a week over a year and a
half to showcase and critique weekly progress on their animated short. We found that the
qualities that made evaluation successful in this community were the culture of high
expectations, collaboration, and evaluation; the ability of the instructors to unite the
students, teachers, and leaders as shared stakeholders in the success of the project; the
important criteria for evaluating progress; and the frequent opportunities to question and
In an earlier study (West & Hannafin, 2011), I learned that often the act of critiquing
another’s work not only helps the person receiving the evaluation, but also the one giving it.
One student in that study explained how she and her peers learned through the process of
critique, quoting Nelson & Stolterman (2003): “[I]t is also possible to develop design skills
by critiquing existing designs” (p. 217).
Common Vision
Essential to the ability of a group to collaborate and critique their progress effectively is
that they have a common vision of what they are trying to do. This does not mean they know
exactly what the design will look like, but only what they hope the design will accomplish.
Anderson and West (1998) explained that a group’s shared vision is more effective when it is
clear and understandable, is important to and widely shared by all members of the group,
and is attainable so it is not demotivating. The importance of a common vision to a
productive team climate has been shown in both business (Anderson & West, 1998) and
education (West, Williams, & Williams, 2013). Wang & Rafiq (2009) explained the tension in
organizational learning between paradigms of exploration and exploitation, and argued that
organizational diversity and shared vision are vital to balancing these competing views of
group productivity.
Many scholars in organizational studies argue that a flexible organizational structure can
promote innovation in a community. For example, Volberda (1996) argued, “Bureaucratic
vertical forms severely hamper the ability to respond to accelerating competition. Flexible
forms, in contrast, can respond to a wide variety of changes in the competitive environment
in an appropriate and timely way” (p. 359). A classic example is the organizational structure
of IDEO. In a 2001 interview with Businessweek, Beth Strong, IDEO’s Director of
Recruiting, explained that IDEO’s organizational structure is “very flat” where “hot teams”
can form on their own and work as a studio for a period of time to complete a project that
the team members are all excited about. There is no expectation of an entire career within
one studio, and movement between studios is encouraged, with leadership within the
studios often being organic—emerging from within the group.
This type of organizational structure is radically different from that of many communities of
practice. Some research has argued that the type of organizational structure is less
important than expected, and that flat organizations can struggle with inefficiency due to
interpersonal conflicts and inadequate effort coordination (Carzo & Yanouzas, 1969).
Possibly what matters more than tall vs. flat organizational structure are characteristics of
that organization, such as how quickly innovative ideas can be approved for prototyping,
how much autonomy individuals and groups have for innovating, and how flexible the
organization is in reorganizing teams according to emergent needs and situations.
Pink (2011) popularized the idea that higher-order thinking tasks, such as creativity, are
best motivated by organizations that promote mastery, purpose, and autonomy in
employees. His ideas are based in large part on the work of Teresa Amabile of Harvard, who
has found in her research that “when it comes to granting freedom, the key to creativity is
giving people autonomy concerning the means . . . but not necessarily the ends” of a task
(1998, p. 81) or, in other words, “choice in how to go about accomplishing the tasks that
they are given” (Amabile, Conti, Coon, Lazenby, & Herron, 1996; see also Kurtzberg &
Amabile, 2001). This finding holds true not only in business settings but also in education (Gelderen, 2010) and research, where Parker and Hackett (2012) explained that research
groups benefit from providing younger investigators autonomy, allowing them to be a group
that is “getting-big-while-remaining-small” (p. 38): in other words, maintaining their
entrepreneurial creativity.
An organization’s focus on individuals and groups working towards mastery and purpose in
their work can also increase motivation, often more effectively than extrinsic rewards, which
have been shown in many research studies to diminish creativity (Hennessey, 1989) and
damage intrinsic motivation (Deci, Koestner, & Ryan, 1999). For this reason many innovative design companies encourage employees to pursue lifelong learning, even in areas not directly related to their work (consider, for example, Pixar University), and to work on projects that give them a sense of purpose, so they feel they are accomplishing a greater good (see the earlier discussion of the hacker ethic).
The glue that unifies any community, particularly one with the differences in characteristics
and structures of a community of innovation, is a strong sense of community and
psychological safety among the members. Rogers (1954), well known for articulating the
importance of psychological safety for creativity, explained that psychological safety
depends on three separate processes: (1) accepting the individual as of unconditional worth,
(2) providing a climate in which external evaluation is absent, and (3) empathically
understanding the individual (referred to by Sawyer [2008] as close listening). Since Rogers’
work, many scholars have found evidence for the importance of a strong sense of community
in education units (Rovai, 2002; West & Hannafin, 2011), work teams (Barczak, Lassk, &
Mulki, 2010), and whole organizations (Baer & Frese, 2003).
Teaching in a way that builds communities of innovation is not easy, but it is increasingly
important. Like many higher order skills, collaborative innovation skills are best taught
through modeling, nurturing, and supporting students’ growth in ways specific to every
context and group of individuals. Still, the community of innovation characteristics outlined in this paper suggest some strategies.
First, our research in online learning needs to shift from a predominant focus on delivering content and testing information recall (I’m looking at you, MOOCs) toward recapturing the powerful improvisational and impromptu conversations and interactions that lead to group innovation. Tools like Mural.ly (https://mural.ly/), Mendeley (http://mendeley.com; see Zaugg, West, Tateishi, & Randall, 2011), and Chatter (https://edtechbooks.org/-Dr) are examples of the kinds of collaboration tools we need: tools that let people and ideas “bump into each other” in unforeseen ways that spark innovation.
Second, we need to foster idea generation in effective ways by encouraging individual work
and contribution first and then group evaluation and improvisation/prototyping afterward.
We will have more group genius (Sawyer, 2008) instead of groupthink when we use
strategies that utilize the diversity within a group and encourage open and critical dialogue
in an atmosphere of psychological safety.
Third, one of our primary goals in education should be to encourage group flow, which is
where the magic of collaborative innovation happens. This means focusing less on seat time
and more on project goals. Studio-based approaches to teaching (Chen & You, 2010; Clinton
& Rieber, 2010; Docherty, Sutton, Brereton, & Kaplan, 2001) work well because they tend
to de-emphasize time on task in favor of work completed and creativity developed. Nothing
disrupts a group’s flow more than having the bell ring for the end of class. Instead, we
should encourage students to work together in ways and on projects that are most likely to
lead to flow, and when they are doing so effectively, we need to give them the space and
time to keep it going!
First, we need more concrete definitions and methods for measuring/observing the COI
principles outlined in this paper, as well as any others that may also be important to
collaborative innovation, using as many different research methods as possible. Although
traditional creativity scholars have largely rejected qualitative methods, too much is still
unknown about how to foster collaborative innovation for us not to use every potentially
useful research method, including quantitative and qualitative approaches, conversation
analysis, and social network analysis.
Second, education is rapidly changing and transitioning towards online and blended
environments. While this transition is clearly important and can provide many benefits, we
need to be careful that we do not focus on what is easier to teach online (information)
instead of what is more difficult but also important (collaboration, creativity, and critical
thinking). Instructional designers and researchers need to take the lead in setting the
agenda for online education in ways that theory suggests will lead to better learning.
Third, we need to explore how to teach collaborative innovation skills at various educational
levels. Most of the current research focuses on higher education, for example, and tight
national standards for grade-school education often make it harder to justify spending time
on skills such as creativity that do not readily show up on standardized tests. Still, there is
room in national standards for creativity, particularly in the upsurge of interest in teaching
engineering practices to children. More research is needed on how to infuse group
creativity into this type of curriculum effectively.
Unfortunately, education administrators’ and leaders’ talk about teaching creativity is often
little more than “rhetorical flourishes in policy documents and/or relegated to the
borderlands of the visual and performing arts” (McWilliam & Dawson, 2008, p. 634),
perhaps because this capability is among the most “elusive” (p. 633) of skills. However, the
scholar considered by many to be the father of creativity, E. Paul Torrance, encouraged
creative persons to seek great teachers and mentors in their quest to develop their
creativity (Torrance, 2002). As educators and instructional designers, we are responsible
for being those teachers and mentors as we design the kinds of learning environments that best
foster creativity and innovation, especially in collaborative communities.
Application Exercises

1. Berland (2012) found that 71% of students surveyed felt that universities should
   offer a class on creativity. Using some of the guidelines and information from this
   chapter, create an outline of what you think a class on creativity would look like.
2. Consider an organization that you are a part of. In what ways could you integrate
   principles of communities of innovation?
3. What is one thing you would do to create group flow in an online learning
   environment?
References
Amabile, T. M. (1996). Creativity in context. Boulder, CO: Westview.
Amabile, T. M. (1998). How to kill creativity. Harvard Business Review, 76(5), 77–87.
Amabile, T. M., Conti, R., Coon, H., Lazenby, J., & Herron, M. (1996). Assessing the work
environment for creativity. Academy of Management Journal, 39(5), 1154–1184.
Anderson, N. R., & West, M. A. (1998). Measuring climate for work group innovation:
Development and validation of the team climate inventory. Journal of Organizational
Behavior, 19(3), 235–258.
Armstrong, A. (2008). The fragility of group flow: The experiences of two small groups in a
middle school mathematics classroom. The Journal of Mathematical Behavior, 27(2),
101–115.
Baer, M., & Frese, M. (2003). Innovation is not enough: Climates for initiative and
psychological safety, process innovations, and firm performance. Journal of Organizational
Behavior, 24(1), 45–68. doi:10.1002/job.179
Barczak, G., Lassk, F., & Mulki, J. (2010). Antecedents of team creativity: An examination of
team emotional intelligence, team trust and collaborative culture. Creativity and Innovation
Management, 19(4), 332–345. doi:10.1111/j.1467-8691.2010.00574.x
Black, A. E., & Deci, E. L. (2000). The effects of instructors’ autonomy support and students’
autonomous motivation on learning organic chemistry: A self-determination theory
perspective. Science Education, 84(6), 740–756.
Byrne, C., MacDonald, R., & Carlton, L. (2003). Assessing creativity in musical
compositions: Flow as an assessment tool. British Journal of Music Education, 20(3),
277–290. Retrieved from http://www.journals.cambridge.org/abstract_S0265051703005448
Carzo, R., & Yanouzas, J. N. (1969). Effects of flat and tall organization structure.
Administrative Science Quarterly, 14(2), 178–191. doi:10.2307/2391096
Csikszentmihályi, M. (1990). Flow: The psychology of optimal experience. New York, NY:
HarperCollins.
Chance, T. (2005). The hacker ethic and meaningful work. Retrieved from
http://www.acrewoods.net/free-culture/the-hacker-ethic-and-meaningful-work
Chen, W., & You, M. (2010). Internet mediated industrial design studio course: The
students’ responses. International Journal of Technology and Design Education, 20(2), 151–174.
doi:10.1007/s10798-008-9068-2
Clinton, G., & Rieber, L. P. (2010). The studio experience at the University of Georgia: An
example of constructionist learning for adults. Educational Technology Research and
Development, 58(6), 755–780. doi:10.1007/s11423-010-9165-2
Deci, E. L., Koestner, R., & Ryan, R. M. (1999). A meta-analytic review of experiments
examining the effects of extrinsic rewards on intrinsic motivation. Psychological Bulletin,
125(6), 627–668.
Docherty, M., Sutton, P., Brereton, M., & Kaplan, S. (2001). An innovative design and studio-
based CS degree. Proceedings of the thirty-second SIGCSE technical symposium on
Computer Science Education—SIGCSE ’01, 33(1), 233–237. doi:10.1145/364447.364591
Dyer, J. H., Gregersen, H. B., & Christensen, C. M. (2009). The innovator’s DNA. Harvard
Business Review, 87(12), 61–67.
Gagne, M., & Deci, E. L. (2005). Self-determination theory and work motivation. Journal of
Organizational Behavior, 26(4), 331–362. doi:10.1002/job.322
Granovetter, M. S. (1973). The strength of weak ties. American Journal of Sociology, 78,
1360–1380.
Hakkarainen, K., Palonen, T., Paavola, S., & Lehtinen, E. (2004). Communities of networked
expertise: Professional and educational perspectives. Amsterdam, NL: Elsevier.
Himanen, P. (2001). The hacker ethic: A radical approach to the philosophy of business.
New York, NY: Random House.
Howkins, J. (2002). The creative economy: How people make money from ideas. London,
UK: Penguin UK.
Jordan, T., & Taylor, P. (1998). A sociology of hackers. Sociological Review, 46(4), 757–780.
Jeppesen, L. B., & Frederiksen, L. (2006). Why do users contribute to firm-hosted user
communities? The case of computer-controlled music instruments. Organization Science,
17(1), 45–63.
Kafai, Y. B., & Resnick, M. (Eds.). (1996). Constructionism in practice: Designing, thinking,
and learning in a digital world. Mahwah, NJ: Lawrence Erlbaum Associates.
Kapur, M., & Rummel, N. (2012). Productive failure in learning from generation and
invention activities. Instructional Science, 40(4), 645–650.
Kim, K. H. (2011). The creativity crisis: The decrease in creative thinking scores on the
Torrance Tests of Creative Thinking. Creativity Research Journal, 23(4), 285–295.
Kurtzberg, T. R., & Amabile, T. M. (2001). From Guilford to creative synergy: Opening the
black box of team-level creativity. Creativity Research Journal, 13(3–4), 285–295.
doi:10.1207/S15326934CRJ1334_06
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation.
Cambridge, UK: Cambridge University Press.
Lilien, G. L., Morrison, P. D., Searls, K., Sonnack, M., & von Hippel, E. (2003). Performance
assessment of the lead user generation process for new product development. Management
Science, 48(8), 1042–1060.
Livingston, L. (2010). Teaching creativity in higher education. Arts Education Policy Review,
111(2), 59–62.
MacDonald, R., Byrne, C., & Carlton, L. (2006). Creativity and flow in musical composition:
An empirical investigation. Psychology of Music, 34(3), 292–306.
doi:10.1177/030573560606483
Marianno, B., & West, R. E. (2014). Living on the edge: Expanding individual competencies
in innovative student teams by developing dynamic expertise. Manuscript submitted for
publication.
McWilliam, E., & Dawson, S. (2008). Teaching for creativity: Towards sustainable and
replicable pedagogical practice. Higher Education, 56(6), 633–643.
doi:10.1007/s10734-008-9115-7
Moreau, C. P., & Dahl, D. W. (2005). Designing the solution: The impact of constraints on
consumers’ creativity. Journal of Consumer Research, 32(1), 13–22.
Mullen, B., Johnson, C., & Salas, E. (1991). Productivity loss in brainstorming groups: A
meta-analytic integration. Basic and Applied Social Psychology, 12(1), 3–23.
Nelson, H. G., & Stolterman, E. (2003). The design way. Englewood Cliffs, NJ: Educational
Technology Publications.
Oldham, G. R., & Cummings, A. (1996). Employee creativity: Personal and contextual factors
at work. Academy of Management Journal, 39(3), 607–634.
Parker, J. N., & Hackett, E. J. (2012). Hot spots and hot moments in scientific collaborations
and social movements. American Sociological Review, 77(1), 21–44.
doi:10.1177/0003122411433763
Paulus, P. B., Dzindolet, M. T., Poletes, G., & Camacho, L. M. (1993). Perception of
performance in group brainstorming: The illusion of group productivity. Personality and
Social Psychology Bulletin, 19(1), 78–89.
Pink, D. H. (2011). Drive: The surprising truth about what motivates us. New York, NY:
Riverhead Books.
Putman, V. L., & Paulus, P. B. (2009). Brainstorming, brainstorming rules and decision
making. The Journal of Creative Behavior, 43(1), 29–40.
Razzouk, R., & Shute, V. (2012). What is design thinking and why is it important? Review of
Educational Research. doi:10.3102/0034654312457429
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic
motivation, social development, and well-being. American Psychologist, 55(1), 68–78.
doi:10.1037/0003-066X.55.1.68
Sawyer, R. K. (2006a). Educating for innovation. Thinking Skills and Creativity, 1(1), 41–48.
Sawyer, R. K. (2008). Group genius: The creative power of collaboration. New York, NY:
Basic Books.
Sawyer, R. K. (2013). Telecommuting kills creativity: What the research says about Yahoo’s
new work policy. The Blog, Huffington Post. Retrieved from https://edtechbooks.org/-yv
Sutton, R. I., & Kelley, T. A. (1997). Creativity doesn’t require isolation: Why product
designers bring visitors “backstage.” California Management Review, 40(1), 75–92.
Taylor, D. W., Berry, P. C., & Block, C. H. (1958). Does group participation when using
brainstorming facilitate or inhibit creative thinking? Administrative Science Quarterly, 3(1),
23–47.
Tepper, S. J. (2002). Creative assets and the changing economy. The Journal of Arts
Management, Law, and Society, 32(2), 159–168.
Wang, C. L., & Rafiq, M. (2009). Organizational diversity and shared vision: Resolving the
paradox of exploratory and exploitative learning. European Journal of Innovation
Management, 12(1), 86–101.
West, R. E., Williams, G. S., & Williams, D. D. (2013). Improving problem-based learning in
creative communities through effective group evaluation. Interdisciplinary Journal of
Problem-based Learning, 7(2). Retrieved from https://edtechbooks.org/-MT
Zaugg, H., West, R. E., Tateishi, I., & Randall, D. L. (2011). Mendeley: Creating communities
of scholarly inquiry through research collaboration. TechTrends, 55(1), 32–36.
* I have previously argued for the importance of critique, but this critique must be
directed at the project, not at the individual.
                           Richard E. West
He tweets @richardewest, and his research can be found on Google Scholar and
his website: http://richardewest.com.
                                           15
Introduction
Motivation has been defined as a desire or disposition to engage and persist in a task
(Schunk, Pintrich, & Meece, 2014). When a student wants to read a history book on the Civil
War, we can say that he or she is motivated to learn about American history. The student
may learn, however, of a TV program about his or her favorite singer and decide not to engage in
reading the history book on this particular day. Motivation thus refers to a state of being
moved to do something, a movement that drives a person’s behavior. Students without
motivation feel no impetus or inspiration to learn a new behavior and will not engage in any
learning activities.
Educational researchers have long recognized the role of motivation in learning and have
studied motivation from various perspectives. Their efforts have produced a rich foundation
of motivation theories. Early motivation theories reflected the traditional behaviorism
approach, an approach that considered the basis of motivation to be rewards and
punishments. Other theories looked at drives and needs. Over the last 30 years, however,
researchers have studied motivation primarily from a social cognitive approach. This
approach focuses on individuals’ beliefs and contextual factors that influence motivation.
This chapter provides a brief overview of the major social-cognitive theories of motivation
and discusses how the theories have informed the field of instructional design technology.
The chapter concludes by introducing several technology examples designed to enhance
student motivation.
Theories of Motivation
Expectancy-value Theory
Expectancy-value theory suggests that the two most immediate predictors of achievement
behaviors are expectancies for success and task value beliefs (Wigfield & Eccles, 2000).
Expectancies for success refer to students’ beliefs of whether they will do well on an
upcoming task (Wigfield, 1992). The more students expect to succeed at a task, the more
motivated they are to engage with it. Such beliefs are closely related to but conceptually
distinguished from ability beliefs. Ability beliefs are defined as students’ evaluations of their
current competence at a given task. Ability beliefs are concerned with present ability
whereas expectancies for success are concerned with future potential.
Task value answers the question, “Why should I do this task?” There are four possible
answers to the question: intrinsic value, attainment value, utility value, and cost (Wigfield &
Eccles, 1992). Intrinsic value is the pure enjoyment a student feels from performing a task.
When students are intrinsically interested in a task, they are willing to become involved in
it. Attainment value refers to the importance of doing well on a task. Tasks are perceived
as important when they reflect important aspects of one’s self. Utility value is
the perception that a task will be useful for meeting future goals, for instance, taking a
Chinese class to get a job in China. The last component of task value, cost, refers to what an
individual has to give up to engage in a task or the effort needed to accomplish the task. If
the cost is too high, students will be less likely to engage in a given task. For instance,
students may decide not to take an extra course when they need to reduce the hours of their
part-time job.
Numerous studies have shown that students’ expectancies for success and subjective task
values positively influenced achievement behaviors and outcomes (Dennissen, Zarret, &
Eccles, 2007; Durik, Shechter, Noh, Rozek, & Harackiewicz, 2015; Wigfield & Eccles, 2000).
For example, Bong (2001) reported that college students’ perceived competence was a
significant predictor of their performance. Also, students’ perceived utility predicted future
enrollment intentions. These relations have also been found in online learning
environments. Joo, Lim, and Kim (2013) reported that perceived competence and task value
of students enrolled in an online university significantly predicted learner satisfaction,
persistence, and achievement.
Self-efficacy Theory
(Wigfield & Eccles, 2000). For example, self-efficacy would not merely be a self-judgment of
being good at mathematics but rather feeling competent at correctly subtracting fractions.
Despite such conceptual differences, self-efficacy and expectancies for success are often
used interchangeably. Bandura (1997) also noted that self-efficacy is different from self-
confidence. Self-confidence is a belief about a person’s general capability that is not tied
to a specific subject. Despite displaying high self-confidence, a person can still fail to
accomplish a specific task.
Goal Orientation Theory
Goal setting is a key motivational process (Locke & Latham, 1984). Goals are the outcomes
that a person is trying to accomplish. People engage in activities that are believed to lead to
goal attainment. As learners pursue multiple goals such as academic goals and social goals,
goal choice and the level at which learners commit to attaining the goals influence their
motivation to learn (Locke & Latham, 2006; Wentzel, 2000).
Besides goal content (i.e., what a person wants to achieve), the reason that a person tries to
achieve a certain goal also has a significant influence on learning and performance. Goal
orientations refer to the reasons or purposes for engaging in learning activities and explain
individuals’ different ways of approaching and responding to achievement situations (Ames
& Archer, 1988; Meece, Anderman, & Anderman, 2006). The two most basic goal
orientations are mastery and performance goals (Ames & Archer, 1988). Different
researchers refer to these goals with the following terms: learning and performance goals
(Elliot & Dweck, 1988), task-involved and ego-involved goals (Nicholls, 1984), and task-
focused and ability-focused goals (Maehr & Midgley, 1991). A mastery goal orientation is
defined as a focus on mastering new skills, trying to gain increased understanding, and
improving competence (Ames & Archer, 1988). Students adopting mastery goals define
success in terms of improvement and learning. In contrast, a performance goal orientation
focuses on demonstrating competence and doing better than others, for example, by striving
to outperform peers, using social comparative standards to judge one’s abilities, and
seeking favorable judgments from others (Dweck & Leggett, 1988).
In addition to the basic distinction between mastery and performance goals, performance
goal orientations have been further differentiated into performance-approach and
performance-avoidance goals (Elliot & Church, 1997; Elliot & Harackiewicz, 1996).
Performance-approach goals represent individuals motivated to outperform others and
demonstrate their superiority, whereas a performance-avoidance goal orientation refers to
those who are motivated to avoid negative judgments and appearing inferior to others.
Incorporating the same approach and avoidance distinction, some researchers have further
distinguished mastery-approach and mastery-avoidance goals (Elliot & McGregor, 2001).
Mastery-approach goals are related to attempts to improve knowledge, skills, and learning.
In contrast, mastery-avoidance goals represent a focus on avoiding misunderstanding or the
failure to master a task. For instance, athletes who are concerned about falling short of
their past performances reflect a mastery-avoidance goal. Despite confirmatory factor
analyses supporting the 2 × 2 goal framework (Elliot & McGregor, 2001; see Table 1), the
mastery-avoidance construct remains controversial and is in fact the least accepted
construct in the field.
Studies typically report that mastery-approach goals are associated with positive
achievement outcomes such as high levels of effort, interest in the task, and use of deep
learning strategies (e.g., Greene, Miller, Crowson, Duke, & Akey, 2004; Harackiewicz,
Barron, Pintrich, Elliot, & Thrash, 2002; Wolters, 2004). On the other hand, research on
performance-avoidance goals has consistently reported that these goals induced detrimental
effects, such as poor persistence, high anxiety, use of superficial strategies, and low
achievement (Linnenbrink, 2005; Urdan, 2004; Wolters, 2003, 2004). With regard to
performance-approach goals, the data have yielded a mix of outcomes. Some studies have
reported modest positive relations between performance-approach goals and achievement
(Linnenbrink-Garcia, Tyson, & Patall, 2008). Others have found maladaptive outcomes such
as poor strategy use and test anxiety (Keys, Conley, Duncan, & Domina, 2012; Elliot &
McGregor, 2001; Middleton & Midgley, 1997). Taken together, these findings suggest that
students who adopt performance-approach goals demonstrate high levels of achievement
but experience negative emotionality such as test anxiety. Mastery-avoidance goals are the
least studied goal orientation thus far. However, some studies have found mastery-
avoidance to be a positive predictor of anxiety and a negative predictor of performance
(Howell & Watson, 2007; Hulleman, Schrager, Bodmann, & Harackiewicz, 2010).
Attribution Theory
Attribution theory considers the source of people’s motivation to be their perception of why
they succeeded or failed. The theory assumes that people try to understand the causal
determinants of their own successes and failures (Weiner, 1986). For example, people may
attribute their success (or failure) to ability, effort, luck, task difficulty, mood, fatigue, and
so on. These perceived causes of outcomes are called attributions (Weiner, 1986).
Attributions may or may not be actual causes, and regardless of actual causes of the event,
the perceived causes are what drive individuals’ motivation and behaviors.
According to Weiner (2010), attributed causes for success and failure can be classified along
three dimensions: locus, stability, and controllability. The locus dimension concerns the
location of the cause, or whether a cause is within or outside the individual. Effort is
internal to the learner, for example, whereas luck is external. The stability dimension refers
to whether or not the cause is constant. Effort and luck are unstable because they can vary
across situations, whereas ability is regarded as relatively stable. Lastly, the controllability
dimension concerns how much control an individual has over a cause. Learners can control
effort but not luck or task difficulty.
The conceptual classification of causes for success and failure based on the three
dimensions is central to the attribution theory of motivation because each dimension is
related to a set of motivational, affective, and behavioral consequences. Locus of causality,
for example, influences learners’ self-esteem and esteem-related emotions (Weiner, 1986).
When a successful outcome is attributed to internal causes (e.g., ability, effort) and not
external causes (e.g., luck), the students are more likely to take pride in the success and
their self-esteem tends to be heightened. On the other hand, failure attributed to internal
causes usually results in feelings of shame or guilt and a lowering of self-esteem.
The stability dimension influences individuals’ expectancy for future success (Weiner, 1986).
If success is attributed to a stable cause, one will expect to have the same outcome in the
future. Failure attributed to a stable cause (e.g., low ability) will lower one’s expectancy for
future success unless he or she believes the ability can and will increase. Attribution for
failure to an unstable cause (e.g., “I did not try hard enough”) allows students to expect the
outcome could change—as long as they put forth enough effort, they could succeed next
time.
The controllability dimension is also related to self-directed emotions (Weiner, 1986). When
failure is attributed to a controllable cause (e.g., effort), one is likely to experience guilt and
the desire to alter the situation. One will experience a feeling of shame or humiliation when
failure is attributed to causes that are internal and uncontrollable (e.g., low ability). When
attribution for failure is made to the causes that are external and uncontrollable, one is
likely to feel helpless and depressed because he or she believes that nothing can change the
situation. Thus, failure attribution to uncontrollable causes tends to decrease motivation and
engagement.
Self-determination Theory
The two basic types of motivation are intrinsic motivation and extrinsic motivation (Ryan &
Deci, 2000). Intrinsic motivation refers to a disposition to engage in a task for one’s inner
pleasure. An example of intrinsic motivation is a student reading a history textbook for fun.
It is human nature for people to engage in activities that they are intrinsically interested in.
Intrinsic motivation often leads to high levels of engagement and performance (Deci &
Ryan, 2000).
According to the theory, intrinsic motivation emerges spontaneously from satisfying the
basic psychological needs of autonomy, competence, and relatedness (Deci & Ryan, 1985).
Autonomy is the psychological need to experience one’s behaviors as volitional and self-
endorsed. It is closely related to a feeling of freedom to determine one’s own behaviors. For
example, choice over one’s actions can satisfy the need for autonomy; a feeling of autonomy
can be undermined, however, by external rewards and threats (Deci & Ryan, 2000).
Competence is the psychological need to feel efficacious in one’s pursuits of goals. A feeling
of competence is facilitated by optimal challenges and positive feedback (Ryan & Deci,
2000). Relatedness refers to the inherent desire to experience a feeling of being connected
to others. The need for relatedness is satisfied by feeling respected and cared for.
Although it is clear that intrinsic motivation promotes learning, most learning activities are
not intrinsically interesting to students. Students are often motivated to engage in an
activity because it is instrumental to some outcome separate from the activity itself, which
indicates extrinsic motivation. An example of extrinsic motivation is a student who reads a
history book for the exam in order to get good grades. In general, it is understood that
because an action enacted by extrinsic motivation is controlled by an external factor, it
leads to less productive learning behaviors and low-quality engagement compared to
learning behaviors that ensue from intrinsic motivation. However, self-determination theory
asserts that extrinsic motivation is a differentiated construct. Extrinsic motivation can
represent inner sources of an action and result in high-quality learning behaviors. The
theory proposes four types of extrinsic motivation—external, introjected, identified, and
integrated. These differ according to the degree to which the motivation is self-determined
or autonomous (Ryan & Deci, 2000). The more autonomous a motivation is, the higher
quality of engagement students demonstrate.
External motivation, located at the far left of the extrinsic motivation continuum in Figure 1,
is characterized by behaviors enacted to achieve a reward or avoid a punishment. An
example of external motivation is a student who skims a history book before an exam only to
get good grades. Introjected motivation refers to behaviors performed to maintain a feeling
of self-worth or to avoid a feeling of guilt. This type of motivation is still less autonomous
because the behaviors are associated with an external locus of causality (e.g., pressure and
obligation). On the other hand, identified motivation represents an autonomous type of
extrinsic motivation. This type of motivation is signified when an individual perceives the
value of an activity and considers it to be personally relevant. Finally, the most autonomous,
self-determined form of extrinsic motivation is integrated motivation, which occurs when the
identified value of an activity is fully integrated with a part of the self. Integrated regulation
is similar to intrinsic motivation in terms of its degree of self-determination, though the two
motivational constructs conceptually differ in their source of motivation. Integrated
regulation is based on the internalized importance of the activity, whereas intrinsic
motivation is based on inherent interest in the activity.
The theory explains how to motivate students to carry out learning tasks that
are not inherently interesting. The theory specifies three psychological needs—autonomy,
competence, and relatedness—as the basis for sustaining intrinsic motivation and more self-
determined extrinsic motivation. To the extent that students internalize and integrate
external regulations and values, they experience greater autonomy and demonstrate high-
quality engagement in learning activities.
The most well-known antecedent of motivation is probably interest. We often hear students
say that they do not learn because classes are boring and they are not interested in the
topic. While in everyday language we generally equate interest with a feeling of enjoyment,
researchers have differentiated interest into two types—individual (personal) and
situational. Individual interest is a relatively enduring and internally driven disposition of
the person that involves enjoyment and willingness to reengage with a certain object over
time (Hidi & Renninger, 2006; Krapp, 2005; Schiefele, 1991). Schiefele (2001)
conceptualized individual interest as including both positive feelings (e.g., enjoyment) and
the value-related belief that the object is personally important. Situational interest, on the
other hand, refers to a temporary psychological state aroused by contextual features in the
learning situation (Hidi & Renninger, 2006; Schiefele, 2009). When a student is lured into reading a news article by a catchy title, his or her interest is triggered by environmental stimuli.
Individual interest can also be supported by a particular situation, but it continues to be
present without the situational cues.
Hidi and Renninger (2006) proposed a four-phase model of interest development describing
how interest develops from transient situational interest into stable individual interest. In
the first phase, situational interest is sparked by environmental features such as novel,
incongruous, or surprising information, which is called triggered situational interest.
Triggered situational interest provokes attention and arousal only in the short term. The
second phase is referred to as maintained situational interest, which involves focused
attention and persistence over a longer period of time. Situational interest is sustained when a person finds meaning in a task or a personal connection to it. Only
maintained situational interest can develop into long-term individual interest. The third
phase of interest development is called emerging individual interest, marking a transition to
individual interest. This phase is characterized by an individual’s tendency to reengage with
tasks and to generate his or her own curiosity questions without much external support as
well as the individual’s positive feelings. The last phase is referred to as well-developed
individual interest, a person’s deep-seated interest that involves a tendency to engage, with
positive feelings, with a topic over an extended period of time. Although the four-phase
model of interest development has been generally accepted, the model is underspecified and
has received limited empirical support. For example, the model does not provide a
psychological mechanism explaining how the transition to the next phase occurs. More
research is needed to achieve a better understanding of interest development.
Much research on interest has focused on examining the relationship between interest and
text-based learning. Studies that have investigated the effects of situational interest have
reported a moderate correlation between text learning and text-based features that
facilitate situational interest; such a relation is independent of other text-based factors such
as text length, nature of text, readability, and so on (Schiefele, 1996). Research on the
effects of individual interest yielded results similar to those found with situational interest.
Schiefele (1996) reported in his meta-analysis an average correlation of .27 between
individual interest (i.e., topic interest) and text-based learning. The effects of individual
interest on text learning were not influenced by other factors (e.g., text length, reading
ability) but were less prominent than the effects of prior knowledge on learning (Schiefele,
2009).
These various motivation theories show that motivation is complex and multidimensional.
Also, motivational states can be influenced by various factors in an environment. This means
that students’ lack of motivation can stem from many different sources. As such, in order to design an intervention to promote student motivation, it is essential to identify the sources of low motivation in a given situation. Designing strategies to influence people’s
motivation is a problem-solving process. Like the traditional instructional design process,
motivational design includes a systematic process of identifying goals (or motivational
problems), developing strategies for goal attainment (of addressing motivational problems),
and evaluating the outcome of the strategies. Within the instructional design and technology
community, the most well-known motivational design model is John M. Keller’s (1987) ARCS
model.
Keller distilled the shared attributes of various motivational concepts into four categories of learner motivation, which form the acronym ARCS: attention, relevance, confidence, and satisfaction (Keller, 2010). The ARCS model describes strategies for stimulating and
sustaining motivation in each of the four categories as well as a systematic process of
motivational design.
The first category, attention, is related to stimulating and maintaining learners’ interests.
Learners’ attention is required before any learning can take place. This attention should
also be sustained in order to keep learners focused and engaged. Keller (2010) describes
three categories of attention-getting strategies: perceptual arousal, inquiry arousal, and
variability. Perceptual arousal refers to capturing interest by arousing learners’ senses and
emotions. This construct is conceptually similar to triggered situational interest in Hidi and
Renninger’s (2006) model of interest development. Likewise, perceptual arousal is usually
transitory. One of the most common ways to provoke perceptual arousal is making an
unexpected change in the environment. Example tactics include a change in light, a sudden
pause, and presenting a video after text-based information in an online learning environment.
The second category, relevance, refers to making the learning experience personally
relevant or meaningful. According to goal theory, students engage in learning activities that help them attain their goals (Locke & Latham, 1984). Also, as described in expectancy-value theory and self-determination theory, the perceived value of a task is a critical
antecedent of motivation (Deci & Ryan, 2000; Wigfield & Eccles, 1992). One way to
establish the perceived relevance of the learning materials is to use authentic or real-world
examples and assignments. Simply relating the instruction to what is familiar to learners
(e.g., prior knowledge) can also help learners to perceive its relevance.
The confidence category is related to self-efficacy and to the expectancies for success described in expectancy-value theory. According to self-determination theory, the feeling of competence
is one of the basic human needs (Ryan & Deci, 2000). If the learners’ need for competence is
not satisfied during learning, they would develop low expectancies for success and
demonstrate low self-efficacy, which results in poor motivation to learn (Bandura, 1997;
Wigfield & Eccles, 2000). Strategies to enhance self-efficacy, such as experience of success,
can be applied in order to build confidence in instruction. Another way to enhance
confidence is to foster learners’ belief that they have control over their performance.
Autonomy-supportive strategies, such as offering choices and encouraging internal, controllable attributions, are a few examples.
The last category, satisfaction, concerns learners’ continued motivation to learn. If students experience satisfying outcomes, they are likely to develop a persistent desire to learn
(Skinner, 1963). Satisfying or positive consequences of instruction can result from both
extrinsic and intrinsic matters (Ryan & Deci, 2000). High grades, certificates, and other
tangible rewards are the most common extrinsic outcomes. However, these extrinsic
rewards may not always result in feelings of satisfaction. For example, a student may not be pleased with a high score on a final exam if the test was extremely easy and most students did well. When extrinsic rewards fail to fulfill learners’ inner needs, students will not be satisfied. Intrinsic consequences that lead to satisfaction include a feeling of mastery and the pleasure of accomplishing a challenging
task.
Besides identifying the four major categories of motivational design, the ARCS model
describes 10 steps for a systematic process of motivational design (Keller, 2010). The first
four steps are the analysis process. This includes acquisition of course and audience
information and analysis of audience motivation and existing materials. The main goal of
these steps is to identify motivational problems. The next four steps (Step 5 through Step 8)
correspond to the design phase in the traditional instructional design process. The first task
in the design phase is to determine the motivational behaviors of learners that you wish to
observe based on the motivational problems identified in the previous steps. Then, you
select or design motivational tactics that help to achieve the objectives and can be feasibly
incorporated into instruction. One important task is to integrate these tactics into
instructional materials. Designers determine where and how to insert the motivational tactics in the instruction. In this process, they may need to modify the design of
instruction. Steps 9 and 10 are the development and evaluation phases. After identifying the
motivational tactics to use, designers will develop the actual motivational materials. Lastly,
they will evaluate the effectiveness of the embedded motivational tactics, for instance, by
collecting learners’ reactions to the learning materials. Table 2 summarizes the steps of
motivational design.
Van der Meij, van der Meij, and Harmsen (2015) developed a motivational animated
pedagogical agent (MAPA) to promote students’ perceived task relevance and self-efficacy
in an inquiry learning environment. In the study, students used SimQuest to learn
kinematics in a physics class and MAPA was presented in SimQuest with a face and an
upper body visible. Acting as a fellow student, MAPA delivered motivational audible
messages to students. The motivational messages were designed based on strategies for
enhancing relevance and confidence described in the ARCS model. The study reported a
significant increase in students’ self-efficacy after using MAPA (van der Meij et al., 2015).
Kim and Bennekin (2013, 2016) developed a Virtual Change Agent (VCA) that provided
support for community college students’ motivation and persistence in online mathematics
courses. The VCA was an animated, human-like, three-dimensional character that delivered
messages containing strategies based on theories of motivation, volition, and emotional
regulation. For example, the VCA told students a story of applying mathematics for
comparing cell phone plans in order to arouse students’ interest and curiosity. After using
the VCA, students showed a significant increase in their self-efficacy and perceived value of
learning mathematics (Kim & Bennekin, 2013).
Another similar technology called Virtual Tutee System (VTS) was designed to facilitate
college students’ reading motivation and engagement (Park & Kim, 2012). In the VTS,
students become a tutor of a virtual tutee (a human-like virtual character) and teach the
tutee about the content they have learned from readings. Capitalizing on the motivational
aspects of learning-by-teaching effects, the VTS-embedded strategies support the needs for
autonomy, competence, and relatedness described in self-determination theory. The VTS was
used in a few studies and found to promote students’ reading engagement and their deep
learning (Park & Kim, 2015, 2016).
Summary
Motivation is often described as a prerequisite to learning. As such, it has long been of interest
among many educational researchers. This chapter introduced social cognitive theories of
motivation. These theories, which continue to expand, have contributed significantly to the
understanding of learner motivation. The theories of motivation have also yielded important
implications for the instructional design process. In particular, Keller’s ARCS model
specifies how we take learner motivation into account when designing instruction.
Expanding upon Keller’s work, researchers have devised many technologies that aim to
boost learner motivation. This chapter has presented an introduction to a few of those
technologies.
References
Ames, C., & Archer, J. (1988). Achievement goals in the classroom: Students’ learning
strategies and motivation processes. Journal of Educational Psychology, 80, 260–267.
doi:10.1037/0022-0663.80.3.260
Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: Freeman.
Bong, M. (2001). Role of self-efficacy and task-value in predicting college students’ course
performance and future enrollment intentions. Contemporary Educational Psychology, 26,
553–570. doi:10.1006/ceps.2000.1048
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human
behavior. New York, NY: Plenum.
Deci, E. L., & Ryan, R. M. (2000). The “what” and “why” of goal pursuits: Human needs and
the self-determination of behavior. Psychological Inquiry, 11, 227–268.
Denissen, J. J. A., Zarrett, N. R., & Eccles, J. S. (2007). I like to do it, I’m able, and I know I
am: Longitudinal couplings between domain-specific achievement, self-concept, and
interest. Child Development, 78, 430–447.
Dickey, M. D. (2007). Game design and learning: a conjectural analysis of how massively
multiple online role-playing games (MMORPGs) foster intrinsic motivation. Educational
Technology Research & Development, 55, 253-273.
Durik, A. M., Shechter, O. G., Noh, M., Rozek, C. S., & Harackiewicz, J. M. (2015) What if I
can’t? Success expectancies moderate the effects of utility value information on situational
interest and performance. Motivation and Emotion, 39(1), 104-118.
Elliot, A. J., & Church, M. A. (1997). A hierarchical model of approach and avoidance
achievement motivation. Journal of Personality and Social Psychology, 72, 218–232.
doi:10.1037/0022-3514.72.1.218
Elliot, A. J., & Harackiewicz, J. M. (1996). Approach and avoidance achievement goals and
intrinsic motivation: A mediational analysis. Journal of Personality and Social Psychology,
70, 461–475.
Greene, B. A., Miller, R. B., Crowson, H. M., Duke, B. L., & Akey, K. L. (2004). Predicting
high school students’ cognitive engagement and achievement: Contributions of classroom perceptions and motivation. Contemporary Educational Psychology, 29, 462–482.
Harackiewicz, J. M., Barron, K. E., Pintrich, P. R., Elliot, A. J., & Thrash, T. M. (2002).
Revision of achievement goal theory: Necessary and illuminating. Journal of Educational
Psychology, 94, 638–645. doi:10.1037/0022-0663.94.3.638
Hidi, S., & Renninger, K. A. (2006). The four-phase model of interest development.
Educational Psychologist, 41, 111–127. doi:10.1207/s15326985ep4102_4
Howell, A. J., & Watson, D. C. (2007). Procrastination: Associations with achievement goal
orientation and learning strategies. Personality and Individual Differences, 43, 167-178.
Hulleman, C. S., Schrager, S. M., Bodmann, S. M., & Harackiewicz, J. M. (2010). A meta-
analytic review of achievement goal measures: Different labels for the same constructs or
different constructs with similar labels? Psychological Bulletin, 136, 422–449.
doi:10.1037/a0018947
Joo, Y. J., Lim, K. Y., & Kim J. (2013). Locus of control, self-efficacy, and task value as
predictors of learning outcome in an online university context. Computers & Education, 62,
149-158.
Keller, J. M. (1987). Development and use of the ARCS model of instructional design. Journal
of Instructional Development, 10(3), 2-10.
Keller, J. M. (2010). Motivational design for learning and performance: The ARCS model
approach. New York, NY: Springer.
Keys, T. D., Conley, A. M., Duncan, G. J., & Domina, T. (2012). The role of goal orientations
for adolescent mathematics achievement. Contemporary Educational Psychology, 37, 47–54.
doi:10.1016/j.cedpsych.2011.09.002
Kim, C., & Bennekin, K. N. (2013). Design and implementation of volitional control support
in mathematics courses. Educational Technology Research & Development, 61, 793-817.
Kim, C., & Bennekin, K. N. (2016). The effectiveness of volition support (VoS) in promoting
students’ effort regulation and performance in an online mathematics course. Instructional
Science, 44(4), 359-377.
Kirriemuir, J. (2002). Video gaming, education and digital learning technologies. D-Lib
Magazine, 8(2).
Krapp, A. (2005). Basic needs and the development of interest and intrinsic motivational
orientations. Learning and Instruction, 15, 381–395.
Linnenbrink-Garcia, L., Tyson, D. F., & Patall, E. A. (2008). When are achievement goal
orientations beneficial for academic achievement? A closer look at moderating factors.
International Review of Social Psychology, 21, 19–70.
Locke, E. A., & Latham, G. P. (1984). Goal setting: A motivational technique that works!
Englewood Cliffs, NJ: Prentice-Hall.
Locke, E. A., & Latham, G. P. (2006). New directions in goal-setting theory. Current
Directions in Psychological Science, 15, 265–268. doi:10.1111/j.1467-8721.2006.00449.x
Maehr, M. L., & Midgley, C. (1991). Enhancing student motivation: A schoolwide approach.
Educational Psychologist, 26(3), 399–427.
Maehr, M. L., & Zusho, A. (2009). Achievement goal theory: The past, present, and future.
In K. R. Wentzel & A. Wigfield (Eds.) Handbook of motivation at school (pp.77-104). New
York, NY: Routledge.
Meece, J. L., Anderman, E. M., & Anderman, L. H. (2006). Classroom goal structure, student
motivation, and academic achievement. Annual Review of Psychology, 57, 487–503.
doi:10.1146/annurev.psych.56.091103.070258
van der Meij, H., van der Meij, J., & Harmsen, R. (2015). Animated pedagogical agents
effects on enhancing student motivation and learning in a science inquiry learning
environment. Educational Technology Research & Development, 63, 381-403.
Middleton, M. J., & Midgley, C. (1997). Avoiding the demonstration of lack of ability: An
underexplored aspect of goal theory. Journal of Educational Psychology, 89(4), 710–718.
Park, S. W., & Huynh, N. T. (2015) How are non-geography majors motivated in a large
introductory world geography course? Journal of Geography in Higher Education, 39(3),
386-406. doi:10.1080/03098265.2015.1048507
Park, S. W., & Kim, C. (2012). A design framework for a virtual tutee system to promote academic reading engagement in a college classroom. Journal of Applied Instructional Design, 2(1), 17–33.
Park, S. W., & Kim, C. (2015). Boosting learning-by-teaching in virtual tutoring. Computers
& Education, 82, 129-140.
Park, S. W., & Kim, C. (2016). The effects of a virtual tutee system on academic reading
engagement in a college classroom. Educational Technology Research and Development,
64(2), 195-218.
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic
motivation, social development, and well-being. American Psychologist, 55, 68–78.
doi:10.1037/0003-066X.55.1.68
Schiefele, U. (2001). The role of interest in motivation and learning. In J.M. Collis & S.
Messick (Eds.), Intelligence and personality: Bridging the gap in theory and measurement
(pp. 163–194). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Schunk, D. H., Meece, J. L., & Pintrich, P. R. (2014). Motivation in education: Theory,
research, and applications (4th ed.). Upper Saddle River, NJ: Pearson.
Tüzün, H., Yılmaz-Soylu, M., Karakuş, T., İnal, Y., & Kızılkaya, G. (2009). The effects of
computer games on primary school students’ achievement and motivation in geography
learning. Computers & Education, 52, 68-77.
Wentzel, K. R. (2000). What is it that I’m trying to achieve? Classroom goals from a content
perspective. Contemporary Educational Psychology, 25, 105–115.
Weiner, B. (1986). An attributional theory of motivation and emotion. New York, NY:
Springer-Verlag.
Wigfield, A., & Eccles, J. (1992). The development of achievement task values: A theoretical analysis. Developmental Review, 12, 265–310.
Wolters, C. A. (2004). Advancing achievement goal theory: Using goal structures and goal
orientations to predict students’ motivation, cognition, and achievement. Journal of
Educational Psychology, 96(2), 236–250.
                         Seung Won Park
Seung Won Park is a research professor in the Institute for Teaching and
Learning at Daejeon University in South Korea. Her research interest includes
enhancing learner motivation and engagement in student-centered, technology-
enhanced learning environments.
                                             17
Informal Learning
Tim Boileau
In today’s digitally connected world we are constantly acquiring new personal knowledge
and skills, discovering new methods of work and ways to earn a living, solving problems,
and changing the way we create, access and share information, through informal learning.
The topic of informal learning can be discussed in many different contexts and from a
variety of theoretical perspectives. For the purposes of this chapter, informal learning is
examined through the lens of lifelong learning and performance, primarily as it relates to
adult learners. Jay Cross (2007) may be credited for popularizing the term “informal
learning”, although he claimed to have first heard it from Peter Henschel sometime during
the mid-1990s, who at the time was director of the Institute for Research on Learning (IRL),
when he said:
      People are learning all the time, in varied settings and often most effectively in
      the context of work itself. ‘Training’—formal learning of all kinds—channels
      some important learning but doesn’t carry the heaviest load. The workhorse of
      the knowledge economy has been, and continues to be, informal learning.
      (Cross, 2007, p. xiii)
Indeed, the concept of informal learning has been around for many years preceding the
peak of the industrial revolution during the 19th century in the form of guild support for
traditional apprenticeships, and is ubiquitous in the knowledge-based economy of the 21st
century in the form of cognitive apprenticeship (Collins & Kapur, 2014).
Informal learning is frequently triggered by a perceived gap between existing knowledge and skills and expected performance. According to Boileau
(2011, p. 13), “Humans learn when they perceive a need to know, and evidence of learning
is in their ability to do something they could not do before.”
In this chapter, I explore the nuances of informal learning to better understand its unique role in lifelong learning and performance.
Learning may also be described using different classifications linked to the setting or
circumstances in which the learning is most likely to occur, such as formal, non-formal, and
informal learning. Taking a brief look at this typology, formal learning implies learning
settings provided by educational institutions where the primary mission is the construction
of new knowledge. Non-formal learning settings may be found in organizations and
businesses within the community where the primary mission is not necessarily educational.
However, formal learning activities are present such as in the delivery of specialized
training that is linked to achieving the goals of the organization (Coombs, Prosser, &
Ahmed, 1973). Informal learning, on the other hand, refers to embedded learning activities
that are linked to performance, in the setting of one’s everyday life. Within each category,
there are identifiable subcategories representing different learning taxonomies. Merriam
and Bierema (2014) identified four sub-types of learning specific to informal learning.
In the training industry, informal learning is often discussed in the context of the “70:20:10 Rule” (see the Association for Talent Development (ATD) at www.td.org and search for 70:20:10). Generally speaking, this rule suggests that:
70% of learning occurs through informal or on-the-job learning; 20% through mentoring and
other specialized developmental relationships; and the remaining 10% through formal
learning including course work and associated reading.
There are two important takeaways from the assertion of the so-called 70:20:10 Rule, as it
relates to workplace learning. First, a growing body of research provides insight into just how widespread and embedded informal learning is in the lives of adult learners, with estimates that as much as 70–90% of all learning over the course of a lifetime occurs through informal learning activity (Merriam & Bierema, 2014). Specific to learning about
science, Falk and Dierking (2010) placed the estimate even higher, with as much as 95% of
all science learning occurring outside of school, given the richness, availability, and
increased access to “free-choice” (i.e., informal) digital learning resources. Based on this
premise, Falk and Dierking (2010) contended that a policy of increased investment in
informal learning resources would provide a cost-effective way to increase public
understanding of science. The second takeaway is the recognition that formal and informal
learning occurs along a continuum—comprised of both formal and informal learning
activities, depending on the type of learning and level of mastery required, as well as the
characteristics and prior experience of the learner—as opposed to dichotomous categories
of formal vs. informal learning (Sawchuk, 2008).
In contrast, in formal learning, learning objectives and curricula are determined by someone
else. Formal learning or “book learning” is what most people in western culture think of
when they envision learning in terms of schools, classrooms, and instructors who decide
what, when, and how learning is to take place. “Formal learning is like riding a bus: the
driver [instructor] decides where the bus is going; the passengers [learners] are along for
the ride” (Cross, 2007, p. 236). In formal learning, learning is “pushed” to the learners
according to a set of needs or predetermined curricula that are established by someone
other than the learner.
In this section, I have discussed informal and formal learning as co-existing in a spectrum or
continuum of learning activities linked to experience and performance context over the
course of a lifetime, as opposed to dichotomous branches of learning that are fixed in time
and space. This is an important precept to keep in mind because increasingly, blended
learning experiences may include elements or activities associated with formal learning
settings such as lectures or media-based presentations, along with informal learning
activities such as discussions with peers, Web-based searches for examples, and practice
experimenting with new techniques and tools (Lohman, 2006).
Science learning. There is increased recognition of the need to support lifelong science
learning in order to meet the growing demand for science and engineering jobs in a modern
global economy. It can be argued that science literacy, acquired through informal learning,
is essential to economic growth (as discussed in the next topic), and to promoting the shared
cultural values of a democratic society. According to Falk et al. (2007), “the majority of the
public constructs much of its understanding of science over the course of their lives,
gathering information from many places and contexts, and for a diversity of reasons” (p. 455). Evidence of this trend can be seen in new standards for compulsory testing and
curriculum changes, placing greater emphasis on STEM (science, technology, engineering,
and mathematics) subjects in publicly funded K-12 education. Yet, the average adult spends
a small fraction of their life (1-3 percent) in formal education related to science learning
(Falk, Storksdieck, & Dierking, 2007). Indeed, the research literature suggests that most
science learning, as with other domains of learning, occurs informally and is driven by self-
identified needs and interests of learners. This suggests that informal learning activities
within the workplace, personal investigation using internet-based tools and resources, and
active leisure pursuits such as visits to museums, zoos and aquariums, and national parks
account for the majority of science learning in America (Falk & Dierking, 2010).
Other forms of informal science learning include hobbies such as model rockets and drones,
organic and sustainable farming, beekeeping, mineralogy, and amateur astronomy. Life
events may also trigger a personal need for informal science learning via the web such as
when individuals are diagnosed with an illness like cancer or heart disease, or in the wake
of environmental disasters such as oil spills, the discovery of radon gas in rock, or tracking
the path of a hurricane. The Internet now represents the major source of science
information for adults and children, with the tipping point occurring in 2006, when the
Internet surpassed broadcast media as a source for public science information, according to
the Pew Internet and American Life Project (Falk & Dierking, 2010). In a similar fashion,
more people now turn to the Internet for medical diagnostic information using services like
WebMD.com, before scheduling an appointment with their physician.
Microtraining provides a structured approach to otherwise ad hoc learning processes. This structure begins with activation of prior knowledge, followed
by demonstration and practice, a feedback session, and a transfer strategy. In addition, all
microtraining segments should promote critical thinking and reflection on work, to facilitate
deeper learning.
Short, focused learning segments of this kind contribute to knowledge retention and recollection. Additional affordances may also be associated with microlearning.
Communities of practice are shaped by and, in turn, shape the way performance support tools (PSTs) are used to support learning and to affect the
transfer of knowledge and skills to on-the-job performance. Information and
communications technology (e.g., social media) has been shown to have a mediating effect
on practice, using digital representation of signs and symbols for linguistic communication,
along with knowledge objects that are produced and exist within the community (Boileau,
2011).
As previously stated, Rossett and Schafer (2007) viewed this effect on practice in terms of
support for performance, specifically by building a repository of externally curated
information, processes, resources and perspectives that inform and guide performance
planning and execution, using performance support tools. This approach is less concerned
with the acquisition of new knowledge and more with the direct application and transfer of
knowledge, mediated by PSTs.
Rossett and Schafer (2007) further categorized PSTs as sidekicks and planners. A sidekick
functions as a job aid in the context of specific types of activity performed in real time,
concurrent with the task at hand. An example of a sidekick is a GPS navigation system (e.g.,
Google® Maps application on a mobile device) providing turn-by-turn navigational
instructions in the situated context of operating a vehicle.
A planner, on the other hand, is typically used in advance of the activity to access prior,
externally created, knowledge shared by the community of practice, for use in a specific
learning and performance context. An example of this would be accessing Google® Maps
via the Web to determine (i.e., plan) the most efficient route of travel between two pre-
determined points, in advance of starting the trip.
A distinction can be made between performance support tools and other types of tools such
as a file cabinet or office chair, used to support informal learning and performance. The
difference with non-PST tools is that there is no innate support for the informal learning or
performance activity; there is only potential support for manipulating the environment to
make it more conducive to achieving the goal for the activity. In a similar manner,
“Instruction is not performance support. It is planned experience that enables an individual
to acquire skills and knowledge to advance the capability to perform” (Rossett & Schafer,
2007, p. 5). In other words, there is a separation between the learning event and the
performance context. Performance support for informal learning may be further
characterized by looking at four factors: convergence, simplicity, relevance to performance,
and personalization (Rossett & Schafer, 2007).
      Personalization allows the learner to dynamically adjust the level of information and
      support needed, according to the needs of the situation and the prior experience of
      the learner. Personalization also facilitates user-generated content that adds new
      insights and lessons learned, thus increasing the utility of the tool and contributing
      new artifacts to the collective body of knowledge available to the community of
      practice, via a more integrative user experience.
Digital open badging. As opportunities for informal learning continue to increase for
personal and professional development across different industries and disciplines, a
question on the minds of many learners is how informal learning achievements may be
recognized (Law, 2015). Digital open badges provide validated recognition of participation
and achievement from informal learning activities, and evidence of learning milestones such
as completion of a microtraining learning segment. The use of digital badges can also be
seen with formal learning in educational institutions, as a motivational tool and in the form
of micro-credentials to demonstrate incremental achievement in a variety of education
settings.
The amount of OER content available to support informal learning has increased
exponentially in recent years in support of microlearning. Concurrent with the increase in
OER is the emergence of different business models to support the issuance of digital open
badges. For example, learners can access OER content for free, through a variety of MOOC
(Massive Open Online Course) service providers such as EdX and Coursera. These services
provide access to hundreds of courses for free. If you would like to receive a micro-
credential (i.e., certificate) as evidence of successful completion, however, you are required
to pay a nominal fee. This complicates the definition of informal learning provided by Cross
(2007): when learners pay for certification from MOOC providers, informal learning is no
longer anonymous, because attendance is tracked and grades are issued (Law, 2015). This
trend is expected to continue, according to Law (2015), as “learners in an informal
environment are willing to pay for certification and recognition of unsupported informal
learning” (p. 232).
Summary. In this section, we have examined some of the trends, issues, and tools used to
facilitate informal learning, noting the emergence of four themes. First, informal learning is
situated in performance, knowledge development, or the completion of a task, and is driven
by intrinsic as well as extrinsic motivation. Second, as organizations refocus their attention
from training to talent management, they look to innovative methods and learner-centered
processes to enable communities of practice. Third, technology, and more specifically
performance support tools, stands at the forefront of informal learning, serving as job aids
intended to mediate informal learning activities that support job performance. Finally, the
use of digital open badges is expected to increase, eventually providing validated evidence
of informal learning outcomes.
Merriam and Bierema (2014) identified three themes in knowing and learning that are more
prevalent among non-Western cultures, characterized as communal, lifelong and informal,
and holistic. To say that learning is communal implies that it is situated within the
community as a means for collaborative knowledge development that benefits from, and
exists within, the entire community through strong interdependency and relationships
among the members. This stands in contrast with Western culture in which the learner is
more typically viewed from an individualistic and independent perspective. The second
theme is that informal learning is a lifelong pursuit that is also situated within the
communal ethic (Merriam & Kim, 2011). The concept of informal lifelong learning is evident
in the Buddhist principles of mindfulness; can be seen in the African cultural expectation
that members of the community share their knowledge with each other for the benefit of the
community at large; and may be found in the words of the Prophet Muhammad: to “Seek
knowledge from the cradle to the grave.” Finally, the culturally-based theme of informal
learning as holistic represents a clear shift from a Western emphasis on cognitive knowing,
to alternative types of learning that include: somatic, spiritual, emotional, moral,
experiential and social learning (Merriam & Kim, 2011).
Approaching informal learning from a more culturally holistic perspective creates new
opportunities to increase cultural sensitivity among increasingly diverse learner and worker
populations, by recognizing that learning is embedded in performance activities and in the
experiences of everyday life.
Application Exercises
          Take five minutes and think about your own experiences with informal
          learning. How has technology influenced your informal learning? Estimate
          how much of your informal learning occurs outside of a technological
          medium versus how much occurs through one.
          Think of a work or school situation where learning was formal. Knowing that
          there is a better chance of meeting learning outcomes with informal learning,
          what adjustments would you make to create a more informal learning
          experience?
References
Boileau, T. (2011). The effect of interactive technology on informal learning and
performance in a social setting. Wayne State University.
Buchem, I., & Hamelmann, H. (2010). Microlearning: a strategy for ongoing professional
development. eLearning Papers, 21(7).
Collins, A., & Kapur, M. (2014). Cognitive apprenticeship. In Sawyer, R.K. (Ed.), The
Learning Sciences (2nd ed., pp. 109-126). New York: Cambridge University Press.
Coombs, P.H., with Prosser, R.C., & Ahmed, M. (1973). New paths to learning for children
and youth. New York: International Council for Educational Development.
Cross, J. (2007). Informal learning: Rediscovering the natural pathways that inspire
innovation and performance. San Francisco, CA: Pfeiffer.
Cross, J. (2013). The principles of learning. Retrieved November 13, 2017, from
https://edtechbooks.org/-wB
Czerkawski, B. (2016). Blending formal and informal learning networks for online learning.
The International Review of Research in Open and Distributed Learning, 17(3).
Falk, J., & Dierking, L. (2010). The 95 Percent Solution. American Scientist, 98(6), 486-493.
Retrieved from https://edtechbooks.org/-RG
Falk, J. H., Storksdieck, M., & Dierking, L. D. (2007). Investigating public science interest
and understanding: Evidence for the importance of free-choice learning. Public
Understanding of Science, 16(4), 455-469.
De Grip, A. (2015). The importance of informal learning at work. IZA World of Labor. doi:
10.15185/izawol.162
De Vries, P., & Brall, S. (2008). Microtraining as a support mechanism for informal learning.
eLearning Papers. Retrieved from http://www.elearningpapers.eu
Driscoll, M.P. (2005). Psychology of learning for instruction (3rd ed.). Boston, MA: Allyn &
Bacon.
Gery, G.J. (1991). Electronic performance support systems. Cambridge: Ziff Institute.
Greeno, J.G., & Engeström, Y. (2014). Learning in activity. In Sawyer, R.K. (Ed.), The
learning sciences (2nd ed., pp. 109-126). New York, NY: Cambridge University Press.
Henschel, P. (2001). The manager’s core work in the new economy. On the Horizon, 9(3),
1-5. Retrieved from https://edtechbooks.org/-aC
Lave, J., & Wenger, E. (1991). Situated Learning: Legitimate peripheral participation. New
York: Cambridge University Press.
Law, P. (2015). Digital badging at The Open University: Recognition for informal learning.
Open Learning: The Journal of Open, Distance and e-Learning, 30(3), 221-234.
Mayer, R.E. (2005). Cognitive theory of multimedia learning. In R.E. Mayer (Ed.), The
Cambridge handbook of multimedia learning (pp. 31-48). New York: Cambridge University
Press.
Merriam, S.B., & Bierema, L.L. (2014). Adult learning: Linking theory and practice. San
Francisco, CA: Jossey-Bass.
Merriam, S.B., & Kim, Y.S. (2011). Non-Western perspectives on learning and knowing. In
Merriam, S.B., & Grace, A.P. (Eds.), The Jossey-Bass reader on contemporary issues in adult
education, pp. 378-389. San Francisco, CA: Jossey-Bass.
Rossett, A., & Schafer, L. (2007). Job aids and performance support: Moving from
knowledge in the classroom to knowledge everywhere. San Francisco, CA: Pfeiffer.
Sawchuk, P.H. (2008). Theories and methods for research on informal learning and work:
Towards cross-fertilization. Studies in Continuing Education, 30(1), 1-16.
Suggested Citation
                              Tim Boileau
                                            18
John R. Savery
Editor’s Note
Introduction
When asked to provide an overview of problem-based learning for the introductory issue of
this journal, I readily agreed, thinking it was a wonderful opportunity to write about a
subject I care about deeply. As I began to jot down ideas about “What is PBL?” it became
clear that I had a problem. Some of what I knew about PBL was learned through teaching
and practicing PBL, but so much more had been acquired by reading the many papers
authored by experts with decades of experience conducting research and practicing
problem-based learning. These authors had frequently begun their papers with a context-
setting discussion of “What is PBL?” What more was there to say?
Origins of PBL
In discussing the origins of PBL, Boud and Feletti (1997) stated:
Barrows (1994; 1996) recognized that the process of patient diagnosis (doctors’ work) relied
on a combination of a hypothetical-deductive reasoning process and expert knowledge in
multiple domains. Teaching discipline specific content (anatomy, neurology, pharmacology,
psychology, etc.) separately, using a “traditional” lecture approach, did little to provide
learners with a context for the content or for its clinical application. Further confounding
this traditional approach was the rapidly changing knowledge base in science and medicine,
which was driving changes in both theory and practice.
During the 1980s and 1990s the PBL approach was adopted in other medical schools and
became an accepted instructional approach across North America and in Europe. There
were some who questioned whether a physician trained using PBL was as well
prepared for professional practice as a physician trained using traditional approaches. This
was a fair question, and extensive research was conducted to answer it. Meta-analyses of 20
years of PBL evaluation studies, conducted by Albanese and Mitchell (1993) and by Vernon
and Blake (1993), concluded that a problem-based approach to instruction was equal to
traditional approaches on conventional tests of knowledge (i.e., scores on medical board
examinations), and that students who studied using PBL exhibited better clinical
problem-solving skills. A smaller study of graduates of a physical therapy program
that utilized PBL (Denton, Adams, Blatt, & Lorish, 2000) showed that graduates of the
program performed equally well with PBL or traditional approaches, but students reported a
preference for the problem-centered approach. Anecdotal reports from PBL practitioners
suggest that students are more engaged in learning the expected content (Torp & Sage,
2002).
identifying one’s own strengths and weaknesses and undertaking appropriate remediation
(self-directed learning). A lack of well-designed studies posed a challenge to this research
analysis, and an article on the same topic by Sanson-Fisher and Lynagh (2005) concluded
that “Available evidence, although methodologically flawed, offers little support for the
superiority of PBL over traditional curricula” (p. 260). This gap in the research on the short-
term and long-term effectiveness of using a PBL approach with a range of learner
populations definitely indicates a need for further study.
Despite this lack of evidence, the adoption of PBL has expanded into elementary schools,
middle schools, high schools, universities, and professional schools (Torp & Sage, 2002).
The University of Delaware (http://www.udel.edu/pbl/) has an active PBL program and
conducts annual training institutes for instructors wanting to become tutors. Samford
University in Birmingham, Alabama (http://www.samford.edu/pbl/) has incorporated PBL
into various undergraduate programs within the Schools of Arts and Sciences, Business,
Education, Nursing, and Pharmacy. The Illinois Mathematics and Science Academy
(http://www.imsa.edu/center/) has been providing high school students with a complete PBL
curriculum since 1985 and serves thousands of students and teachers as a center for
research on problem-based learning. The Problem-based Learning Institute (PBLI)
(http://www.pbli.org/) has developed curricular materials (i.e., problems) and teacher-
training programs in PBL for all core disciplines in high school (Barrows & Kelson, 1993).
PBL is used in multiple domains of medical education (dentists, nurses, paramedics,
radiologists, etc.) and in content domains as diverse as MBA programs (Stinson & Milter,
1996), higher education (Bridges & Hallinger, 1996), chemical engineering (Woods, 1994),
economics (Gijselaers, 1996), architecture (Kingsland, 1989), and pre-service teacher
education (Hmelo-Silver, 2004). This list is by no means exhaustive, but is illustrative of the
multiple contexts in which the PBL instructional approach is being utilized.
The widespread adoption of the PBL instructional approach by different disciplines, for
different age levels, and in different content domains has produced some misapplications
and misconceptions of PBL (Maudsley, 1999). Certain practices that are called PBL may fail
to achieve the anticipated learning outcomes for a variety of reasons. Boud and Feletti
(1997, p. 5) described several possible sources for the confusion:
The possible sources of confusion listed above appear to reflect a naïve view of the rigor
required to teach with this learner-centered approach. In the next section I will discuss
some of the essential characteristics and features of PBL.
Characteristics of PBL
PBL is an instructional (and curricular) learner-centered approach that empowers learners
to conduct research, integrate theory and practice, and apply knowledge and skills to
develop a viable solution to a defined problem. Critical to the success of the approach is the
selection of ill-structured problems (often interdisciplinary) and a tutor who guides the
learning process and conducts a thorough debriefing at the conclusion of the learning
experience. Several authors have described the characteristics and features required for a
successful PBL approach to instruction. The reader is encouraged to read the source
documents, as brief quotes do not do justice to the level of detail provided by the authors.
Boud and Feletti (1997) provided a list of the practices considered characteristic of the
philosophy, strategies, and tactics of problem-based learning. Duch, Groh, and Allen (2001)
described the methods used in PBL and the specific skills developed, including the ability to
think critically; to analyze and solve complex, real-world problems; to find, evaluate, and
use appropriate learning resources; to work cooperatively; to demonstrate effective
communication skills; and to use content knowledge and intellectual skills to become
continual learners. Torp and Sage (2002) described PBL as focused, experiential learning
organized around the investigation and resolution of messy, real-world problems. They
describe students as engaged problem solvers, seeking to identify the root problem and the
conditions needed for a good solution and in the process becoming self-directed learners.
Hmelo-Silver (2004) described PBL as an instructional method in which students learn
through facilitated problem solving that centers on a complex problem that does not have a
single correct answer. She noted that students work in collaborative groups to identify what
they need to learn in order to solve a problem, engage in self-directed learning, apply their
new knowledge to the problem, and reflect on what they learned and the effectiveness of the
strategies employed.
      A rationale and guidelines for the selection of authentic problems in PBL are discussed
      extensively in Savery & Duffy (1995), Stinson and Milter (1996), Wilkerson and
      Gijselaers (1996), and MacDonald (1997). The transfer of skills learned through PBL
      to a real-world context is also noted by Bransford, Brown, & Cocking (2000, p. 77).
      Student examinations must measure student progress towards the goals of problem-
      based learning.
      The goals of PBL are both knowledge-based and process-based. Students need to be
      assessed on both dimensions at regular intervals to ensure that they are benefiting as
      intended from the PBL approach. Students are responsible for the content in the
      curriculum that they have “covered” through engagement with problems. They need
      to be able to recognize and articulate what they know and what they have learned.
      Problem-based learning must be the pedagogical base in the curriculum and not part
      of a didactic curriculum.
Reflection
    The author states, “The problem simulations used in problem-based learning must
    be ill-structured and allow for free inquiry.” Create your own “messy, real-world”
    problem. Decide on a main curriculum area (most good problems are
    interdisciplinary) and an age group. Construct a problem that could be used in a
    problem-based classroom. Share it with two people and get their feedback. Revise
    the problem and submit.
Summary
These descriptions of the characteristics of PBL identify clearly 1) the role of the tutor as a
facilitator of learning, 2) the responsibilities of the learners to be self-directed and self-
regulated in their learning, and 3) the essential elements in the design of ill-structured
instructional problems as the driving force for inquiry. The challenge for many instructors
when they adopt a PBL approach is to make the transition from teacher as knowledge
provider to tutor as manager and facilitator of learning (see Ertmer & Simons, 2006). If
teaching with PBL were as simple as presenting the learners with a “problem” and students
could be relied upon to work consistently at a high level of cognitive self-monitoring and
self-regulation, then many teachers would be taking early retirement. The reality is that
learners who are new to PBL require significant instructional scaffolding to support the
development of problem-solving skills, self-directed learning skills, and
teamwork/collaboration skills to a level of self-sufficiency where the scaffolds can be
removed. Teaching institutions that have adopted a PBL approach to curriculum and
instruction (including those noted earlier) have developed extensive tutor-training programs
in recognition of the critical importance of this role in facilitating the PBL learning
experience. An excellent resource is The Tutorial Process by Barrows (1988), which explains
the importance of the tutor as the metacognitive coach for the learners.
Given that change to teaching patterns in public education moves at a glacial pace, it will
take time for institutions to commit to a full problem-based learning approach. However,
there are several closely related learner-centered instructional strategies, such as project-
based learning, case-based learning, and inquiry-based learning, that are used in a variety
of content domains that can begin to move students along the path to becoming more self-
directed in their learning. In the next section I examine some of the similarities and differences
among these approaches.
Project-based learning is similar to problem-based learning in that the learning activities are
organized around achieving a shared goal (project). This instructional approach was
described by Kilpatrick (1921) as the Project Method and elaborated upon by several
researchers, including Blumenfeld, Soloway, Marx, Krajcik, Guzdial, and Palinscar (1991).
Within a project-based approach learners are usually provided with specifications for a
desired end product (build a rocket, design a website, etc.) and the learning process is more
oriented to following correct procedures. While working on a project, learners are likely to
encounter several “problems” that generate “teachable moments” (see Lehman, George,
Buchanan, & Rush, this issue). Teachers are more likely to be instructors and coaches
(rather than tutors) who provide expert guidance, feedback and suggestions for “better”
ways to achieve the final product. The teaching (modeling, scaffolding, questioning, etc.) is
provided according to learner need and within the context of the project. Similar to
case-based instruction, learners are able to add an experience to their memory that will serve
them in future situations.
While cases and projects are excellent learner-centered instructional strategies, they tend to
diminish the learner’s role in setting the goals and outcomes for the “problem.” When the
expected outcomes are clearly defined, then there is less need or incentive for the learner to
set his/her own parameters. In the real world it is recognized that the ability to both define
the problem and develop a solution (or range of possible solutions) is important.
The use of PBL in undergraduate education is changing gradually (e.g., Samford University,
University of Delaware) in part because of the realization by industry and government
leaders that this information age is for real. At the Wingspread Conference (1994) leaders
from state and federal governments and experts from corporate, philanthropic, higher
education, and accreditation communities were asked for their opinions and visions of
undergraduate education and to identify some important characteristics of quality
performance for college and university graduates. Their report identified as important high-
level skills in communication, computation, technological literacy, and information retrieval
that would enable individuals to gain and apply new knowledge and skills as needed. The
report also cited as important the ability to arrive at informed judgments by effectively
defining problems, gathering and evaluating information related to those problems, and
developing solutions; the ability to function in a global community; adaptability; ease with
diversity; motivation and persistence (for example being a self-starter); ethical and civil
behavior; creativity and resourcefulness; technical competence; and the ability to work with
others, especially in team settings. Lastly, the Wingspread Conference report noted the
importance of a demonstrated ability to deploy all of the previous characteristics to address
specific problems in complex, real-world settings, in which the development of workable
solutions is required. Given this set of characteristics, and the apparent success of the PBL
approach at producing graduates who possess them, one could hope for increased support
for the use of PBL in undergraduate education.
The adoption of PBL (and any other instructional innovation) in public education is a
complicated undertaking. Most state-funded elementary schools, middle schools, and high
schools are constrained by a state-mandated curriculum and an expectation that they will
produce a uniform product. High-stakes standardized testing tends to support instructional
approaches that teach to the test. These approaches focus primarily on memorization
through drill and practice, and rehearsal using practice tests. The instructional day is
divided into specific blocks of time and organized around subjects. There is not much room
in this structure for teachers or students to immerse themselves in an engaging problem.
However, there are many efforts underway to work around the constraints of traditional
classrooms (see, for example, the PBL Design and Invention Center, http://www.pblnet.org/,
or the PBL Initiative, http://www.pbli.org/core.htm), as well as the article by Lehman and his
colleagues in this issue. I hope in future issues of this journal to learn more about
implementations of PBL in K–12 educational settings.
Application Exercises
References
Albanese, M. A., & Mitchell, S. (1993). Problem-based learning: A review of the literature on
its outcomes and implementation issues. Academic Medicine, 68 (1), 52-81.
Barrows, H. S. (1988). The tutorial process. Springfield: Southern Illinois University School
of Medicine.
Barrows, H. S., & Kelson, A. (1993). Problem-based learning in secondary education and the
Problem-based Learning Institute (Monograph). Springfield: Southern Illinois University
School of Medicine.
Blumenfeld, P. C., Soloway, E., Marx, R. W., Krajcik, J. S., Guzdial, M., & Palinscar, A.
(1991). Motivating project-based learning: Sustaining the doing, supporting the learning.
Educational Psychologist, 26 (3/4), 369-398.
Boud, D., & Feletti, G. (1997). The challenge of problem-based learning (2nd ed.). London:
Kogan Page.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain,
mind, experience, and school. Washington, DC: National Academy Press.
Denton, B. G., Adams, C. C., Blatt, P. J., & Lorish, C. D. (2000). Does the introduction of
problem-based learning change graduate performance outcomes in a professional
curriculum? Journal on Excellence in College Teaching, 11 (2&3), 147-162.
Duch, B. J., Groh, S. E., & Allen, D. E. (2001). Why problem-based learning? A case study of
institutional change in undergraduate education. In B. Duch, S. Groh, & D. Allen (Eds.), The
power of problem-based learning (pp. 3-11). Sterling, VA: Stylus.
Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: Implications for the design and
delivery of instruction. In D. Jonassen (Ed.), Handbook of research for educational
communications and technology. New York: Macmillan.
Ertmer, P. A., & Simons, K. D. (2006). Jumping the PBL implementation hurdle: Supporting
the efforts of K–12 teachers. Interdisciplinary Journal of Problem-based Learning, 1 (1),
40-54.
Kilpatrick, W. H. (1921). Dangers and difficulties of the project method and how to
overcome them: Introductory statement: Definition of terms. Teachers College Record, 22
(4), p. 283-287 (ID Number: 3982) Retrieved January 23, 2006 from
http://www.tcrecord.org.
Maudsley, G. (1999). Do we all mean the same thing by “problem-based learning”? A review
of the concepts and a formulation of the ground rules. Academic Medicine, 74(2), 178-85.
Savery, J.R., & Duffy, T.M. (1995). Problem-based learning: An instructional model and its
constructivist framework. In B. Wilson (Ed.), Constructivist learning environments: Case
studies in instructional design (pp. 135-148). Englewood Cliffs, NJ: Educational Technology
Publications.
Savery, J. R. (1999). Enhancing motivation and learning through collaboration and the use of
problems. In S. Fellows & K. Ahmet (Eds.), Inspiring students: Case studies in motivating
the learner (pp. 33-42). London: Kogan Page.
Steinwachs, B. (1992). How to facilitate a debriefing. Simulation & Gaming, 23(2) 186-195.
Thiagarajan, S. (1993). How to maximize transfer from simulation games through systematic
debriefing. In F. Percival, S. Lodge & D. Saunders (Eds.), The Simulation and Gaming
Yearbook, vol. 1 (pp. 45-52). London: Kogan Page.
Torp, L., & Sage, S. (2002). Problems as possibilities: Problem-based learning for K-16
education (2nd ed.). Alexandria, VA: Association for Supervision and Curriculum
Development.
Vernon, D. T. A., & Blake, R. L. (1993). Does problem-based learning work? A meta-analysis
of evaluation research. Academic Medicine, 68(7), 550-563.
White, H. B. (1996). Dan tries problem-based learning: A case study. In L. Richlin (Ed.), To
Improve the Academy, vol. 15 (pp. 75-91). Stillwater, OK: New Forums Press and the
Professional and Organizational Network in Higher Education.
Wilkerson, L., & Gijselaers, W. (Eds.). (1996). Bringing problem-based learning to higher
education: Theory and practice. New Directions For Teaching and Learning Series, No. 68.
San Francisco: Jossey-Bass.
Williams, S. M. (1992). Putting case-based instruction into context: Examples from legal and
medical education. Journal of the Learning Sciences, 2, 367-427.
Woods, D. R. (1994). Problem-based learning: How to gain the most from PBL. Waterdown,
Ontario: Donald R. Woods.
John R. Savery is an assistant professor in the College of Education, the University of Akron.
Email: jsavery@uakron.edu.
                                             19
Connectivism
George Siemens
Editor's Note
    Siemens, G. (2004). Connectivism: A learning theory for the digital age. Retrieved
    from http://www.elearnspace.org/Articles/connectivism.htm
Introduction
Behaviorism, cognitivism, and constructivism are the three broad learning theories most
often utilized in the creation of instructional environments. These theories, however, were
developed in a time when learning was not impacted by technology. Over the last twenty
years, technology has reorganized how we live, how we communicate, and how we learn.
Learning needs, and the theories that describe learning principles and processes, should
reflect underlying social environments. Vaill emphasizes that “learning must be a
way of being—an ongoing set of attitudes and actions by individuals and groups that they
employ to try to keep abreast of the surprising, novel, messy, obtrusive, recurring events . .
.” (1996, p. 42).
As little as forty years ago, learners would complete the required schooling and enter a
career that would often last a lifetime. Information development was slow. The life of
knowledge was measured in decades. Today, these foundational principles have been
altered. Knowledge is growing exponentially. In many fields the life of knowledge is now
measured in months and years. Gonzalez (2004) describes the challenges of rapidly
diminishing knowledge life:
      One of the most persuasive factors is the shrinking half-life of knowledge. The
      “half-life of knowledge” is the time span from when knowledge is gained to
      when it becomes obsolete. Half of what is known today was not known 10 years
      ago. The amount of knowledge in the world has doubled in the past 10 years and
      is doubling every 18 months according to the American Society of Training and
      Documentation (ASTD). To combat the shrinking half-life of knowledge,
      organizations have been forced to develop new methods of deploying
      instruction.
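The arithmetic behind the quoted figures is simple exponential growth and decay. The sketch below is purely illustrative: the 18-month doubling period comes from the ASTD claim above, while the half-life value passed in is a made-up example, not a measured one.

```python
# Toy model of the "half-life of knowledge" discussed above.
# Assumptions are illustrative, not empirical: knowledge decays
# exponentially, and the total stock doubles every 18 months
# (the quoted ASTD figure).

def remaining_relevance(years, half_life_years):
    """Fraction of a body of knowledge still current after `years`."""
    return 0.5 ** (years / half_life_years)

def total_knowledge(years, doubling_period_years=1.5):
    """Relative size of the knowledge stock after `years`."""
    return 2.0 ** (years / doubling_period_years)

# With a (hypothetical) 5-year half-life, only a quarter of what a
# learner mastered a decade ago is still current:
print(remaining_relevance(10, 5))   # 0.25
print(total_knowledge(3))           # 4.0 (two doublings in 3 years)
```

The point of the toy model is the asymmetry it makes visible: the stock of knowledge grows while the relevance of any individual's stored knowledge shrinks, which is exactly the pressure the chapter says new instructional methods must answer.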
Some significant trends in learning:

- Many learners will move into a variety of different, possibly unrelated fields over the course of their lifetime.
- Informal learning is a significant aspect of our learning experience. Formal education no longer comprises the majority of our learning. Learning now occurs in a variety of ways—through communities of practice, personal networks, and through completion of work-related tasks.
- Learning is a continual process, lasting for a lifetime. Learning and work-related activities are no longer separate. In many situations, they are the same.
- Technology is altering (rewiring) our brains. The tools we use define and shape our thinking.
- The organization and the individual are both learning organisms. Increased attention to knowledge management highlights the need for a theory that attempts to explain the link between individual and organizational learning.
- Many of the processes previously handled by learning theories (especially in cognitive information processing) can now be off-loaded to, or supported by, technology.
- Know-how and know-what is being supplemented with know-where (the understanding of where to find knowledge needed).
Background
Driscoll (2000) defines learning as “a persisting change in human performance or
performance potential…[which] must come about as a result of the learner’s experience and
interaction with the world” (p. 11). This definition encompasses many of the attributes
commonly associated with behaviorism, cognitivism, and constructivism—namely, learning
as a lasting changed state (emotional, mental, physiological (i.e., skills)) brought about as a
result of experiences and interactions with content or other people.
Driscoll (2000, pp. 14–17) explores some of the complexities of defining learning, with
debate centering on the valid sources and the content of knowledge.
All of these learning theories hold the notion that knowledge is an objective (or a state) that
is attainable (if not already innate) through either reasoning or experiences. Behaviorism,
cognitivism, and constructivism (built on the epistemological traditions) attempt to address
how it is that a person learns.
Behaviorism states that learning is largely unknowable, that is, we can’t possibly
understand what goes on inside a person (the “black box theory”). Gredler (2005) expresses
behaviorism as comprising several theories that make three assumptions about learning:

- Observable behavior is more important than understanding internal activities.
- Behavior should be focused on simple elements: specific stimuli and responses.
- Learning is about behavior change.
Constructivism suggests that learners create knowledge as they attempt to understand their
experiences (Driscoll, 2000, p. 376). Behaviorism and cognitivism view knowledge as
external to the learner and the learning process as the act of internalizing knowledge.
Constructivism assumes that learners are not empty vessels to be filled with knowledge.
Instead, learners are actively attempting to create meaning. Learners often select and
pursue their own learning. Constructivist principles acknowledge that real-life learning is
messy and complex. Classrooms which emulate the “fuzziness” of this learning will be more
effective in preparing learners for life-long learning.
Learning theories are concerned with the actual process of learning, not with the value of
what is being learned. In a networked world, the very manner of information that we
acquire is worth exploring. The need to evaluate the worthiness of learning something is a
meta-skill that is applied before learning itself begins. When knowledge is subject to
paucity, the process of assessing worthiness is assumed to be intrinsic to learning. When
knowledge is abundant, the rapid evaluation of knowledge is important. Additional concerns
arise from the rapid increase in information. In today’s environment, action is often needed
without personal learning—that is, we need to act by drawing information outside of our
primary knowledge. The ability to synthesize and recognize connections and patterns is a
valuable skill.
Many important questions are raised when established learning theories are seen through
technology. The natural attempt of theorists is to continue to revise and evolve theories as
conditions change. At some point, however, the underlying conditions have altered so
significantly that further modification is no longer sensible. An entirely new approach is
needed.
Some questions to explore in relation to learning theories and the impact of technology and
new sciences (chaos and networks) on learning:
- How are learning theories impacted when knowledge is no longer acquired in a linear manner?
- What adjustments need to be made to learning theories when technology performs many of the cognitive operations previously performed by learners (information storage and retrieval)?
- How can we continue to stay current in a rapidly evolving information ecology?
- How do learning theories address moments where performance is needed in the absence of complete understanding?
- What is the impact of networks and complexity theories on learning?
- What is the impact of chaos as a complex pattern recognition process on learning?
- With increased recognition of interconnections in differing fields of knowledge, how are systems and ecology theories perceived in light of learning tasks?
An Alternative Theory
Including technology and connection making as learning activities begins to move learning
theories into a digital age. We can no longer personally experience and acquire all of the
learning that we need in order to act. We derive our competence from forming connections. Karen
Stephenson states:
      Experience has long been considered the best teacher of knowledge. Since we
      cannot experience everything, other people’s experiences, and hence other
      people, become the surrogate for knowledge. ‘I store my knowledge in my
      friends’ is an axiom for collecting knowledge through collecting people
      (undated).
Chaos is a new reality for knowledge workers. ScienceWeek (2004) quotes Nigel Calder’s
definition that chaos is “a cryptic form of order.” Chaos is the breakdown of predictability,
evidenced in complicated arrangements that initially defy order. Unlike constructivism,
which states that learners attempt to foster understanding by meaning making tasks, chaos
states that the meaning exists—the learner’s challenge is to recognize the patterns which
appear to be hidden. Meaning-making and forming connections between specialized
communities are important activities.
Luis Mateus Rocha (1998) defines self-organization as the “spontaneous formation of well
organized structures, patterns, or behaviors, from random initial conditions” (p. 3).
Learning, as a self-organizing process, requires that the system (personal or organizational
learning systems) “be informationally open, that is, for it to be able to classify its own
interaction with an environment, it must be able to change its structure . . .” (p. 4). Wiley and
Edwards acknowledge the importance of self-organization as a learning process: “Jacobs
argues that communities self-organize in a manner similar to social insects: instead of
thousands of ants crossing each other’s pheromone trails and changing their behavior
accordingly, thousands of humans pass each other on the sidewalk and change their
behavior accordingly.” Self-organization on a personal level is a micro-process of the larger
self-organizing knowledge constructs created within corporate or institutional
environments. The capacity to form connections between sources of information, and
thereby create useful information patterns, is required to learn in our knowledge economy.
Albert-László Barabási states that “nodes always compete for connections because links
represent survival in an interconnected world” (2002, p. 106). This competition is largely
dulled within a personal learning network, but the placing of value on certain nodes over
others is a reality. Nodes that successfully acquire greater profile will be more successful at
acquiring additional connections. In a learning sense, the likelihood that a concept of
learning will be linked depends on how well it is currently linked. Nodes (can be fields,
ideas, communities) that specialize and gain recognition for their expertise have greater
chances of recognition, thus resulting in cross-pollination of learning communities.
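Barabási's dynamic, in which the likelihood of gaining a link depends on how well linked a node already is, is the preferential-attachment rule from network science. The following is a minimal sketch of that rule, not code from the chapter; the function name and starting configuration are illustrative choices.

```python
# Minimal preferential-attachment sketch: each new node links to one
# existing node chosen with probability proportional to its current
# degree, so well-connected nodes tend to attract further connections.
import random

def grow_network(n_nodes, seed=0):
    """Grow a network one node at a time; return a {node: degree} map."""
    rng = random.Random(seed)
    degree = {0: 1, 1: 1}  # start from a single edge: node 0 -- node 1
    for new in range(2, n_nodes):
        nodes = list(degree)
        weights = [degree[v] for v in nodes]   # "rich get richer"
        target = rng.choices(nodes, weights=weights)[0]
        degree[target] += 1
        degree[new] = 1
    return degree

degrees = grow_network(500)
```

Sorting `degrees` by value shows a few hubs accumulating many links while most nodes keep the single link they arrived with, which mirrors the claim that "nodes always compete for connections" and that well-linked concepts are the ones most likely to be linked again.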
Weak ties are links or bridges that allow short connections between information. Our small
world networks are generally populated with people whose interests and knowledge are
similar to ours. Finding a new job, as an example, often occurs through weak ties. This
principle has great merit in the notion of serendipity, innovation, and creativity. Connections
between disparate ideas and fields can create new innovations.
Connectivism
Connectivism is driven by the understanding that decisions are based on rapidly altering
foundations. New information is continually being acquired. The ability to draw distinctions
between important and unimportant information is vital. The ability to recognize when new
information alters the landscape based on decisions made yesterday is also critical.
Principles of connectivism:

- Learning and knowledge rest in diversity of opinions.
- Learning is a process of connecting specialized nodes or information sources.
- Learning may reside in non-human appliances.
- The capacity to know more is more critical than what is currently known.
- Nurturing and maintaining connections is needed to facilitate continual learning.
- The ability to see connections between fields, ideas, and concepts is a core skill.
- Currency (accurate, up-to-date knowledge) is the intent of all connectivist learning activities.
- Decision-making is itself a learning process. Choosing what to learn and the meaning of incoming information is seen through the lens of a shifting reality. While there is a right answer now, it may be wrong tomorrow due to alterations in the information climate affecting the decision.
Connectivism also addresses the challenges that many corporations face in knowledge
management activities. Knowledge that resides in a database needs to be connected with
the right people in the right context in order to be classified as learning. Behaviorism,
cognitivism, and constructivism do not attempt to address the challenges of organizational
knowledge and transference.
Landauer and Dumais (1997) explore the phenomenon that “people have much more
knowledge than appears to be present in the information to which they have been exposed.”
They provide a connectivist focus in stating “the simple notion that some domains of
knowledge contain vast numbers of weak interrelations that, if properly exploited, can
greatly amplify learning by a process of inference.” The value of pattern recognition and
connecting our own “small worlds of knowledge” are apparent in the exponential impact
provided to our personal learning.
John Seely Brown presents an interesting notion that the internet leverages the small efforts
of many with the large efforts of few. The central premise is that connections created with
unusual nodes support and intensify existing large-effort activities. Brown provides the
example of a Maricopa County Community College system project that links senior citizens
with elementary school students in a mentor program. Because the children “listen to these
‘grandparents’ better than they do their own parents, the mentoring really helps the
teachers . . . the small efforts of the many—the seniors—complement the large efforts of the
few—the teachers” (2002). This amplification of learning, knowledge and understanding
through the extension of a personal network is the epitome of connectivism.
Implications
The notion of connectivism has implications in all aspects of life. This paper largely focuses
on its impact on learning, but areas such as management and leadership; media, news, and
information; personal knowledge management; and the design of learning environments are
also impacted.
Conclusion
The pipe is more important than the content within the pipe. Our ability to learn what we
need for tomorrow is more important than what we know today. A real challenge for any
learning theory is to actuate known knowledge at the point of application. When knowledge,
however, is needed, but not known, the ability to plug into sources to meet the requirements
becomes a vital skill. As knowledge continues to grow and evolve, access to what is needed
is more important than what the learner currently possesses.
Connectivism presents a model of learning that acknowledges the tectonic shifts in society
where learning is no longer an internal, individualistic activity. How people work and
function is altered when new tools are utilized. The field of education has been slow to
recognize both the impact of new learning tools and the environmental changes in what it
means to learn. Connectivism provides insight into learning skills and tasks needed for
learners to flourish in a digital era.
References
Barabási, A. L., (2002) Linked: The New Science of Networks, Cambridge, MA, Perseus
Publishing.
Brown, J. S., (2002). Growing Up Digital: How the Web Changes Work, Education, and the
Ways People Learn. United States Distance Learning Association. Retrieved on December
10, 2004, from https://edtechbooks.org/-Zw
Driscoll, M. (2000). Psychology of Learning for Instruction. Needham Heights, MA, Allyn &
Bacon.
Gleick, J., (1987). Chaos: The Making of a New Science. New York, NY, Penguin Books.
Gonzalez, C., (2004). The Role of Blended Learning in the World of Technology. Retrieved
December 10, 2004 from https://edtechbooks.org/-Pt.
Gredler, M. E., (2005) Learning and Instruction: Theory into Practice—5th Edition, Upper
Saddle River, NJ, Pearson Education.
Kleiner, A. (2002). Karen Stephenson’s Quantum Theory of Trust. Retrieved December 10,
2004 from https://edtechbooks.org/-cA.
Landauer, T. K., & Dumais, S. T. (1997). A Solution to Plato’s Problem: The Latent Semantic
Analysis Theory of Acquisition, Induction and Representation of Knowledge. Retrieved
December 10, 2004 from https://edtechbooks.org/-yt.
Stephenson, K. (Internal Communication, no. 36). What Knowledge Tears Apart, Networks
Make Whole. Retrieved December 10, 2004 from https://edtechbooks.org/-Mg.
Vaill, P. B. (1996). Learning as a Way of Being. San Francisco, CA, Jossey-Bass Inc.
                          George Siemens
Dr. George Siemens is the executive director of the Learning Innovation and
Networked Knowledge (LINK) Research Lab at the University of Texas at
Arlington. He received a Ph.D. from the University of Aberdeen on sensemaking
and wayfinding in complex information settings. He is well known for developing
the learning theory of connectivism, as well as for his pioneering work in
learning analytics and massive open online courses (MOOCs). He was among
the first people ever to design and facilitate a MOOC.
                                            20
Charles M. Reigeluth
Editor’s Note
This article describes instructional theory that supports post-industrial education and
training systems—ones that are customized and learner-centered, in which student progress
is based on learning rather than time. The author discusses the importance of problem-
based instruction (PBI), identifies some problems with PBI, overviews an instructional
theory that addresses those problems, and describes the roles that should be played by the
teacher, the learner, and technology in the new paradigm.
Introduction
One of the few things that practically everyone agrees on in both education and training is
that people learn at different rates and have different learning needs. Yet our schools and
training programs typically teach a predetermined, fixed amount of content in a set amount
of time. Inevitably, slower learners are forced to move on before they have mastered the
content, and they accumulate deficits in their learning that make it more difficult for them
to learn related content in the future. Also, faster learners are bored to frustration and
waste much valuable time waiting for the group to move on—a considerable squander of
talent that our communities, companies, and society sorely need. A system that was truly
designed to maximize learning would not force learners to move on before they had learned
the current material, and it would not force faster learners to wait for the rest of the class.
Our current paradigm of education and training was developed during the industrial age. At
that time, we could not afford to educate or train everyone to high levels, and we did not
need to educate or train everyone to high levels. The predominant form of work was manual
labor. In fact, if we educated everyone to high levels, few would be willing to work on
assembly lines, doing mindless tasks over and over again. What we needed in the industrial
age was an educational system that sorted students—one that separated the children who
should do manual labor from the ones who should be managers or professionals. So the
“less bright” students were flunked out, and the brighter ones were promoted to higher
levels of education. This is why our schools use norm-referenced assessment systems rather
than criterion-referenced assessment—to help sort the students. The same applied to our
training systems. We must recognize that the main problem with our education and training
systems is not the teachers or the students, it is the system—a system that is designed more
for sorting than for learning (see Reigeluth, 1987, 1994, for examples).
Problem-based Instruction
Student engagement or motivation is key to learning. No matter how much work the teacher
does, if the student doesn’t work, the student doesn’t learn. The quality and quantity of
learning are directly proportional to the amount of effort the student devotes to learning.
The industrial-age paradigm of education and training was based on extrinsic motivation,
with grades, study halls, detentions, and in the worst cases repeating a grade or flunking
out.
Furthermore, given the importance of student progress being based on learning rather than
on time, students progress at different rates and learn different things at any given time.
This also lends itself well to PBI, because it is more learner-directed than teacher-directed.
It seems clear that PBI should be used prominently in the new paradigm of education and
training. But there are problems with PBI. I explore those next.
Second, the skills and competencies that we teach through PBI are usually ones that our
learners will need to transfer to a broad range of situations, especially for complex cognitive
tasks. However, in PBI learners typically use a skill only once or twice in the performance of
the project. This makes it difficult for them to learn to use the skill in the full range of
situations in which they are likely to need it in the future. Many skills require extensive
practice to develop to a proficient or expert level, yet that rarely happens in PBI.
Third, some skills need to be automatized in order to free up the expert’s conscious
cognitive processing for higher-level thinking required during performance of a task. PBI
does not address this instructional need.
Finally, much learner time can be wasted during PBI, searching for information and
struggling to learn without sufficient guidance or support. It is often important, not just in
corporate training, but also in K–12 and higher education, to get the most learning in the
least amount of time. Such efficiency is not typically a hallmark of PBI.
Given these four problems with PBI—difficulty ensuring mastery, transfer, automaticity, and
efficiency—does this mean we should abandon PBI and go with direct instruction? To quote
a famous advertisement, “Not exactly.” I now explore this issue.
Research shows that learning a skill is facilitated to the extent that instruction tells the
students how to do it, shows them how to do it for diverse situations, and gives them
practice with immediate feedback, again for diverse situations (Merrill, 1983; Merrill,
Reigeluth, & Faust, 1979), so the students learn to generalize or transfer the skill to the full
range of situations they will encounter in the real world. Each student continues to practice
until she or he reaches the standard of mastery for the skill. Upon reaching the standard,
the student returns to the “project space” where time is unfrozen, to apply what has been
learned to the project and continue working on it until the next learning gap is encountered,
and this learning-doing cycle is repeated.
Well-validated instructional theories have been developed to offer guidance for the design of
both the project space and the instructional space (see Reigeluth, 1999; Reigeluth & Carr-
Chellman, 2009, for examples). In this way we transcend the either/or thinking so
characteristic of industrial-age thinking and move to both/and thinking, which is better
suited to the much greater complexity inherent in the information age—we utilize
instructional theory that combines the best of behaviorist, cognitivist, and constructivist
theories and models. This theory pays attention to mastery of individual competencies, but it
also avoids the fragmentation characteristic of many mastery learning programs in the past.
One of the problems with most PBI (identified earlier) is that students are assessed on the
quality of the team product. Team assessment is important, but you also need individual
assessment, and the instructional space offers an excellent opportunity to meet this need.
Like the project space, the instructional space is performance oriented. The practice
opportunities (offered primarily in a computer simulation for immediate, customized
feedback and authenticity) continue to be offered to a student until the student reaches the
criterion for number of correct performances in a row required by the standard. Formative
evaluation is provided immediately to the student on each incorrect performance. When
automatization of a skill (Anderson, 1996) is important, there is also a criterion for speed of
performance that must be met.
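The mastery rule just described (a required number of correct performances in a row, optionally under a time limit when automaticity matters) can be sketched as a simple loop. The function name, attempt format, and thresholds below are illustrative assumptions, not the chapter's specification.

```python
# Sketch of the mastery criterion described above: practice continues
# until the learner gives `streak_needed` correct responses in a row.
# When automatization matters, each response must also come within
# `max_seconds`. Attempt format is assumed: (correct, seconds) pairs.

def reached_mastery(attempts, streak_needed=3, max_seconds=None):
    """Return True once a qualifying streak appears in `attempts`."""
    streak = 0
    for correct, seconds in attempts:
        fast_enough = max_seconds is None or seconds <= max_seconds
        streak = streak + 1 if (correct and fast_enough) else 0
        if streak >= streak_needed:
            return True
    return False

# Three in a row passes; an error resets the streak:
print(reached_mastery([(True, 4), (True, 5), (True, 3)]))              # True
print(reached_mastery([(True, 4), (False, 5), (True, 3), (True, 2)]))  # False
# With a speed criterion, a slow-but-correct answer also resets it:
print(reached_mastery([(True, 9), (True, 1), (True, 1), (True, 1)],
                      max_seconds=2))                                  # True
```

The speed branch captures the automatization point: a learner who is accurate but slow has not yet automatized the skill, so the streak does not count until responses are both correct and fast.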
In this manner, student assessment is fully integrated into the instruction, and there is no
need for separate tests.
There is much validated guidance for the design of the project space, including universal
and situational principles for the project space (see, e.g., Barrows, 1986; Barrows &
Tamblyn, 1980; Duffy & Raymer, 2010; Savery, 2009). They include guidance for selection
of a good problem or project, formation of groups, facilitation of higher learning by a tutor,
use of authentic assessment, and use of thorough debriefing activities. Computer-based
simulations are often highly effective for creating and supporting the project environment,
but the project space could be comprised entirely of places, objects, and people in the real
world (in which case the instructional space could be accessed on a mobile device), or it
could be a combination of virtual and real-world environments. STAR LEGACY (Schwartz,
Lin, Brophy, & Bransford, 1999) is a good example of a computer-based simulation for the
project space.
Selection of instructional strategies in the instructional space is primarily based on the type
of learning (the ends of instruction) involved (see Unit 3 in Reigeluth & Carr-Chellman,
2009). For memorization, drill and practice is most effective (Salisbury, 1990), including
chunking, repetition, prompting, and mnemonics. For application (skills), tutorials with
generality, examples, practice, and immediate feedback are most effective (Merrill, 1983;
Romiszowski, 2009). For conceptual understanding, connecting new concepts to existing
concepts in a student’s cognitive structures requires the use of such methods as analogies,
context (advance organizers), comparison and contrast, analysis of parts and kinds, and
various other techniques based on the dimensions of understanding required (Reigeluth,
1983). For theoretical understanding, causal relationships are best learned through
exploring causes (explanation), effects (prediction), and solutions (problem solving); and
natural processes are best learned through description of the sequence of events in the
natural process (Reigeluth & Schwartz, 1989). These sorts of instructional strategies have
been well researched for their effectiveness, efficiency, and appeal. And they are often best
implemented through computer-based tutorials, simulations, and games.
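The paragraph above amounts to a small lookup from type of learning to preferred strategy. The data structure below simply restates the text's pairings; the variable and function names are illustrative conveniences, not part of the chapter.

```python
# Strategy selection keyed by the type of learning (the "ends" of
# instruction), paraphrasing the pairings given in the text above.
STRATEGY_BY_LEARNING_TYPE = {
    "memorization": "drill and practice: chunking, repetition, prompting, mnemonics",
    "application": "tutorials: generality, examples, practice, immediate feedback",
    "conceptual understanding": "analogies, advance organizers, compare/contrast, "
                                "analysis of parts and kinds",
    "theoretical understanding": "explore causes, effects, and solutions; "
                                 "describe sequences in natural processes",
}

def pick_strategy(learning_type):
    """Look up the recommended strategy for a given type of learning."""
    return STRATEGY_BY_LEARNING_TYPE[learning_type.strip().lower()]

print(pick_strategy("Memorization"))
```

Framing the guidance as a table makes the design rule explicit: the instructional space selects strategies by the type of learning required, not by the subject matter or the teacher's preference.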
This is one vision of instructional theory for the post-industrial paradigm of instruction. I
encourage the reader to try to think of additional visions that meet the needs of the post-
industrial era: principally intrinsic motivation, customization, attainment-based student
progress, collaborative learning, and self-directed learning. To do so, it may be helpful to
consider the ways that roles are likely to change in the new paradigm of instruction.
The teacher’s role has changed dramatically in the new paradigm of instruction from the
“sage on the stage” to the “guide on the side.” I currently see three major roles involved in
being a guide on the side. First, the teacher is a designer of student work (Schlechty, 2002).
The student work includes that which is done in both the project space and the instructional
space. Second, the teacher is a facilitator of the learning process. This includes helping to
develop a personal learning plan, coaching or scaffolding the student’s learning when
appropriate, facilitating discussion and reflection, and arranging availability of various
human and material resources. Third, and perhaps most important in the public education
sector, the teacher is a caring mentor, a person who is concerned with the full, well-rounded
development of the student.
Teacher as designer, facilitator, and mentor are only three of the most important new roles
that teachers serve, but not all teachers need to perform all the roles. Different kinds of
teachers with different kinds and levels of training and expertise may focus on one or two of
these roles (including students as teachers—see next section).
First, learning is an active process. The student must exert effort to learn. The teacher
cannot do it for the student. This is why Schlechty (2002) characterizes the new paradigm
as one in which the student is the worker and the teacher is the designer of the
student’s work.
Second, to prepare the student for lifelong learning, the teacher helps each student to
become a self-directed and self-motivated learner. Students are self-motivated to learn
from birth until they first go to school. The industrial-age paradigm
systematically destroys that self-motivation by removing all self-direction and giving
students boring work that is not relevant to their lives. In contrast, the post-industrial
system is designed to nurture self-motivation through self-direction and active learning
in the context of relevant, interesting projects. Student motivation is key to educational
productivity and helping students to realize their potential. It also greatly reduces discipline
problems, drug use, and much more.
Third, it is often said that the best way to learn something is to teach it. Students are
perhaps the most under-utilized resource in our school systems. Furthermore, someone who
has just learned something is often better at helping someone else learn it than someone
who learned it long ago. In addition to older students teaching slightly younger ones, peers
can learn from each other in collaborative projects, and they can also serve as peer tutors.
Therefore, new student roles include student as worker, self-directed learner, and teacher.
I currently see four main roles for technology to make the new paradigm of instruction
feasible and cost-effective. These roles were first described by Reigeluth and colleagues
(Reigeluth & Carr-Chellman, 2009; Reigeluth et al., 2008). They include record keeping for
student learning, planning for student learning, instruction for student learning, and
assessment for/of student learning. These four roles are seamlessly integrated in a special
kind of learning management system called a Personalized Integrated Educational System.
These four roles are equally relevant in K–12 education, higher education, corporate
training, military training, and education and training in other contexts.
[Image: Military personnel, one using a computer-based simulation while the other is on standby (https://edtechbooks.org/-oSW)]
It should be apparent that technology will play a crucial role in the success of the post-
industrial paradigm of education. It will enable a quantum improvement in student learning,
and likely at a lower cost per student per year than in the current industrial-age paradigm.
Just as the electronic spreadsheet made the accountant’s job quicker, easier, less expensive,
and more enjoyable, so the kind of technology system described here will make the
teacher’s job quicker, easier, less expensive, and more enjoyable. But the new paradigm of
instructional theory plays an essential role for technology to realize its potential
contribution.
Conclusion
While much instructional theory has been generated to guide the design of the new
paradigm of instruction, much remains to be learned. We need to learn how to better
address the strong emotional basis of learning (Greenspan, 1997), foster emotional and
social development, and promote the development of positive attitudes, values, morals, and
ethics, among other things. It is my hope that you, the reader, will rise to the challenge and
help further advance the knowledge we need to greatly improve our ability to help every
student reach his or her potential.
Application Exercises
          In the section on “New Roles for Students,” Reigeluth describes three new
          major tasks for students in the post-industrial theory. How would your
          schooling (K-12) have been different if your school system had included these
          major tasks? Write a paragraph explaining some of the changes.
          Which of the four challenges facing PBI seems to be the most difficult to
          overcome? Explain your reasoning.
          Reigeluth suggests four areas where technology can greatly assist teachers
          in supporting student learning: (1) record keeping, (2) planning, (3)
          instruction, and (4) assessment. Choose one of these areas and discuss how
          technology could improve the way it is done to better support student
          learning.
          What aspects of our current learning system are still from the industrial era?
          Now share your thoughts on what you would recommend to change that to a
          post-industrial environment.
References
Anderson, J. R. (1996). The architecture of cognition. Mahwah, NJ: Lawrence Erlbaum
Associates.
Duffy, T. M., & Raymer, P. L. (2010). A practical guide and a constructivist rationale for
inquiry-based learning. Educational Technology, 50(4), 3–15.
Greenspan, S. I. (1997). The growth of the mind and the endangered origins of intelligence.
Reading, MA: Addison-Wesley Publishing Company.
Merrill, M. D., Reigeluth, C. M., & Faust, G. W. (1979). The Instructional Quality Profile: A
curriculum evaluation and design tool. In H. F. O’Neil, Jr. (Ed.), Procedures for instructional
systems development. New York: Academic Press.
Reigeluth, C. M. (1987). The search for meaningful reform: A third-wave educational system.
Journal of Instructional Development, 10(4), 3–14.
Reigeluth, C. M., & Schwartz, E. (1989). An instructional theory for the design of computer-
based simulations. Journal of Computer-Based Instruction, 16(1), 1–10.
Reigeluth, C. M., Watson, S. L., Watson, W. R., Dutta, P., Chen, Z., & Powell, N. (2008).
Roles for technology in the information-age paradigm of education: Learning management
systems. Educational Technology, 48(6), 32–39.
Salisbury, D. F. (1990). Cognitive psychology and its implications for designing drill and
practice programs for computers. Journal of Computer-Based Instruction, 17(1), 23–30.
Schlechty, P. (2002). Working on the work. New York: John Wiley & Sons.
Schwartz, D. L., Lin, X., Brophy, S., & Bransford, J. D. (1999). Toward the development of
flexibly adaptive instructional designs. In C. M. Reigeluth (Ed.), Instructional-design
theories and models: A new paradigm of instructional theory (Vol. II, pp. 183–213). Mahwah,
NJ: Lawrence Erlbaum Associates.
*I use the term “problem-based instruction” rather than “problem-based learning” because
the latter (PBL) is what the learner does, whereas the former (PBI) is what the teacher or
instructional system does to support the learning. Furthermore, I use the term PBI broadly
to encompass instruction for project-based learning and inquiry learning.
Further Resources
    In 2012, Harvard’s Clayton Christensen gave the Cluff Lecture at Brigham Young
    University on the topic of disruptive innovation, which can be viewed at
    https://vimeo.com/39639865. This lecture draws on ideas from his many books,
    including Disrupting Class, [https://edtechbooks.org/-se] which has many parallels
    to Dr. Reigeluth’s ideas.
                      Charles M. Reigeluth
                                             21
M. David Merrill
For over 50 years my career has been focused on one very important question: “What makes
instruction effective, efficient, and engaging?” I decided that e-learning should refer to the
quality of the instruction, not merely to how it is delivered, so I labeled effective, efficient,
and engaging instruction as e3 instruction. In this brief presentation I will try to share a
little of what I’ve learned. Perhaps the underlying message of my studies and this
presentation is this simple statement: “Information alone is not instruction!”
In 1964, in our research lab at the University of Illinois, we were sending messages from
one computer to another via ARPANET. Little did we realize the fantastic potential of this
experimental communication from computer to computer. Unfortunately for our subsequent
fortunes, none of us in that lab envisioned the Internet and the World Wide Web and the
impact that this invention would have on communication, the availability of information,
social interaction, commerce, education, and almost every other aspect of our lives.
But today, thanks to the Internet, interested learners can find information about almost
anything in the world, whether current events or historical events. Teaching American
history to junior high students today would be so much easier because of the almost
unlimited amount of information in all different media that is available, including audio,
video, animation, as well as text. But is access to this wealth of information instruction?
What I’ve learned from my study of this question is that the answer is an emphatic NO! I
repeat, Information alone is not instruction.
Motivation
All of us have heard the saying that “students didn’t learn because they just weren’t
motivated.” Or that “motivation is the most important part of learning.” Or “we really need
to find a way to motivate our students.” What is it that causes motivation? People have often
asked me, “Is motivation one of your first principles of instruction?” The answer is no;
motivation is not something we can do; motivation is an outcome. So, if it is an outcome,
what causes motivation? Motivation comes from learning; the greatest motivation comes
when people learn. We are wired to learn; all of us love to learn; every student loves to
learn. And, generally, we are motivated by those things that we find we are good at. For
example, I’m not much of an athlete. I look back on my past and ask, “Why am I not an
athlete?” I remember that I was very small as a child. In my elementary school we used to
divide up into teams during recess to play softball. I always ended up as last shag on the
girls’ team. That was very embarrassing for me, so, I lost interest in sports; I did not want to
be a sports person. Consequently, I never pursued sports. On the other hand, somewhere in
my youth I was given a scale model train. I was very interested in trains, but in this case one
of my father’s friends showed me how to build scenery and how to make a model railroad
that looked like the real world. I became very interested in building a model railroad. I have
continued to follow this interest throughout my life. Why was I motivated to do this?
Because I was good at it, because I learned things about how to build a realistic model. The
more I learned, the more interested I became. We need to find ways to motivate our
students, and that comes from promoting learning. Learning comes when we apply the
effective and engaging principles of instruction.
The course or module consists of a list of topics representing the content of the course.
Information about the topic is presented, represented by the arrows. Occasionally a quiz or
exercise is inserted to help illustrate the topic, represented by the boxes. The sequence is to
teach one topic at a time. At the end of the course or module there is a culminating final
test, or in some cases a final project, that asks the students to apply the topic to complete
some task or solve some problem.
Sometimes this sequence is very effective in enabling students to gain skills or to learn to
solve problems. Too often, however, this sequence is ineffective and not engaging for
students. The effectiveness of this sequence and the degree of engagement it promotes for
learners depend on the type of learning events that are represented by the arrows and the
boxes in this diagram.
Instructional Events
There are many different types of instructional or learning events. Perhaps the most
frequently used learning event is to present information or Tell. This Tell can take many
forms, including lectures, videos, textbooks, and PowerPoint presentations.
The next most frequent instructional or learning event is to have learners remember what
they were told, what they read, or what they saw. This remember instructional event we will
label as Ask. Even though Tell and Ask are the most frequently used instructional events, if
they are the only instructional events used then the Tell–Ask instructional sequence is the
least effective instructional strategy.
If the arrows in Figure 1 represent Tell learning events and the boxes represent Ask
learning events, then this module is not going to be very effective and most likely will not
prepare learners to adequately complete a project using the information taught. If the
culminating learning activity is an Ask final exam, learners may be able to score well on this
exam. However, a good score on an Ask exam does little to prepare learners to apply the
ideas taught to the solution of a complex problem or completion of a complex task.
I took the challenge and spent the next year or two studying these various instructional
theories. The result was the publication in 2002 of my often-referenced paper on First
Principles of Instruction (Merrill, 2002). I have spent the time since refining my
proposition in a series of papers and chapters on First Principles. In 2013, I finally published
my book First Principles of Instruction (Merrill, 2013) that elaborated these principles,
provided a set of suggestions for how these principles might be implemented in various
models of instruction, and provided a variety of instructional samples that illustrate the
implementation of First Principles in a range of content areas and in different educational
contexts, including training, public schools, and higher education.
Activation: Learning is promoted when learners activate a mental model of their prior
knowledge as a foundation for new skills. A frequently cited axiom of education is to start
where the learner is. Activation is the principle that attempts to activate a relevant mental
model already acquired by the learner in order to assist him or her to adapt this mental
model to the new skills to be acquired.
Integration: Learning is promoted when learners share, reflect on, and defend their work by
peer-collaboration and peer-critique. Deep learning requires learners to integrate their
newly acquired skills into those mental models they have already acquired. One way to
ensure this deep processing is for learners to collaborate with other learners in solving
problems or doing complex tasks. Another learning event that facilitates deep processing is
when learners go public with their knowledge in an effort to critique other learners or to
defend their work when it is critiqued by other learners.
Do First Principles of Instruction actually promote more effective, efficient, and engaging
instruction?
A study conducted by NETg (Thompson Learning, 2002), a company that sells instruction to
teach computer applications, compared their off-the-shelf version of their Excel instruction,
which is topic-centered, with a problem-centered version of this course that was developed
following First Principles. Participants in the experiment came from a number of different
companies that were clients of NETg. The assessment for both groups consisted of
developing a spreadsheet for three real-world Excel problems. The problem-centered group
scored significantly higher, required significantly less time to complete the problems, and
expressed a higher level of satisfaction than the topic-centered group. All differences were
statistically significant beyond the .001 level.
The conclusion that can be drawn from these three different and independent studies of
First Principles is clear: courses based on First Principles do facilitate effectiveness,
efficiency, and learner satisfaction.
Demonstration Principle
When I’m asked to review course material, my approach is to immediately turn to Module 3
of the material. By then the course is usually into the heart of the content, and the
introductory material is finished. What do I look for first? Examples. Does the content
include examples, demonstrations, or simulations of the ideas being taught? Adding
demonstration to a course will result in a significant increment in the effectiveness of the
course.
Do most courses include such demonstration? MOOCs are a recent very popular way to
deliver instruction. How well do these Massive Open Online Courses implement First
Principles of Instruction? Anoush Margaryan and her colleagues (Margaryan, Bianco, &
Littlejohn, 2015) published an important paper titled Instructional Quality of Massive Open
Online Courses (MOOCs) that addresses this question. They carefully analyzed 76 MOOCs
representing a wide variety of content sponsored by a number of different institutions to
determine the extent that these courses implemented First Principles of Instruction. Their
overall conclusion was that most of these courses failed to implement these principles.
Application Principle
When I’m asked to review a course, the second type of learning event I look for is
application that is consistent with and appropriate for the type of learning involved.
Remembering a definition or series of steps is not application. There are two types of
application that are most important but too often not included. DOid or DOidentify requires
learners to recognize new divergent examples of an object or event when they encounter it.
DOidentify is also the initial application required when learning the steps of a procedure or
process. The learner must first recognize a correctly executed step when they see it, and
they must also recognize the consequence that resulted from the execution of the step. Once
they can recognize appropriate steps and appropriate consequences for these steps, then
DOexecute is the next level of application. DOexecute requires learners to actually perform
or execute the steps of a procedure. When appropriate application is missing, adding
appropriate application learning events significantly increases the effectiveness of a
course.
MOOCs are often about teaching learners new skills. Did the MOOCs in the study cited
above include appropriate application for these skills? They fared better than they did for
demonstration. At least 46 of the 76 MOOCs did include some form of application. This still
leaves 30 MOOCs in this study without application of any kind. However, on careful analysis
of the sufficiency and appropriateness of the application included, it was found that only 13
of the MOOCs in this study had appropriate and sufficient application.
Learning Events
While Tell and Ask are the most frequently used learning events, as we have seen, a strategy
that uses only these two learning events is not an effective or engaging strategy. Learning
to solve problems and to do complex tasks is facilitated when a Tell instructional strategy is
enhanced by adding demonstration or Show learning events. A Tell-Show sequence is more
effective than a Tell-only sequence.
Learning to solve problems and to do complex tasks is facilitated even more when a Tell-
Show strategy is further enhanced by adding Do instructional events. These Do learning
events are most appropriate when they require learners to identify unencountered instances
of some object or event (DOidentify learning events) and when they require learners to
execute the steps in a procedure or observe the steps in a process (DOexecute learning
events). A Tell-Show-Do sequence is even more effective than a Tell-Show instructional
sequence.
Second, identify the Tell information for each topic and reference it in the Tell column.
Review this information to ensure that each topic is accurate and sufficient for the goals of
the instruction.
Third, identify existing Show learning events for each topic. If the existing instruction does
not include appropriate or sufficient examples of each of the concepts, principles,
procedures, or processes listed, then identify or create appropriate examples for inclusion in
the module. Creating a matrix to use as a cross reference for the new content examples can
help identify areas where new activities need to be placed in the course.
Fourth, identify existing Do learning events for each topic. If the existing instruction does
not include appropriate or sufficient Do learning events, then identify or create appropriate
Do-identify or Do-execute learning events for inclusion in the module.
Finally, assemble the new demonstrations and applications into the module for more
effective, efficient, and engaging instruction.
Even after appropriate demonstration and application learning events are added to this
traditional instructional sequence, there is still a potential problem that keeps this
instructional sequence from being as effective, efficient, and engaging as possible. In this
sequence topics are taught one-by-one. The demonstration and application learning events
added to a Tell sequence are usually examples that apply to only a single component skill
and are merely a small part of solving a whole problem. Too often learners fail to see the
relevance of some of these individual skills learned out of context. We have all experienced
the often-used explanation: “You won’t understand this now, but later it will be very
important to you.” If “later” in this situation is several days or weeks, there is a good
possibility that the learners will have forgotten the component skill before they get to
actually use this skill in solving a whole problem or doing a whole task. Or, if learners do not
see the relevance of a particular skill they may fail to actually learn the skill or they are
unable to identify a mental model into which they can incorporate this skill. Then, when it is
time to use this skill in the solution of a whole problem, learners are unable to retrieve the
skill because it was merely memorized rather than understood. Furthermore, if solving a
whole problem or doing a whole task is the final project for a module or course, there may
be no opportunity to get feedback and revise the project.
Is there a better sequence that is more effective, efficient, and engaging than this typical
sequence?
Problem-centered
To maximize engagement in learning a new problem solving skill, learners need to acquire
these skills in the context of the problem they are learning to solve or the task they are
learning to complete. If learners first activate a relevant mental model (activation principle)
and then are shown an example of the problem they will learn to solve and how to solve this
problem, they are more likely to see the relevance of each individual component skill when
it is taught, and they will have a framework into which they can incorporate this new skill,
greatly increasing the probability of efficient retrieval and application when they are
confronted with a new instance of the problem.
A typical instructional sequence is topic-centered; that is, each topic is taught one-by-one,
and then at the end of the module or course learners are expected to apply each of these
topics in the solution of a final problem or the completion of a final task. Figure 2 illustrates
a problem-centered sequence that turns this sequence around. Rather than telling an
objective for the module, which is a form of information, the (1) first learning activity is to
show a whole instance of the problem that learners are being taught to solve. This
demonstration also provides an overview of the solution to the problem or the execution of
the task. (2) Students are then told information about the component skills necessary for the
solution of this instance of the problem and (3) shown how each of these component skills
contributes to the solution of the problem. (4) After this Tell–Show demonstration for the
first instance of the problem is complete, a second problem instance is identified and shown
to learners. (5) The learner is then required to apply the previously acquired component
skills to this second problem (Do). (6) Some of the component skills may require some
additional information or a different way of using the skill to solve this second instance of
the problem. Learners are then told this new information and (7) shown its application to
another instance of the problem. Note that the Tell-Show-Do for each component skill or
topic is now distributed across different instances of the problem. The first instance of the
problem was primarily Tell-Show. The second instance of the problem is a combination of
Tell-Show for new parts of each component skill and Do for those component skills already
acquired. (8) Additional instances of the problem are identified. Learners are shown any new
aspects of the component skills (Tell-Show) and apply those skills already acquired (Do) for each new
instance of the problem. The sequence is complete when learners are required to solve a
new instance of the problem without additional guidance.
In a problem-centered instructional sequence learners are more likely to see the relevance
of each new component skill. This sequence will provide multiple opportunities for learners
to apply these newly acquired component skills in the context of real instances of the
problem. It enables learners to see the relationship among the individual component skills in
the context of each new instance of the problem. It also provides gradually diminishing
guidance to learners until they are able to solve a new instance of the problem with little
guidance.
Instruction that is revised to include a Tell-Show-Do sequence of learning events all in the
context of solving a progression of instances of a whole problem or a whole task has the
potential of maximally engaging students while providing efficient and effective learning
activities.
Recommendation
In summary: Designers may want to analyze their courses. Perhaps the effectiveness,
efficiency, and especially the engagement of a course may be enhanced by adding
appropriate demonstration and application and by using a problem-centered instructional
sequence. Does the course include appropriate and adequate demonstration? Does it include
appropriate and adequate application? Are the skills taught in the context of an increasingly
complex progression of instances of the problem?
Conclusion
Motivation is an outcome, not a cause. What promotes engagement and hence motivation?
Effective, efficient, and engaging instruction. What promotes effective, efficient, and
engaging instruction? First Principles of Instruction: Activation, Demonstration, Application,
Integration, and Problem-centered. In this paper we have emphasized the demonstration
and application principles and a problem-centered instructional sequence.
References
Frick, T., Chadha, R., Watson, C., & Zlatkovska, E. (2010). Improving course evaluations to
improve instruction and complex learning in higher education. Educational Technology
Research and Development, 58, 115-136.
Thompson Learning (2002). Thompson Job Impact Study: The Next Generation of Learning.
Retrieved from
http://www.delmarlearning.com/resources/Job_Impact_Study_whitepaper.pdf.
                           M. David Merrill
                                   III. Design
Andrew Gibbons and Vic Bunderson (2005) wrote a classic article on three ways of seeking
knowledge about the real world: through exploration (often with qualitative research
methods), through explanation (often with quantitative methods), and through design. As LIDT
professionals, we consider design and design knowledge to be core to our work, and key to
our understanding of teaching and learning. At our core, we are interventionists: we do not
simply observe the world, but seek to influence it in effective ways. This is done through
design processes and design research, which is the focus of this section. This section begins
with a chapter on classic instructional design approaches, followed by a look at more
current perspectives on design thinking and agile design. You will also read about some
current issues in the field around design, including design mindsets, design-based research,
how to design for effective systemic change, makerspace design, and user experience
design. Included also is a chapter on Human Performance Technology, which is a similar
field to our own, applying many of the same skill sets and knowledge bases in slightly
different ways to the world of corporate learning.
References
Gibbons, A. S., & Bunderson, C. V. (2005). Explore, explain, design. Encyclopedia of Social
Measurement, 1, 927–938.
                                             22
Tonia A. Dousay
Researchers and practitioners have spent the past 50 years attempting to define and create
models of design with the intent to improve instruction. As part of a joint, inter-university
project, Barson (1967) defined instructional development as the systematic process for
improving instruction. Perhaps most interesting about this project and subsequent report is
the caution that many different conditions influence learning, including the use of media,
and that generalizing any sort of model would potentially be hazardous at best and
disastrous at worst. Shortly thereafter, however, Twelker, Urbach, and Buck (1972) noted
that a systematic approach to developing instruction was an increasingly popular idea, but
cautioned that instructional design (ID) methods varied from simple to complex. These
historical observations predicted the reality that every instructional design project is
unique, with no two projects ever progressing through the process identically. These
differences, sometimes subtle while at other times significant, have given way to literally
dozens of different models used with varying popularity in a wide variety of learning
contexts.
Figure 1. Mushrooms
In the midst of this explosion of models and theories, Gustafson (1991) drafted his first
monograph that would go on to become the Survey of Instructional Development Models,
now in its fifth edition (Branch & Dousay, 2015). The book provides brief overviews of
instructional design models, classifying them within the context of classroom product- and
process-oriented instructional problems. The Surveys book provides a concise summary to
help beginning instructional designers visualize the different design approaches as well as
assist more advanced instructional designers. However, this text is just one of many often
used in the study and practice of instructional design, and those seeking to expand their
knowledge of design process can learn much from the rich history and theoretical
development over decades in our field. (See Resources section for suggestions.) In this
chapter, we explore a brief history of instructional design models, common components of
models, commonly referenced models, and resources and advice for instructional designers
as they engage in the instructional design process.
Historical Context
The field of Learning and Instructional Design Technology (LIDT) has had many periods of
rapid development. Reiser (2001) noted that training programs during World War II sparked
the efforts to identify efficient, systematic approaches to learning and instructional design.
It would be another 20 years before the first models emerged, but during the 1960s and
1970s, instructional technology and design processes began to be distilled from
conversations about multimedia development (Reiser, 2017), which in turn produced more than three dozen
different instructional design models referenced in the literature between 1970 and 2005
(Branch & Dousay, 2015; Gustafson, 1991; Gustafson & Branch, 1997, 2002). These
models help designers, and sometimes educational stakeholders, simplify the complex
reality of instructional design and apply generic components across multiple contexts
(Gustafson & Branch, 2002), thus creating standardized approaches to design within an
organization. In turn, Molenda (2017) noted that the standardization of processes and
terminology triggered interest in the field. Thus, an interesting relationship exists between
defining the field of instructional design and perpetuating its existence. As designers seek to
justify their role in education, whether K-12, higher education, or industry, they often refer
to existing models or generate a new model to fit their context. These new models then
become a reference point for other designers and/or organizations.
Despite some claims that classic instructional design is dead, or at least seriously ill (Gordon
& Zemke, 2000), there remains considerable interest in and enthusiasm for its application
(Beckschi & Doty, 2000). This dichotomous view reflects the ongoing debate
between the theory of instructional design and its practice and application. On one hand,
scholars and faculty in higher education often continue to research and practice based upon
historical foundations. On the other hand, scholars and practitioners in industry often
eschew the traditional literature, favoring instead more business-oriented practices.
Looking at the authors of various texts consulted in higher education (see Branch, 2009;
Carr-Chellman & Rowland, 2017; Richey, Klein, & Tracey, 2010 for examples) versus those
consulted in industry (see Allen & Seaman, 2013; Biech, 2014; Carliner, 2015; Hodell, 2015
for examples) confirms this dichotomy. New professionals entering the field should be
aware of this tension and how they may help mitigate potential pitfalls from focusing either
too much on foundational theory or too much on practitioner wisdom. Both are essential to
understanding how to design instruction for any given audience.
Notice the use of the phrase process rather than model. For instructional design purposes, a
process is defined as a series of steps necessary to reach an end result. Similarly, a model is
defined as a specific instance of a process that can be imitated or emulated. In other words,
a model seeks to personalize the generic into distinct functions for a specific context. Thus,
when discussing the instructional design process, we often refer to ADDIE as the
overarching paradigm or framework by which we can explain individual models. The
prescribed steps of a model can be mapped or aligned back to the phases of the ADDIE process.
Consider the following examples. The Plan, Implement, Evaluate (PIE) model from Newby,
Stepich, Lehman, and Russell (1996) emphasizes considering how technology assists with
instructional design, focusing on the what, when, why, and how. During planning, designers
work through a series of questions related to the teacher, learner, and technology
resources, answering them while also taking into consideration the implementation and
evaluation components of the instructional problem. The planning phase produces an
artifact or plan that is then put into action during implementation, followed by evaluation of
both learner performance and instructional effectiveness. When considered through the
lens of the ADDIE process, PIE combines the analyzing, designing, and developing phases
into a singular focus area, which is somewhat illustrated by the depiction
in Figure 3. Similarly, the Diamond (1989) model prescribes two phases: “Project Selection
and Design” and “Production, Implementation, and Evaluation for Each Unit.” Phase I of the
Diamond model essentially combines analyzing and designing, while Phase II combines
developing, implementing, and evaluating. (See Figure 4 for a depiction of the model.)
Diamond placed an emphasis on the second phase of the model by prescribing an in-depth,
parallel development system to write objectives, design evaluation instruments, select
instructional strategies, and evaluate existing resources. New resources are then produced
with the previously designed evaluation instruments in mind. The evaluation is again
consulted during the implementation, summative
evaluation, and revision of the instructional system. These two examples help demonstrate
what is meant by ADDIE being the general process and models being specific applications.
(For further discussion of how aspects of specific models align with the ADDIE process, see
Dousay and Logan (2011).)
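The mapping just described can be sketched as a simple data structure. The groupings below are an illustration drawn from the PIE and Diamond examples above; the model names and phase labels come from the chapter, but the code itself is only a teaching aid, not a formal taxonomy:

```python
# Illustrative sketch: specific ID models expressed as groupings of the
# generic ADDIE phases, following the PIE and Diamond examples above.
ADDIE = ["Analyze", "Design", "Develop", "Implement", "Evaluate"]

MODELS = {
    # Newby, Stepich, Lehman, & Russell (1996): Plan, Implement, Evaluate
    "PIE": {
        "Plan": ["Analyze", "Design", "Develop"],
        "Implement": ["Implement"],
        "Evaluate": ["Evaluate"],
    },
    # Diamond (1989): two phases
    "Diamond": {
        "Project Selection and Design": ["Analyze", "Design"],
        "Production, Implementation, and Evaluation": ["Develop", "Implement", "Evaluate"],
    },
}

def covers_addie(model):
    """Return True if a model's phases collectively span all of ADDIE."""
    covered = {phase for phases in model.values() for phase in phases}
    return covered == set(ADDIE)

for name, model in MODELS.items():
    print(name, "spans ADDIE:", covers_addie(model))
```

Expressing the models this way makes the chapter's point concrete: each model partitions the same five ADDIE phases differently, yet a complete model still spans all of them.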
This discussion might also be facilitated with a business example. Consider the concept of
process mapping; it helps organizations assess operational procedures as they are currently
practiced (Hunt, 1996). Mapping the process analytically to identify the steps carried out in
practice leads to process modeling, an exercise in optimization. In other words, modeling
helps move processes to a desired state tailored to the unique needs of an organization.
Many businesses of a similar type find that they have similar processes. However, through
process modeling, their processes are customized to meet their needs.
The relationship between ADDIE and instructional design models functions much like this
business world scenario. As instructional designers, we often follow the same process
(ADDIE). However, through modeling, we customize the process to meet the needs of our
instructional context and of our learners, stakeholders, resources, and modes of delivery.
Models assist us in selecting or developing appropriate operational tools and techniques as
we design.
Models
Because there are so many different ID models, how do we choose which one to use? In
framing this conversation, the Survey of ID models (Branch & Dousay, 2015) serves as a
foundation, but by no means should be the sole reference. A total of 34 different
instructional design models (see Table 1 for a summary) have been covered in the Survey
text since its first edition, though the list does not include every model. Still, it offers a
concise guide to some of the more common approaches to
instructional design.
Table 1
When considering the models featured in Table 1, determining which one to use might best
be decided by taking into account a few factors. First, what is the anticipated delivery
format? Will the instruction be synchronous online, synchronous face to face, asynchronous
online, or some combination of these formats? Some models are better tailored for online
contexts, such as Dick and Carey (1978); Bates (1995); Dabbagh and Bannan-Ritland (2004);
or Morrison, Ross, Kalman, and Kemp (2012). Another way to think about how to
select a model involves accounting for the context or anticipated output. Is the instruction
intended for a classroom? In that case, consider Gerlach and Ely (1971); ASSURE
(Smaldino, Lowther, Mims, & Russell, 2015); PIE (Newby et al., 1996); UbD (Wiggins &
McTighe, 2000); 4C/ID (van Merriënboer & Kirschner, 2007); or 3PD (Sims & Jones, 2002).
Perhaps the instructional context involves producing an instructional product handed over
to another organization or group. In this case, consider Bergman and Moore (1990); de
Hoog et al. (1994); Nieveen (1997); Seels and Glasgow (1997); or Agile (Beck et al., 2001).
Lastly, perhaps your context prescribes developing a system, such as a full-scale curriculum.
These instructional projects may benefit from the IPISD (Branson et al., 1975); Gentry
(1993); Dorsey et al. (1997); Diamond (1989); Smith and Ragan (2004); or Pebble in the
Pond (Merrill, 2002) models. Deciding which model to use need not be a cumbersome or
overwhelming process. So long as a designer can align components of an instructional
problem with the priorities of a particular model, they will likely be met with success
through the systematic process.
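The selection heuristic described above can be summarized as a small lookup table. This is a hedged sketch: the context categories and candidate lists simply restate the groupings named in this section, not an authoritative rule:

```python
# Sketch of the model-selection heuristic above: match the anticipated
# context or output to candidate models named in this chapter. The
# groupings restate the text; they are a starting point, not a rule.
CANDIDATES = {
    "online": ["Dick & Carey", "Bates", "Dabbagh & Bannan-Ritland",
               "Morrison, Ross, Kalman, & Kemp"],
    "classroom": ["Gerlach & Ely", "ASSURE", "PIE", "UbD", "4C/ID", "3PD"],
    "product": ["Bergman & Moore", "de Hoog et al.", "Nieveen",
                "Seels & Glasgow", "Agile"],
    "system": ["IPISD", "Gentry", "Dorsey et al.", "Diamond",
               "Smith & Ragan", "Pebble in the Pond"],
}

def suggest_models(context):
    """Return candidate models for a context, or an empty list if unknown."""
    return CANDIDATES.get(context, [])

print(suggest_models("classroom"))
```

In practice a designer weighs several of these factors at once, so the lookup is best read as a summary of the paragraph above rather than a decision procedure.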
Other ID Models
While we cannot possibly discuss all of the ID models used in practice and/or referenced in
the literature, there are a few other instructional design models that are useful to mention
because of their unique approaches to design. For example, Plomp’s (1982) OKT model (see
Figure 5), which is taught at the University of Twente in The Netherlands, looks quite
similar to the ADDIE process, but adds testing/revising the instructional solution prior to full
implementation. When OKT was initially introduced, online or web-based instructional
design had not yet become part of the conversation. Yet, his model astutely factors in the
technology component not yet commonly seen in other ID models referenced at the time.
Notice how the OKT process calls for a close relationship between implementation and the
other phases as well as alignment between evaluation and the other phases. This design
facilitates internal consistency in decision making. The intent here was to ensure that
design decisions relating to technology-based resources were consistently applied across all phases.
At their core, instructional design models seek to help designers overcome gaps in what is
learned due to either instruction, motivation, or resources. Thus, some models seek to
address non-instructional gaps, like motivation. See Keller’s (2016) work on motivational
design targeting learner attention, relevance, confidence, satisfaction, and volition (ARCS-
V). Other models examine strategies related to resources, like technology or media
integration. Examples here include Action Mapping (Moore, 2016); Substitution,
Augmentation, Modification, Redefinition (SAMR) Model (see Hamilton, Rosenberg, &
Akcaoglu, 2016 for a discussion); and TPACK-IDDIRR model (Lee & Kim, 2014). And still
other models consider other gaps and needs like rapid development. (See the Successive
Approximation Model (SAM) from Allen Interaction, n.d.)
Recently, many instructional designers have emphasized the design gaps in ID, drawing
upon the broader field of design theory to guide how designers select and arrange
constructs or components. One model, known as Design Layers (Gibbons, 2013), helps
designers prioritize concerns encountered during the ID process and may overlay with an
existing or adapted ID model being followed. In other words, a designer may use design
layers to organize the problems to be addressed, but still use other models based on ADDIE
processes to solve some of these problems. Though unintentionally, the field of instructional
design often focuses on corporate and adult learning contexts, which can feel
exclusionary to the K-12 instructional designer (note: UbD, Wiggins & McTighe, 2000, is one
of the more well-known ID models also used by K-12 teachers and instructional facilitators).
Carr-Chellman’s (2015) Instructional Design for Teachers (ID4T) model and Larson and
Lockee’s (2013) Streamlined ID represent attempts to break down some of the complex
perceptions of ID, making it more accessible for K-12 teachers and newer instructional
designers.
The primary takeaway from this entire discussion should be that ID is rarely a simple
process. In practice, designers often draw upon personal experience and the wide variety of
models, strategies, and theories to customize each instance of instructional design.
      Focus on the systematic and iterative process of instructional design. Models are not
      discrete steps to be checked off. [Kay Persichitte, University of Wyoming]
      The ADDIE paradigm is fundamental to most models, with appropriate evaluation of
professor at the University of Georgia, used to call the “COIK” phenomenon: Clear
Only If Known. This phenomenon encourages breaking down complex language,
      avoiding jargon, and making expert knowledge accessible. These tasks are not easy,
      but must be part of the process. [Marshall Jones, Winthrop University]
Acknowledgement
Thanks to Jeroen Breman, Northwest Lineman College, for the OKT-model recommendation.
Application Exercises
          While processes and models can be useful, why do you think it is important to
          maintain flexibility in designing instruction?
          What are some things to consider when selecting an instructional design
          model?
References
Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in
the United States. Babson Park, MA.
Allen Interaction. (n.d.). Agile elearning development with SAM. Retrieved August 25, 2017,
from http://www.alleninteractions.com/sam-process
ATD Research. (2015). Skills, challenges, and trends in instructional design. Alexandria, VA.
Retrieved from
https://www.td.org/Publications/Research-Reports/2015/Skills-Challenges-and-Trends-in-Instructional-Design
Baker, R. E., & Schutz, R. L. (1971). Instructional product development. New York, NY: Van
Nostrand Reinhold Company.
Bates, A. W. (1995). Technology, open learning and distance education. New York, NY:
Routledge.
Beck, K., Beedle, M., van Bennekum, A., Cockburn, A., Cunningham, W., Fowler, M., …
Thomas, D. (2001). Manifesto for agile software development. Retrieved from
http://agilemanifesto.org
Beckschi, P., & Doty, M. (2000). Instructional systems design: A little bit of ADDIEtude,
please. In G. M. Piskurich, P. Beckschi, & B. Hall (Eds.), The ASTD handbook of training
design and delivery (pp. 28–41). New York, NY: McGraw-Hill.
Biech, E. (Ed.). (2014). ASTD Handbook (2nd ed.). Alexandria, VA: Association for Talent
Development.
Blake, R. R., & Mouton, J. S. (1971). OD-Fad or fundamental? Madison, WI: American
Society for Training and Development, Inc.
Branch, R. M. (2009). Instructional design: The ADDIE approach. New York: Springer
International Publishing.
Branch, R. M., & Dousay, T. A. (2015). Survey of instructional design models (5th ed.).
Bloomington, IN: Association for Educational Communications & Technology.
Branson, R. K., Rayner, G. T., Cox, L., Furman, J. P., & King, F. J. (1975). Interservice
procedures for instructional systems development. Executive summary and model.
Springfield, VA: National Technical Information Service.
Briggs, L. J. (1970). Handbook of procedures for the design of instruction. Pittsburgh, PA:
American Institutes for Research.
Carliner, S. (2015). Training Design Basics (2nd ed.). Alexandria, VA: Association for Talent
Development.
Carr-Chellman, A. A., & Rowland, G. (Eds.). (2017). Issues in technology, learning, and
instructional design: Classic and contemporary dialogues. New York, NY: Taylor & Francis.
Dabbagh, N., & Bannan-Ritland, B. (2004). Online learning: Concepts, strategies, and
application. Upper Saddle River, NJ: Pearson Education, Inc.
Davis, R. H., Alexander, L. T., & Yelon, S. L. (1974). Learning systems design: An approach
to the improvement of instruction. New York, NY: McGraw-Hill.
de Hoog, R., de Jong, T., & de Vries, F. (1994). Constraint-driven software design: An escape
from the waterfall model. Performance Improvement Quarterly, 7(3), 48–63.
https://doi.org/10.1111/j.1937-8327.1994.tb00637.x
Dick, W., & Carey, L. (1978). The systematic design of instruction (1st ed.). Chicago: Scott,
Foresman and Company.
Dick, W., & Reiser, R. A. (1989). Planning effective instruction. Upper Saddle River, NJ:
Prentice-Hall.
Dorsey, L. T., Goodrum, D. A., & Schwen, T. M. (1997). Rapid collaborative prototyping as
an instructional development paradigm. In C. R. Dills & A. J. Romiszowski (Eds.),
Instructional development paradigms (pp. 445–465). Englewood Cliffs, NJ: Educational
Technology Publications.
Dousay, T. A., & Logan, R. (2011). Analyzing and evaluating the phases of ADDIE. In
Proceedings from Design, Development and Research Conference 2011 (pp. 32–43). Cape
Town, South Africa.
Gagné, R. M., Wager, W. W., Golas, K. C., & Keller, J. M. (2004). Principles of instructional
design (5th ed.). Boston, MA: Cengage Learning.
Gerlach, V. S., & Ely, D. P. (1971). Teaching and media: A systematic approach (1st ed.).
Upper Saddle River, NJ: Prentice Hall, Inc.
Gilbert, T. F. (1978). Human competence: Engineering worthy performance. New York, NY:
McGraw-Hill.
Gordon, J., & Zemke, R. (2000). The attack on ISD. Training, 37(4), 43–53.
Gustafson, K. L., & Branch, R. M. (1997). Survey of instructional development models (3rd
ed.). Syracuse, NY: Syracuse University.
Gustafson, K. L., & Branch, R. M. (2002). Survey of instructional development models (4th
ed.). Syracuse, NY: ERIC Clearinghouse on Information & Technology.
Hamilton, E. R., Rosenberg, J. M., & Akcaoglu, M. (2016). The Substitution Augmentation
Modification Redefinition (SAMR) model: A critical review and suggestions for its use.
TechTrends, 60(5), 433–441. https://doi.org/10.1007/s11528-016-0091-y
Heinich, R., Molenda, M., & Russell, J. D. (1982). Instructional media: The new technologies
of instruction (1st ed.). Hoboken, NJ: John Wiley & Sons, Inc.
Hodell, C. (2015). ISD from the ground up (4th ed.). Alexandria, VA: Association for Talent
Development.
Hunt, V. D. (1996). Process mapping: How to reengineer your business process. New York:
John Wiley & Sons, Inc.
Keller, J. M. (2016). Motivation, learning, and technology: Applying the ARCS-V motivation
model. Participatory Educational Research, 3(2), 1–15.
https://doi.org/10.17275/per.16.06.3.2
Kemp, J. (1977). Instructional design: A plan for unit and course development. Belmont, CA:
Fearon Publishers.
Larson, M. B., & Lockee, B. B. (2013). Streamlined ID: A practical guide to instructional
design. New York, NY: Routledge.
Lee, C.-J., & Kim, C. (2014). An implementation study of a TPACK-based instructional design
model in a technology integration course. Etr&D-Educational Technology Research and
Development, 62(4), 437–460. https://doi.org/10.1007/s11423-014-9335-8
Leshin, C. B., Pollock, J., & Reigeluth, C. M. (1992). Instructional design: Strategies &
tactics for improving learning and performance. Englewood Cliffs, NJ: Educational
Technology Publications.
Mager, R. F. (1968). Developing attitude toward learning. Palo Alto, CA: Fearon Publishers.
Molenda, M. (2017). In A. A. Carr-Chellman & G. Rowland (Eds.), Issues in technology,
learning, and instructional design: Classic and contemporary dialogues (1st ed., pp. 39–43).
New York, NY: Taylor & Francis.
Moore, C. (2016). Action mapping: A visual approach to training design. Retrieved from
http://blog.cathy-moore.com/action-mapping-a-visual-approach-to-training-design/
Morrison, G. R., Ross, S. M., Kalman, H. K., & Kemp, J. E. (2012). Designing
effective instruction (7th ed.). Hoboken, NJ: Wiley.
Newby, T. J., Stepich, D., Lehman, J., & Russell, J. D. (1996). Instructional technology for
teaching and learning: Designing, integrating computers, and using media. Upper Saddle
River, NJ: Pearson Education, Inc.
Reiser, R. A. (2001). A history of instructional design and technology: Part II. Educational
Technology Research and Development, 49(2), 57–67.
Reiser, R. A. (2017). What field did you say you were in? In R. A. Reiser & J. V. Dempsey
(Eds.), Trends and issues in instructional design and technology (4th ed., pp. 1–7). New
York, NY: Pearson Education, Inc.
Richey, R. C., Klein, J. D., & Tracey, M. W. (2010). The instructional design knowledge base:
Theory, research, and practice. New York, NY: Routledge.
Seels, B., & Glasgow, Z. (1997). Making instructional design decisions. Upper Saddle River,
NJ: Prentice-Hall.
Sims, R., & Jones, D. (2002). Continuous improvement through shared understanding:
Reconceptualising instructional design for online learning. In Ascilite Conference: Winds of
Change in the Sea of Learning: Charting the Course of Digital Education (pp. 1–10).
Auckland. Retrieved from
http://www.ascilite.org/conferences/auckland02/proceedings/papers/162.pdf
Smaldino, S., Lowther, D. L., Mims, C., & Russell, J. D. (2015). Instructional technology and
media for learning (11th ed.). Boston, MA: Pearson Education, Inc.
Smith, P. L., & Ragan, T. J. (1993). Instructional design. Princeton, NC: Merrill Publishing
Company.
Smith, P. L., & Ragan, T. J. (2004). Instructional design (3rd ed.). Hoboken, NJ: John Wiley &
Sons, Inc.
Twelker, P. A., Urbach, F. D., & Buck, J. E. (1972). The systematic development of
instruction: An overview and basic guide to the literature. Stanford, CA: ERIC
Clearinghouse on Educational Media and Technology.
van Merriënboer, J. J. G., & Kirschner, P. A. (2007). Ten steps to complex learning: A
systematic approach to four-component instructional design. Mahwah, NJ: Lawrence
Erlbaum Associates, Publishers.
Van Patten, J. (1989). What is instructional design? In K. A. Johnson & L. J. Foa (Eds.),
Instructional design: New alternatives for effective education and training (pp. 16–31). New
York, NY: Macmillan.
Wiggins, G. P., & McTighe, J. (2000). Understanding by design (1st ed.). Alexandria, VA:
Merrill Education/ASCD College Textbook Series.
Further Resources
       1. Altun, S., & Büyükduman, F. İ. (2007). Teacher and student beliefs on
          constructivist instructional design: A case study. Educational Sciences:
          Theory & Practice, 7(1), 30–39.
       2. Angeli, C., & Valanides, N. (2005). Preservice elementary teachers as
          information and communication technology designers: An instructional
          systems design model based on an expanded view of pedagogical content
          knowledge. Journal of Computer Assisted Learning, 21(4), 292–302.
          http://doi.org/10.1111/j.1365-2729.2005.00135.x
       3. Carr-Chellman, A. A. (2015). Instructional design for teachers: Improving
          classroom practice. New York, NY: Routledge.
       4. Cennamo, K. S. (2003). Design as knowledge construction. Computers in the
          Schools, 20(4), 13–35. http://doi.org/10.1300/J025v20n04_03
Suggested Citation
Tonia A. Dousay

23
Vanessa Svihla
Abstract
While most instructional design courses and much of the instructional design industry
focus on ADDIE, approaches such as design thinking, human-centered design, and agile
methods like SAM (the Successive Approximation Model) have drawn attention. This chapter
unpacks what we know about design thinking and presents a concise history of design
thinking to situate it within the broader design research field and then traces its emergence
in other fields. I consider lessons for instructional designers and conclude by raising
concerns for scholarship and teaching—and thereby practice—and set an agenda for
addressing these concerns.
Introduction
Many depictions of design process, and a majority of early design learning experiences,
depict design as rather linear, a “waterfall” view of design (Figure 1). This depiction was
originally put forward as a flawed model (Royce, 1970), yet it remains relatively common. It
also contrasts with what researchers have documented as expert design practice.
      Design thinking seems both useful and cool, but I have to practice a more traditional
      approach like ADDIE or waterfall. Can I integrate agile methods and design thinking
      into my practice?
      Design thinking—particularly the work by IDEO—is inspiring. As an instructional
      designer, can design thinking guide me to create instructional designs that really help
      people?
      Given that design thinking seems to hold such potential for instructional designers, I
      want to do a research study on design thinking. Because it is still so novel, what
      literature should I review?
      As a designer, I sometimes get to the end of the project, and then have a huge insight
      about improvements. Is there a way to shift such insights to earlier in the process so
      that I can take advantage of them?
      If design thinking and agile design methods are so effective, why aren’t we taught to
      do them from the beginning?
To answer these questions, I explore how research on design thinking sheds light on
different design methods, considering how these methods originated and focusing on
lessons for instructional designers. I then share a case to illustrate how different design
methods might incorporate design thinking. I close by raising concerns and suggesting ways
forward.
Table 1. Characterizations of design thinking (DT) across fields, authors, and over time

Design research field characterizes DT (1992): DT research examines “how designers
formulate problems, how they generate solutions, and the cognitive strategies they employ”
(Cross, Dorst, & Roozenburg, 1992, p. 4). These include framing the problem, oscillating
between possible solutions and reframing the problem, imposing constraints to generate
ideas, and reasoning abductively.

IDEO president introduces DT to the business world, 2008: DT “uses the designer’s
sensibility and methods [empathy, integrative thinking, optimism, experimentalism,
collaboration] to match people’s needs with what is technologically feasible and what a
viable business strategy can convert into customer value and market opportunity”
(Brown, 2008, p. 2).

Stanford d.school (2012) & IDEO (2011) introduce DT resources for educators: DT is
“a mindset.” It is human-centered, collaborative, optimistic, and experimental. The
“structured” process of design includes discovery, interpretation, ideation,
experimentation, and evolution (d.school, 2012; IDEO, 2011).

Education researchers characterize DT for education research & practice, 2012: DT is an
“analytic and creative process that engages a person in opportunities to experiment, create
and prototype models, gather feedback, and redesign” (Razzouk & Shute, 2012, p. 330).

Design researchers continue to develop nuanced characterizations of DT in practice, 2013:
DT is “a methodology to generate innovative ideas.” These include switching between
design tasks and working iteratively (Rodgers, 2013, p. 434).
Additional Reading
    For another great summary of various approaches to design thinking, see this
    article by the Interaction Design Foundation. This foundation has many other
    interesting articles on design that would be good reading for an instructional
    design student.
https://edtechbooks.org/-nh
we have to treat design as a mysterious, ineffable art” (Cross, 1999, p. 7). By documenting
what accomplished designers do and how they explain their process, design researchers
argued that while scientific thinking can be characterized as reasoning inductively and
deductively, designers reason constructively or abductively (Kolko, 2010). When designers
think abductively, they fill in gaps in knowledge about the problem space and the solution
space, drawing inferences based on their past design work and on what they understand the
problem to be.
Lesson #1 for ID
A critical difference between scientific thinking and design thinking is the treatment of the
problem. Whereas in scientific thinking the problem is treated as solvable through empirical
reasoning, in design thinking problems are tentative, sometimes irrational conjectures to be
dealt with (Diethelm, 2016). This type of thinking has an argumentative grammar, meaning
the designer considers suppositional if-then and what-if scenarios to iteratively frame the
problem and design something that is valuable for others (Dorst, 2011). As designers do this
kind of work, they are jointly framing the problem and posing possible solutions, checking to
see if their solutions satisfy the identified requirements (Cross et al., 1992; Kimbell, 2012).
From this point of view, we don’t really know what the design problem is until it is solved!
And when doing design iteratively, this means we are changing the design problem multiple
times. But how can we manage such changes efficiently? One answer is agile design.
Agile design, with its emphasis on rapid prototyping, testing and iteration, was proposed to
improve software design processes. Later canonized in the Manifesto for Agile Software
Development (Beck et al., 2001), early advocates argued that this paradigm shift in software
design process was urgently needed in “the living human world” that was affected by
“increasingly computer-based systems
while the existing discipline of software engineering has no way of dealing with this
systematically” (Floyd, 1988, p. 25). With the influence new technologies were having on
educational settings, it was natural that instructional designers might look to software
design for inspiration. Indeed, Tripp and Bichelmeyer (1990) introduced instructional
designers to rapid prototyping methods while these same methods were still being
developed in the software design field. They explained that traditional ID models were
based on “naive idealizations of how design takes place” (p. 43), and that ID practice already
included similar approaches (e.g., formative evaluation and prototyping), suggesting that
agile design could be palatable to instructional designers, particularly when the context or
learning approach is relatively new or unfamiliar.
Lesson #2 for ID
    Our instructional designs tend to be short-lived in use, making them subject to
    iteration and adaptation to meet emergent changes. Each new solution is linked to
    a reframing of the problem. As agile designers, we can embrace this iteration
    agentively, reframing the problem as we work based on insights gained from
    testing early, low fidelity prototypes with stakeholders.
As practiced, agile methods, including SAM (Allen, 2012) and user-centered design (Norman
& Draper, 1986), bring the end user into the design process frequently (Fox, Sillito, &
Maurer, 2008). Working contextually and iteratively can help clients see the value of a
proposed design solution and understand better how—and if—it will function as needed
(Tripp & Bichelmeyer, 1990).
Other design methods that engage stakeholders early in the design process, such as
participatory design (Muller & Kuhn, 1993; Schuler & Namioka, 1993) and human-centered
design (Rouse, 1991) have also influenced research on design thinking. While these
approaches differed in original intent, these differences have been blurred as they have
come into practice. Instead of defining each, let’s consider design characteristics made
salient by comparing them with more traditional, linear methods. Like agile design, these
methods tend to be iterative. They also tend to bring stakeholders into the process more
deeply to better understand their experiences, extending the approach taken in ADDIE, or
even to invite stakeholders to generate possible design ideas and help frame the design
problem.
When designing with end-users, we get their perspective and give them more ownership
over the design, but it can be difficult to help them be visionary. As an example, consider
early smartphone design. Early versions had keyboards and very small screens and each
new version was incrementally different from the prior version. If we had asked users what
they wanted, most would have suggested minor changes in line with the kinds of changes
they were seeing with each slightly different version. Likewise, traditional approaches to
instruction tend to anchor stakeholder expectations of what is possible in a learning
design.
Lesson #3 for ID
    Inviting stakeholders into the instructional design process early can lead to more
    successful designs, but we should be ready to support them in being visionary, while
    considering how research on how people learn might inform the design.
Designers who engage with end-users must also attend to power dynamics (Kim, Tan, &
Kim, 2012). As instructional designers, when we choose to include learners in the design
process, they may be uncertain about how honest they can be with us. This is especially true
when working with children or adults from marginalized communities or cultures unfamiliar
to us. For instance, an instructional designer who develops a basic computer literacy
training for women fleeing abuse may well want to understand more about learner needs,
but should consider carefully the situations in which learners will feel empowered to share.
Lesson #4 for ID
    With a focus on understanding human need, design thinking and agile methods
    should also draw our attention to inclusivity, diversity, and participant safety.
We next turn to an example, considering what design thinking might look like across
different instructional design practices.
     Waterfall design proceeds in a linear, stepwise fashion, treating the problem as known
     and unchanging
     ADDIE design, in this example, often proceeds in a slow, methodical manner, spending
     time stepwise on each phase
     Agile design proceeds iteratively, using low fidelity, rapid prototyping to get feedback
     from stakeholders early and often
     Human-centered design prioritizes understanding stakeholder experiences, sometimes
     co-designing with stakeholders
A client—a state agency—issued a call for proposals that addressed a design brief for
instructional materials paired with new approaches to assessment that would be “worth
teaching to.” They provided information on the context, learners, constraints, requirements,
and what they saw as the failings of current practice. They provided evaluation reports
conducted by an external contractor and a list of 10 sources of inspiration from other states.
They reviewed short proposals from 10 instructional design firms. In reviewing these
proposals, they noted that even though all designers had access to the same information and
the same design brief, the solutions were different, yet all were satisficing, meaning they
met the requirements without violating any constraints. They also realized that not only
were there 10 different solutions, there were also 10 different problems being solved! Even
though the client had issued a design brief, each team defined the problem differently.
The client invited four teams to submit long proposals, which needed to include a clear
depiction of the designed solution, budget implications for the agency, and evidence that the
solution would be viable. Members of these teams were given a small budget to be spent as
they chose.
Team Waterfall, feeling confident in having completed earlier design steps during the short
proposal stage, used the funds to begin designing their solution, hoping to create a strong
sense of what they would deliver if chosen. They focused on details noted in the mostly
positive feedback on their short proposal. They felt confident they were creating a solution
that the client would be satisfied with because their design met all identified requirements,
because they used their time efficiently, and because as experienced designers, they knew
they were doing quality, professional design. Team Waterfall treated the problem as
adequately framed and solved it without iteration. Designers often do this when there is
little time or budget[2], or simply because the problem appears to be an
another-of problem—“this is just another of something I have designed before.” While this
can be an efficient way to design, it seldom gets at the problem behind the problem, and
does not account for changes in who might need to use the designed solution or what their
needs are. Just because Team Waterfall used a more linear process does not mean that they
did not engage in design thinking. They used design thinking to frame the problem in their
initial short proposal, and then again as they used design precedent—their past experience
solving similar problems—to deliver a professional, timely, and complete solution.
Team ADDIE used the funds to conduct a traditional needs assessment, interviewing five
stakeholders to better understand the context, and then collecting data with a survey they
created based on their analysis. They identified specific needs, some of which aligned to
those in the design brief and some that demonstrated the complexity of the problem. They
reframed the problem and created a low fidelity prototype. They did not have time to test it
with stakeholders, but could explain how it met the identified needs. They felt confident the
investment in understanding needs would pay off later, because it gave them insight into the
problem. Team ADDIE used design thinking to fill gaps in their understanding of context,
allowing them to extend their design conjectures to propose a solution based on a reframing
of the problem.
Team Agile used the budget to visit three different sites overseen by the state agency. They
shared a low fidelity prototype with multiple stakeholders at the first site. In doing so, they
realized they had misunderstood key aspects of the problem from one small but critical
stakeholder group. They revised both their framing of the problem and their idea about the
solution significantly and shared a revised prototype with stakeholders at the remaining
sites. They submitted documentation of this process with their revised prototype. Team
Agile prioritized iteration and diversity of point of view in their work. They committed to
treating their solution ideas as highly tentative, but gave stakeholders something new and
different to react to. This strategy helped the team reframe the problem, but could have
failed had they only sought feedback on improvements, rather than further understanding of
the problem. They used design thinking to reframe their understanding of the problem, and
this led them to iterate on their solution. Design researchers describe this as a co-
evolutionary process, in which changes to the problem framing affect the solution, and
changes to the solution affect the framing (Dorst & Cross, 2001).
Team Human-centered used the budget to hold an intensive five-day co-design session with
a major stakeholder group. Stakeholders shared their experiences and ideas for improving
on their experience. Team Human-centered crafted three personas based on this information and
created a prototype, which the stakeholder group reviewed favorably. They submitted this
review with their prototype. Team Human-centered valued stakeholder point of view above
all else, but failed to consider that an intensive five-day workshop would limit who could
attend. They used design thinking to understand differences in stakeholder point of view
and reframed the problem based on this; however, they treated this as covering the territory
of stakeholder perspectives. They learned a great deal about the experiences these
stakeholders had, but failed to help the stakeholders think beyond their own experiences,
resulting in a design that was only incrementally better than existing solutions and catered
to the desires of one group over others.
The case above depicts ways of proceeding in design process and different ways of using
design thinking. These characterizations are not intended to privilege one design approach
over others, but rather to provoke the reader to consider them in terms of how designers fill
in gaps in understanding, how they involve stakeholders, and how iteratively they work.
Each approach, however, also carries potential risks and challenges (Figure 2). For
instance, designers may not have easy access to stakeholders, and large projects may make
agile approaches unwieldy to carry out (Turk, France, & Rumpe, 2002).
These critiques should make us cautious about how we, as instructional designers, take up
design thinking and new design practices. Below, I raise a few concerns for new
instructional designers, for instructional designers interested in incorporating new methods,
for those who teach instructional design, and for those planning research studies about new
design methods.
My first concern builds directly on critiques from the popular press and my experience as a
reviewer of manuscripts. Design thinking is indeed trendy, and of course people want to
engage with it. But as we have seen, it is also complex and subtle. Whenever we engage
with a new topic, we necessarily build on our past understandings and beliefs as we make
connections. It should not be surprising, then, that when our understanding of a new
concept is nascent, it might not be very differentiated from previous ideas. Compare, for
example, Polya’s “How to Solve It” from 1945 to Stanford’s d.school representation of
design thinking (Table 2). While Polya detailed a process for solving mathematics problems
rather than a design process, the two are superficially very similar.
general models of complex, detailed processes are zoomed out to such a degree that we lose
the detail. These details matter, whether you are a designer learning a new practice or a
researcher studying how designers do their work. For those learning a new practice, I
advise you to attend to the differences, not the similarities. For those planning studies of
design thinking, keep in mind that “design thinking” is too broad to study effectively as a
whole. Narrow your scope and zoom in to a focal length that lets you investigate the details.
As you do so, however, do not lose sight of how the details function in a complex process.
For instance, consider the various approaches being investigated to measure design
thinking; some treat these as discrete, separable skills, and others consider them in tandem
(Carmel-Gilfilen & Portillo, 2010; Dolata, Uebernickel, & Schwabe, 2017; Lande, Sonalkar,
Jung, Han, & Banerjee, 2012; Razzouk & Shute, 2012).
Table 2. Similarities between “How to Solve it” and a representation of design thinking
My second concern is that we tend, as a field, to remain naïve about the extant and
extensive research on design thinking and other design methods, in part because many of
these studies were conducted in other design fields (e.g., architecture, engineering) and
published in journals such as Design Studies (which has seldom referenced instructional
design). Not attending to past and current research, and instead receiving information
about alternative design methods filtered through other sources is akin to the game of
telephone. By the time the message reaches us, it can be distorted. While we need to adapt
alternative methods to our own ID practices and contexts, we should do more to learn from
other design fields, and also contribute our findings to the design research field. As
designers, we would do well to learn from fields that concern themselves with human
experience and focus somewhat less on efficiency.
My third concern is about teaching alternative design methods to novice designers. The
experience of learning ID is often just a single pass, with few or no opportunities to iterate.
As a result, agile methods may seem the perfect way to begin learning to design, because
there is no conflicting traditional foundation to overcome. However, novice designers tend
to jump to solutions too quickly, a condition no doubt brought about in part by an emphasis
in schooling on getting to the right answer using the most efficient method. Methods like
agile design encourage designers to come to a tentative solution right away, then get
feedback by testing low fidelity prototypes. This approach could exacerbate a new
designer’s tendency to leap to solutions. And once a solution is found, it can be hard to give
alternatives serious thought. Yet, I argue that the solution is not to ignore agile and human-
centered methods in early instruction. By focusing only on ADDIE, we may create a different
problem by signaling to new designers that the ID process is linear and tidy, when this is
typically not the case.
Instead, if we consider ADDIE as a scaffold for designers, we can see that its clarity makes it
a useful set of supports for those new to design. Alternative methods seldom offer such
clarity, and have far fewer resources available, making it challenging to find the needed
supports. To resolve this, we need more and better scaffolds that support novice designers
to engage in agile, human-centered work. For instance, I developed a Wrong Theory Design
Protocol (https://edtechbooks.org/-ub) that helps inexperienced designers get unstuck,
consider the problem from different points of view, and consider new solutions. Such
scaffolds could lead to a new generation of instructional designers who are better prepared
to tackle complex learning designs, who value the process of framing problems with
stakeholders, and who consider issues of power, inclusivity, and diversity in their designing.
Concluding Thoughts
I encourage novice instructional designers, as they ponder the various ID models,
approaches, practices and methods available to them, to be suspicious of any that render
design work tidy and linear. If, in the midst of designing, you feel muddy and uncertain,
unsure how to proceed, you are likely exactly where you ought to be.
In such situations, we use design thinking to fill in gaps in our understanding of the problem
and to consider how our solution ideas might satisfy design requirements. While
experienced designers have an expansive set of precedents to work with in filling these
gaps, novice designers need to look more assiduously for such inspiration. Our past
educational experiences may covertly convince us that just because something is common, it
is best. While a traditional instructional approach may be effective for some learners, I
encourage novice designers to consider the following questions to scaffold their evaluation
of instructional designs:
      Is the design an experience that you, your mother, child, or next-door neighbor
      would want to be part of? If yes on all counts, consider who wouldn’t, and why
      they wouldn’t.
      Is the design, as one of my favorite project-based teachers used to ask, “provocative”
      for the learners, meaning, will it provoke a strong response, a curiosity, and a desire
      to know more?
      Is the design “chocolate-covered broccoli” that tricks learners into engaging?
To be clear, the goal is not to make all learning experiences fun or easy, but to make them
worthwhile. And I can think of no better way to ensure this than using iterative, human-
centered methods that help designers understand and value multiple stakeholder
perspectives. And if, in the midst of seeking, analyzing, and integrating such points of view,
you find yourself thinking, “This is difficult,” that is because it is difficult. Providing a low
fidelity prototype for stakeholders to react to can make this process clearer and easier to
manage, because it narrows the focus.
However, success of this approach depends on several factors. First, it helps to have
forthright stakeholders who are at least a little hard to please. Second, if the design is
visionary compared to the current state, stakeholders may need to be coaxed to envision
new learning situations to react effectively. Third, designers need to resist the temptation to
settle on an early design idea.
Figure 3. Designers need to resist the temptation to settle on an early design idea
References
Adams, R. S., Daly, S. R., Mann, L. M., & Dall’Alba, G. (2011). Being a professional: Three
lenses into design thinking, acting, and being. Design Studies, 32(6), 588-607.
doi:10.1016/j.destud.2011.07.004
Allen, M. (2012). Leaving ADDIE for SAM: An agile model for developing the best learning
experiences. American Society for Training and Development.
Beck, K., Beedle, M., Bennekum, A. V., Cockburn, A., Cunningham, W., Fowler, M., . . .
Thomas, D. (2001). Manifesto for agile software development. Retrieved from
http://agilemanifesto.org/
Buchanan, R. (1992). Wicked problems in design thinking. Design Issues, 8(2), 5-21.
doi:10.2307/1511637
Collier, A. (2017). Surprising insights, outliers, and privilege in design thinking. Retrieved
from https://edtechbooks.org/-ie
Cross, N. (1999). Design research: A disciplined conversation. Design Issues, 15(2), 5-10.
doi:10.2307/1511837
Cross, N., Dorst, K., & Roozenburg, N. F. M. (Eds.). (1992). Research in design thinking.
Delft University Press.
Diethelm, J. (2016). De-colonizing design thinking. She Ji: The Journal of Design, Economics,
and Innovation, 2(2), 166-172. doi:https://edtechbooks.org/-zY
Dolata, M., Uebernickel, F., & Schwabe, G. (2017). The power of words: Towards a
methodology for progress monitoring in design thinking projects. Proceedings of the 13th
International Conference on Wirtschaftsinformatik. St. Gallen, Switzerland.
Dorst, K. (2011). The core of ‘design thinking’ and its application. Design Studies, 32(6),
521-532. doi:10.1016/j.destud.2011.07.006
Dorst, K., & Cross, N. (2001). Creativity in the design process: Co-evolution of problem-
solution. Design Studies, 22(5), 425-437. doi:https://edtechbooks.org/-NX
Fox, D., Sillito, J., & Maurer, F. (2008). Agile methods and user-centered design: How these
two methodologies are being successfully integrated in industry. Agile 2008 Conference
(pp. 63-72). IEEE.
Kim, B., Tan, L., & Kim, M. S. (2012). Learners as informants of educational game design. In
J. van Aalst, K. Thompson, M. J. Jacobson, & P. Reimann (Eds.), The future of learning:
Proceedings of the 10th International Conference of the Learning Sciences (Vol. 2, pp.
401-405). Sydney, Australia: ISLS.
Kimbell, L. (2011). Rethinking design thinking: Part I. Design and Culture, 3(3), 285-306.
doi:https://edtechbooks.org/-cC
Kimbell, L. (2012). Rethinking design thinking: Part II. Design and Culture, 4(2), 129-148.
doi:https://edtechbooks.org/-ef
Kolko, J. (2010). Abductive thinking and sensemaking: The drivers of design synthesis.
Design Issues, 26(1), 15-28. doi:10.1162/desi.2010.26.1.15
Lande, M., Sonalkar, N., Jung, M., Han, C., & Banerjee, S. (2012). Monitoring design
thinking through in-situ interventions. Design thinking research (pp. 211-226). Berlin:
Springer.
Merholz, P. (2009). Why design thinking won’t save you. Harvard Business Review, 09-09.
Retrieved from https://edtechbooks.org/-tR
Muller, M. J., & Kuhn, S. (1993). Participatory design. Communications of the ACM, 36(6),
24-28. doi:10.1145/153571.255960
Norman, D. A., & Draper, S. W. (1986). User centered system design. Hillsdale, NJ: CRC
Press.
Razzouk, R., & Shute, V. (2012). What is design thinking and why is it important? Review of
Educational Research, 82(3), 330-348. doi:https://edtechbooks.org/-yk
Schuler, D., & Namioka, A. (1993). Participatory design: Principles and practices. Hillsdale,
NJ: Lawrence Erlbaum Associates.
Turk, D., France, R., & Rumpe, B. (2002). Limitations of agile software processes.
Proceedings of the Third International Conference on Extreme Programming and Flexible
Processes in Software Engineering (pp. 43-46).
Wylant, B. (2008). Design thinking and the experience of innovation. Design Issues, 24(2),
3-14. doi:10.1162/desi.2008.24.2.3
1. For those interested in learning more, refer to the journal Design Studies and the
   professional organization, the Design Research Society. Note that this is not a reference
   to educational researchers who do design-based research.
2. Waterfall might also be used when designing a large, expensive system that cannot be
   tested and iterated on as a whole and when subsystems cannot easily or effectively be
   prototyped.
Suggested Citation
 Svihla, V. (2018). Design Thinking and Agile Design: New Trends or Just Good
 Designs? In R. E. West (Ed.), Foundations of Learning and Instructional Design
 Technology: The Past, Present, and Future of Learning and Instructional Design
 Technology. EdTech Books. Retrieved from
 https://edtechbooks.org/lidtfoundations/design_thinking_and_agile_design
                           Vanessa Svihla
                                             24
Andrew Gibbons
Editor’s Note
A question I always ask my Instructional Technology students at Utah State University is,
“What do instructional designers design?” We have had interesting discussions on this
question, and I try to revisit the question at several points throughout all of my classes. I
find that the students’ perceptions of what instructional designers design change over
time. This is no doubt a product of the faculty’s teaching, but it also represents a personal
commitment that the student makes. What the student commits to is what I would like to
talk about. My thesis will be that it is a commitment to a particular layer of the evolving
instructional design. I will talk about the layering of instructional designs and the
implications for both teaching and practicing instructional design.
The Centrisms
Here are some of the phases I see students evolving through as they mature in their
theoretic and practical knowledge:
medium as a plastic and preferably invisible channel for learning interaction (See Norman,
1988; 1999). We are currently experiencing a wave of new media-centric designers due to
the accessibility of powerful multimedia tools and large numbers of designers “assigned
into” computer-based and Web-based training design. Most of these designers speak in
terms of the medium’s constructs (the “page,” the “hyperlink,” the “site,” etc.) as the major
design building blocks. Many struggle as they attempt to apply inadequate thought tools to
complex design problems.
Message-centrism. Realizing that media design building blocks do not automatically lead
to effective designs, most designers begin to concentrate on “telling the message better” in
order to “get the idea across” or “make it stick.” This is a phase I call message-centrism.
Message-centric design places primary importance on message-related constructs—main
idea, explanation, line of argument, dramatization, etc.—and employs media constructs
secondarily, according to the demands of the message. The media constructs are used, but
they are used to serve the needs of better messaging. Better message telling means
different things to different designers: providing better illustrations, using animations,
wording the message differently, using analogies, or focusing learner attention using
attention-focusing questions, emphasis marks, repetition, or increased interactivity.
augmentations that support problem solving in the form of coaching and feedback systems,
representation systems, control systems, scope dynamics, and embedded didactics (see
Gibbons, Fairweather, Anderson, & Merrill, 1997).
These phases in the maturation of design thinking tend to be encountered by new designers
in the same order, and one could make the argument that these phases describe the history
of research interests in the field of instructional technology as a whole. A good place to see
this trend in cross-section is the articles in the Annual Review of Psychology beginning with
the review by Lumsdaine and May (1965) and progressing through subsequent chapters by
Anderson (1967); Gagne & Rohwer (1969); Glaser & Resnick (1972); McKeachie (1974);
Wittrock & Lumsdaine (1977); Resnick (1981); Gagne & Dick (1983); Pintrich, Cross, Kozma
& McKeachie (1986); Snow & Swanson (1992); Voss, Wiley & Carretero (1995); Sandoval
(1995); VanLehn (1996); Carroll (1997); Palincsar (1998); and Medin, Lynch & Solomon
(2000).
In this paper I am interested in exploring the roots of this progression. Important clues can
be found in design areas outside of instructional design. A provocative statement on design
structure is given by Brand (1994) in a description of how buildings are seen by architects
and structural engineers. Brand begins by stating that architects see a building as a system
of layers rather than as a unitary designed entity. He names six general layers, illustrated in
Figure 1 and described below in his own words:
      SITE – This is the geographical setting, the urban location, and the legally defined lot,
      whose boundaries and context outlast generations of ephemeral buildings. “Site is
      eternal,” Duffy agrees.
Brand points out some important implications of the layered view of design:
In work for the Center for Human-Systems Simulation, my colleagues Jon Nelson and Bob
Richards and I have applied Brand’s ideas to instructional design (Gibbons, Nelson &
Richards, 2000). We have found that instructional designs can indeed be conceived of as
multiple layers of decision making with respect to different sets of design constructs, and
we find a rough correspondence between the layers and the phases of designer thinking
already described. Gibbons, Lawless, Anderson and Duffin (2001) show how layers of a
design are compressed at a “convergence zone” with tool constructs that give them real
existence and embody them in a product.
Tables 1 through 7, following this article, summarize what we think are the important layers
of an instructional design: model/content, strategy, control, message, representation, media-
logic, and management. Each layer is characterized in the tables by the following sets:
In addition, a layer often corresponds with a set of specialized design skills with its own
lore, design heuristics, technical data, measurements, algorithms, and practical
considerations. The boundaries of these skills over time tend to harden into lines of labor
division, especially as technical sophistication of tools and techniques increases.
More detailed principles of design layering are outlined in Gibbons, Nelson, and Richards
(2000). The purpose of the present paper is to show how design layering influences the
designer’s thinking and allows it to change over time into entirely new ways of approaching
the design task. The media, message, strategy, and model-centric phases designers
experience can be explained as the necessary focus of the designer first and foremost on a
particular layer of the design. That is, the designer enters the design at the layer most
important to the design or with which the designer is most familiar and comfortable.
Media-centric designers do not ignore decisions related to other layers, but because they
may not yet be fully acquainted with the principles of design at other layers, they naturally
think in terms of the structures they do know or can acquire most rapidly—media
structures. As designers become aware of principles at other layers through experience and
the evaluation of their own designs, focus can shift to the constructs of the different layers:
message structurings, strategy structurings, and model and content structurings. Each step
of the progression in turn gives the designer a new set of constructs and structuring
principles to which to give the most attention, with other layers of the design being
determined secondarily, but not ignored.
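The layered view described above can be made concrete with a small sketch. The layer names are Gibbons's; the `Design` class, its fields, and the sample decisions are hypothetical illustrations, not part of his formulation:

```python
# Hypothetical sketch of an instructional design as parallel layers of
# decision making. Layer names come from the chapter's tables; the class
# structure and sample decisions are illustrative assumptions.

LAYERS = ["model/content", "strategy", "control", "message",
          "representation", "media-logic", "management"]

class Design:
    def __init__(self, entry_layer):
        assert entry_layer in LAYERS
        # Every layer carries its own set of decisions; none is ignored.
        self.decisions = {layer: [] for layer in LAYERS}
        # The designer "enters" at the most familiar or most constrained layer.
        self.entry_layer = entry_layer

    def decide(self, layer, decision):
        self.decisions[layer].append(decision)

# A media-centric designer foregrounds media-logic and representation;
# other layers are determined secondarily, but are still present.
d = Design(entry_layer="media-logic")
d.decide("media-logic", "deliver as a set of videotapes")
d.decide("representation", "narrated presenter with captions")
d.decide("strategy", "worked examples before practice")
print(d.entry_layer)     # media-logic
print(len(d.decisions))  # 7
```

The point of the sketch is structural: entering at one layer changes the order in which decisions are made, not the set of layers a complete design must eventually address.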
Is there a “right” layer priority in designs? Should designers always be counseled to enter
the design task with a particular layer in mind? It is not possible to say, because design
tasks most often come with constraints attached, and one of those constraints may
predetermine a primary focus on a layer. An assignment to create a set of videotapes will
lead the designer to pay first and last attention to the media-logic and representation layers,
and other layers are forced to comply with the constraint within the limits of the designer’s
ingenuity.
Conclusion
The design layering concept has many implications. In this paper I have explored one of
them that explains the maturation in designer thinking over time. In order to move to a new
perspective of design it is not necessary to leave older views behind. The new principles
acquired as the designer becomes knowledgeable about each layer add to the designer’s
range and to the sophistication of the designs that are possible. Further consideration of the
layering concept will expand our ability to communicate designs in richer detail, achieve
more sophisticated designs, and add to our understanding of the design process itself.
Content/Model Layer
Design Functions: To define the units of content segmentation; to define the method of content capture; to gather content elements; to articulate content structures with the Strategy, Control, Message, Representation, Logic, and Management layers
Design Constructs: Model, Relation, Production rule, Working memory, Element, Proposition, Fact, Concept, Rule, Principle, Task, Task grouping, Theme, Topic, Main idea, Semantic relationship, Chapter
Design Processes: Task Analysis, Cognitive Task Analysis, Rule Analysis, Content Analysis, Concept Mapping
Design/Production Tools: Database software, Analysis software
Message Layer
Design Functions: To define message types; to define message composition by type; to define rules for message generation; to articulate message structures with the Content, Strategy, Control, Representation, Logic, and Management layers
Design Constructs: Main idea, Example, Non-example, Discussion block, Commentary, Advance organizer, Primitive message element, Spatial relationship, Temporal relationship, Causal relationship, Hierarchical relationship, Explanation, Stem, Distractor, Response request, Transition message, Goal statement, Directions, “Resource”, Database entry, Coaching message, Feedback message, Hint
Representation Layer
Design Functions: To select media
Design Processes: Display design, Formatting, Display event sequencing, Media channel synchronization, Media channel assignment
Design/Production Tools: All content/resource production tools for all media, All layout or formatting tools for all media, Display managers
References
Anderson, J. R. (1993). Rules of the Mind. Hillsdale, NJ: Lawrence Erlbaum Associates.
Brand, S. (1994). How Buildings Learn: What Happens After They’re Built. New York: Penguin.
Gagne, R. M. (1985). The Conditions of Learning (4th ed.). New York: Holt, Rinehart & Winston.
Gibbons, A. S., Fairweather, P. G., Anderson, T. A., & Merrill, M. D. (1997). Simulation and Computer-Based Instruction: A Future View. In C. R. Dills & A. J. Romiszowski (Eds.), Instructional Development Paradigms. Englewood Cliffs, NJ: Educational Technology Publications.
Gibbons, A. S., Lawless, K., Anderson, T. A., & Duffin, J. (2001). The Web and Model-Centered Instruction. In B. Khan (Ed.), Web-Based Training. Englewood Cliffs, NJ: Educational Technology Publications.
Lumsdaine, A. A., & May, M. A. (1965). Mass Communication and Educational Media. Annual Review of Psychology, 17, 475–534.
Medin, D. L., Lynch, E. B., & Solomon, K. O. (2000). Are There Kinds of Concepts? Annual Review of Psychology, 51, 121–147.
Norman, D. A. (1988). The Psychology of Everyday Things. New York: Basic Books.
VanLehn, K. (1993). Problem Solving and Cognitive Skill Acquisition. In M. I. Posner (Ed.), Foundations of Cognitive Science. Cambridge, MA: MIT Press.
VanLehn, K. (1996). Cognitive Skill Acquisition. Annual Review of Psychology, 47, 513–539.
Voss, J. F., Wiley, J., & Carretero, M. (1995). Acquiring Intellectual Skills. Annual Review of Psychology, 46, 155–181.
Zhang, J., Gibbons, A. S., & Merrill, M. D. (1997). Automating the Design of Adaptive and Self-Improving Instruction. In C. R. Dills & A. J. Romiszowski (Eds.), Instructional Development Paradigms. Englewood Cliffs, NJ: Educational Technology Publications.
                           Andrew Gibbons
Editor’s Note
Design research had existed in primitive form—as market research and process
analysis—since before the turn of the 20th century, and, although it had served to improve
processes and marketing, it had not been applied as scientific research. John Chris Jones,
Bruce Archer, and Herbert Simon were among the first to shift the focus from research for
design (e.g., research with the intent of gathering data to support product development) to
research on design (e.g., research exploring the design process). Their efforts framed the
initial development of design research and science.
John Chris Jones
An engineer, Jones (1970) felt that the design process was ambiguous and often too
abstruse to discuss effectively. One solution, he offered, was to define and discuss design in
terms of methods. By identifying and discussing design methods, researchers would be able
to create transparency in the design process, combating perceptions of design being more
or less mysteriously inspired. This discussion of design methods, Jones proposed, would in
turn raise the level of discourse and practice in design.
Bruce Archer
Archer, also an engineer, worked with Jones and likewise supported the adoption of
research methods from other disciplines. Archer (1965) proposed that applying systematic
methods would improve the assessment of design problems and foster the development of
effective solutions. Archer recognized, however, that improved practice alone would not
enable design to achieve disciplinary status. In order to become a discipline, design
required a theoretical foundation to support its practice. Archer (1981) advocated that
design research was the primary means by which theoretical knowledge could be developed.
He suggested that the application of systematic inquiry, such as existed in engineering,
would yield knowledge about not only product and practice, but also the theory that guided
each.
Herbert Simon
It was the multidisciplinary social scientist Simon, however, who issued the clarion call for
transforming design into design science (Buchanan, 2007; Collins, 1992; Collins, Joseph, &
Bielaczyc, 2004; Cross, 1999; Cross, 2007; Friedman, 2003; Jonas, 2007; Willemien, 2009).
In The Sciences of the Artificial, Simon (1969) reasoned that the rigorous inquiry and
discussion surrounding naturally occurring processes and phenomena was just as necessary
for man-made products and processes. He particularly called for “[bodies] of intellectually
tough, analytic, partly formalizable, partly empirical, teachable doctrine about the design
process” (p. 132). This call for more scholarly discussion and practice resonated with
designers across disciplines in design and engineering (Buchanan, 2007; Cross, 1999; Cross,
2007; Friedman, 2003; Jonas, 2007; Willemien, 2009). IDR sprang directly from this early
movement and has continued to gain momentum, producing an interdisciplinary body of
research encompassing research efforts in engineering, design, and technology.
Years later, in the 1980s, Simon’s work inspired the first DBR efforts in education (Collins et
al., 2004). Much of the DBR literature attributes its beginnings to the work of Ann Brown
and Allan Collins (Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003; Collins et al., 2004;
Kelly, 2003; McCandliss, Kalchman, & Bryant, 2003; Oh & Reeves, 2010; Reeves, 2006;
Shavelson, Phillips, Towne, & Feuer, 2003; Tabak, 2004; van den Akker, 1999). Their work,
focusing on research and development in authentic contexts, drew heavily on research
approaches and development practices in the design sciences, including the work of early
design researchers such as Simon (Brown, 1992; Collins, 1992; Collins et al., 2004).
However, over generations of research, this connection has been all but forgotten, and DBR,
although similarly inspired by the early efforts of Simon, Archer, and Jones, has developed
into an isolated and discipline-specific body of design research, independent from its
interdisciplinary cousin.
Proliferation of Terminology
One of the most challenging characteristics of DBR is the quantity and use of terms that
identify DBR in the research literature. There are seven common terms typically associated
with DBR: design experiments, design research, design-based research, formative research,
development research, developmental research, and design-based implementation research.
Synonymous Terms
Collins and Brown first termed their efforts design experiments (Brown, 1992; Collins,
1992). Subsequent literature stemming from or relating to Collins’s and Brown’s work used
design research and design experiments synonymously (Anderson & Shattuck, 2012; Collins
et al., 2004). Design-based research was introduced to distinguish DBR from other research
approaches. Sandoval and Bell (2004) best summarized this as follows:
     We have settled on the term design-based research over the other commonly
     used phrases “design experimentation,” which connotes a specific form of
     controlled experimentation that does not capture the breadth of the approach,
     or “design research,” which is too easily confused with research design and
     other efforts in design fields that lack in situ research components. (p. 199)
Variations by Discipline
Lack of Definition
This variation across disciplines, with design researchers tailoring design research to
address discipline-specific interests and needs, has created a lack of definition in the field
overall. In addition, in the literature, DBR has been conceptualized at various levels of
granularity. Here, we will discuss three existing approaches to defining DBR: (a) statements
of the overarching purpose, (b) lists of defining characteristics, and (c) models of the steps
or processes involved.
General Purpose
In the literature, scholars and researchers have made multiple attempts to isolate the general
purpose of design research in education, with each offering a different insight and
definition. According to van den Akker (1999), design research is distinguished from other
research efforts by its simultaneous commitment to (a) developing a body of design
principles and methods that are based in theory and validated by research and (b) offering
direct contributions to practice. This position was supported by Sandoval and Bell (2004),
who suggested that the general purpose of DBR was to address the “tension between the
desire for locally usable knowledge, on the one hand, and scientifically sound, generalizable
knowledge on the other” (p. 199). Cobb et al. (2003) particularly promoted the theory-
building focus, asserting “design experiments are conducted to develop theories, not merely
to empirically tune ‘what works’” (p. 10). Shavelson et al. (2003) recognized the importance
of developing theory but emphasized that the testing and building of instructional products
was an equal focus of design research rather than the means to a theoretical end.
The aggregate of these definitions suggests that the purpose of DBR involves theoretical
and practical design principles and active engagement in the design process. However, DBR
continues to vary in its prioritization of these components, with some focusing largely on
theory, others emphasizing practice or product, and many examining neither but all using
the same terms.
Specific Characteristics
Another way to define DBR is by identifying the key characteristics that both unite and
define the approach. Unlike other research approaches, DBR can take the form of multiple
research methodologies, both qualitative and quantitative, and thus cannot be recognized
strictly by its methods. Identifying characteristics, therefore, concern the research process,
context, and focus. This section will discuss the original characteristics of DBR, as
introduced by Brown and Collins, and then identify the seven most common characteristics
suggested by DBR literature overall.
Brown’s concept of DBR. Brown (1992) defined design research as having five primary
characteristics that distinguished it from typical design or research processes. First, a
design is engineered in an authentic, working environment. Second, the development of
research and the design are influenced by a specific set of inputs: classroom environment,
teachers and students as researchers, curriculum, and technology. Third, the design and
development process includes multiple cycles of testing, revision, and further testing.
Fourth, the design research process produces an assessment of the design’s quality as well
as the effectiveness of both the design and its theoretical underpinnings. Finally, the overall
process should make contributions to existing learning theory.
Collins’s concept of DBR. Collins (1990, 1992) posed a similar list of design research characteristics.
Current DBR characteristics. The DBR literature that followed expanded, clarified, and
revised the design research characteristics identified by Brown and Collins. The range of
DBR characteristics discussed in the field currently is broad but can be distilled to seven
most frequently referenced identifying characteristics of DBR: design driven, situated,
iterative, collaborative, theory building, practical, and productive.
Design driven. All literature identifies DBR as focusing on the evolution of a design
(Anderson & Shattuck, 2012; Brown, 1992; Cobb et al., 2003; Collins, 1992; Design-Based
Research Collective, 2003). While the design can range from an instructional artifact to an
intervention, engagement in the design process is what yields the experience, data, and
insight necessary for inquiry.
Situated. Recalling Brown’s (1992) call for more authentic research contexts, nearly all
definitions of DBR situate the aforementioned design process in a real-world context, such
as a classroom (Anderson & Shattuck, 2012; Barab & Squire, 2004; Cobb et al., 2003).
Iterative. The literature also agrees that DBR does not follow a linear design process, but
rather consists of multiple cycles of design, testing, and revision (Anderson &
Shattuck, 2012; Barab & Squire, 2004; Brown, 1992; Design-Based Research Collective,
2003; Shavelson et al., 2003). These iterations must also represent systematic adjustment of
the design, with each adjustment and subsequent testing serving as a miniature experiment
(Barab & Squire, 2004; Collins, 1992).
Collaborative. While the literature may not always agree on the roles and responsibilities
of those engaged in DBR, collaboration between researchers, designers, and educators
appears to be key (Anderson & Shattuck, 2012; Barab & Squire, 2004; McCandliss et al.,
2003). Each collaborator enters the project with a unique perspective and, as each engages
in research, forms a role-specific view of phenomena. These perspectives can then be
combined to create a more holistic view of the design process, its context, and the
developing product.
Theory building. Design research focuses on more than creating an effective design; DBR
should produce an intimate understanding of both design and theory (Anderson & Shattuck,
2012; Barab & Squire, 2004; Brown, 1992; Cobb et al., 2003; Design-Based Research
Collective, 2003; Joseph, 2004; Shavelson et al., 2003). According to Barab & Squire (2004),
“Design-based research requires more than simply showing a particular design works but
demands that the researcher . . . generate evidence-based claims about learning that
address contemporary theoretical issues and further the theoretical knowledge of the field”
(p. 6). DBR needs to build and test theory, yielding findings that can be generalized to both
local and broad theory (Hoadley, 2004).
Practical. While theoretical contributions are essential to DBR, the results of DBR studies
“must do real work” (Cobb et al., 2003, p. 10) and inform instructional, research, and design
practice (Anderson & Shattuck, 2012; Barab & Squire, 2004; Design-Based Research
Collective, 2003; McCandliss et al., 2003).
Productive. Not only should design research produce theoretical and practical insights, but
also the design itself must produce results, measuring its success in terms of how well the
design meets its intended outcomes (Barab & Squire, 2004; Design-Based Research
Collective, 2003; Joseph, 2004; McCandliss et al., 2003).
Models of Steps or Processes
A third way to define DBR is to identify the steps or processes involved in
implementing it. The sections below illustrate the steps outlined by Collins (1990) and
Brown (1992) as well as models by Bannan-Ritland (2003), Reeves (2006), and an aggregate
model presented by Anderson & Shattuck (2012).
Collins’s design experimentation steps. In his technical report, Collins (1990) presented
an extensive list of 10 steps in design experimentation (Figure 2). While Collins’s model
provides a guide for experimentally testing and developing new instructional programs, it
does not include multiple iterative stages or any evaluation of the final product. Because
Collins was interested primarily in development, research was not given much attention in
his model.
Brown’s design research example. The example of design research Brown (1992)
included in her article was limited and less clearly delineated than Collins’s model (Figure
2). Brown focused on the development of educational interventions, including additional
testing with minority populations. Similar to Collins, Brown also omitted any summative
evaluation of intervention quality or effectiveness and did not specify the role of research
through the design process.
Anderson and Shattuck’s aggregate model. Anderson and Shattuck (2012) reviewed
design-based research abstracts over the past decade and, from their review, presented an
eight-step aggregate model of DBR (Figure 2). As an aggregate of DBR approaches, this
model was their attempt to unify approaches across DBR literature, and includes similar
steps to Reeves’s model. However, unlike Reeves, Anderson and Shattuck did not include
summative reflection and insight development.
The third challenge facing DBR is the variety of roles researchers are expected to fulfill,
with researchers often acting simultaneously as project managers, designers, and
evaluators. However, with most individuals able to focus on only one task at a time, these
competing demands on resources and researcher attention and faculties can be challenging
to balance, and excess focus on one role can easily jeopardize others. The literature has
recognized four major roles that a DBR professional must perform simultaneously:
researcher, project manager, theorist, and designer.
Researcher as Researcher
Planning and carrying out research already involves multiple considerations, such as
controlling variables and limiting bias. The nature of DBR, with its collaboration and
situated experimentation and development, innately intensifies some of these issues
(Hoadley, 2004). While simultaneously designing the intervention, a design-based
researcher must also ensure that high-quality research is accomplished, per typical
standards of quality associated with quantitative or qualitative methods.
However, research is even more difficult in DBR because the nature of the method leads to
several challenges. First, it can be difficult to control the many variables at play in authentic
contexts (Collins et al., 2004). Many researchers may feel torn between being able to (a)
isolate critical variables or (b) study the comprehensive, complex nature of the design
experience (van den Akker, 1999). Second, because many DBR studies are qualitative, they
produce large amounts of data, resulting in demanding data collection and analysis (Collins
et al., 2004). Third, according to Anderson and Shattuck (2012), the combination of
demanding data analysis and highly invested roles of the researchers leaves DBR
susceptible to multiple biases during analysis. Perhaps best expressed by Barab and Squire
(2004), “if a researcher is intimately involved in the conceptualization, design, development,
implementation, and researching of a pedagogical approach, then ensuring that researchers
can make credible and trustworthy assertions is a challenge” (p. 10). Additionally, the
assumption of multiple roles invests much of the design and research in a single person,
diminishing the likelihood of replicability (Hoadley, 2004). Finally, it is impossible to
document or account for all discrete decisions made by the collaborators that influenced the
development and success of the design (Design-Based Research Collective, 2003).
Quality research, though, was never meant to be easy! Despite these challenges, DBR has
still been shown to be effective in simultaneously developing theory through research as
well as interventions that can benefit practice—the two simultaneous goals of any
instructional designer.
Researcher as Project Manager
The collaborative nature of DBR lends the approach one of its greatest strengths: multiple
perspectives. While this can be a benefit, collaboration between researchers, developers,
and practitioners needs to be highly coordinated (Collins et al., 2004), because it is difficult
to manage interdisciplinary teams and maintain a productive, collaborative partnership
(Design-Based Research Collective, 2003).
Researcher as Theorist
continually, suspect” (p. 204). This suggests that researchers, despite intentions to test or
build theory, may not design or implement their solution in alignment with theory or provide
enough control to reliably test the theory in question.
Researcher as Designer
Because DBR is simultaneously attempting to satisfy the needs of both design and research,
there is a tension between the responsibilities of the researcher and the responsibilities of
the designer (van den Akker, 1999). Any design decision inherently alters the research.
Similarly, research decisions place constraints on the design. Skilled design-based
researchers seek to balance these competing demands effectively.
Defining Approaches
Similar to DBR, IDR has been subject to competing definitions as varied as the fields in
which design research has been applied (i.e., product design, engineering, manufacturing,
information technology, etc.) (Findeli, 1998; Jonas, 2007; Schneider, 2007). Typically, IDR
scholars have focused on the relationship between design and research, as well as the
underlying purpose, to define the approach. This section identifies three defining
conceptualizations of IDR—the prepositional approach trinity, Cross’s -ologies, and
Buchanan’s strategies of productive science—and discusses possible implications for DBR.
The Prepositional Approach Trinity
One way of defining different purposes of design research is by identifying the preposition
in the relationship between research and design: research into design, research for design,
and research through design (Buchanan, 2007; Cross, 1999; Findeli, 1998; Jonas, 2007;
Schneider, 2007).
Research for design applies to complex, sophisticated projects, where the purpose of
research is to foster product research and development, such as in market and user
research (Findeli, 1998; Jonas, 2007). Here, the role of research is to build and improve the
design, not contribute to theory or practice.
According to Jonas’s (2007) description, research through design bears the strongest
resemblance to DBR and is where researchers work to shape their design (i.e., the research
object) and establish connections to broader theory and practice. This approach begins with
the identification of a research question and carries through the design process
experimentally, improving design methods and finding novel ways of controlling the design
process (Schneider, 2007). According to Findeli (1998), because this approach adopts the
design process as the research method, it helps to develop authentic theories of design.
Cross’s -ologies
Cross (1999) conceived of IDR approaches based on the early drive toward a science of
design and identified three bodies of scientific inquiry: epistemology, praxiology, and
phenomenology. Design epistemology primarily concerns what Cross termed “designerly
ways of knowing” or how designers think and communicate about design (Cross, 1999;
Cross, 2007). Design praxiology deals with practices and processes in design or how to
develop and improve artifacts and the processes used to create them. Design
phenomenology examines the form, function, configuration, and value of artifacts, such as
exploring what makes a cell phone attractive to a user or how changes in a software
interface affect users’ activities within the application.
Buchanan’s Strategies of Productive Science
Like Cross, Buchanan (2007) viewed IDR through the lens of design science and identified
four research strategies that frame design inquiry: design science, dialectic inquiry,
rhetorical inquiry, and productive science (Figure 2). Design science focuses on designing
and decision-making, addressing human and consumer behavior. According to Buchanan
(2007), dialectic inquiry examines the “social and cultural context of design; typically
[drawing] attention to the limitations of the individual designer in seeking sustainable
solutions to problems” (p. 57). Rhetorical inquiry focuses on the design experience as well as
the designer’s process to create products that are usable, useful, and desirable. Productive
science studies how the potential of a design is realized through the refinement of its parts,
including materials, form, and function. Buchanan (2007) conceptualized a design
research—what he termed design inquiry—that includes elements of all four strategies,
looking at the designer, the design, the design context, and the refinement process as a
holistic experience.
Research into design could examine the design process and yield valuable insights on design thinking and practice. Research for design
could focus on the development of an effective product, a focus that is missing from
many DBR approaches. Research through design would use the design process as a vehicle
to test and develop theory, reducing the set of expected considerations. Any approach to
dividing or defining DBR efforts could help to limit the focus of the study, helping to prevent
the diffusion of researcher efforts and findings.
Conclusion
In this chapter we have reviewed the historical development of both design-based research
and interdisciplinary design research in an effort to identify strategies in IDR that could
benefit DBR development. Following are a few conclusions, leading to recommendations for
the DBR field.
Overall, one key advantage that IDR has had—and that DBR presently lacks—is
communication and collaboration with other fields. Because DBR has remained so isolated,
only rarely referencing or exploring approaches from other design disciplines, it can only
evolve within the constraints of educational inquiry. IDR’s ability to conceive solutions to
issues in the field is derived, in part, from a wide variety of disciplines that contribute to the
body of research. Engineers, developers, artists, and a range of designers interpose their
own ideas and applications, which are in turn adopted and modified by others. Fostering
collaboration between DBR and IDR, while perhaps not the remedy to cure all scholarly ills,
could yield valuable insights for both fields, particularly in terms of refining methodologies
and promoting the development of theory.
As we identified in this paper, a major issue facing DBR is the proliferation of terminology
among scholars and the inconsistency in usage. From IDR comes the useful
acknowledgement that there can be research into design, for design, and through design
(Buchanan, 2007; Cross, 1999; Findeli, 1998; Jonas, 2007; Schneider, 2007). This
framework was useful for scholars in our conversations at the conference. A resulting
recommendation, then, is that, in published works, scholars begin articulating which of
these approaches they are using in that particular study. This would simplify the demands
on DBR researchers: instead of feeling obligated to address all three in every paper, they
can emphasize one. It would also allow us to communicate our research more effectively
with IDR scholars.
Oftentimes authors publish DBR studies using the same format as regular research studies,
making it difficult to recognize DBR research and learn how other DBR scholars mitigate the
challenges we have discussed in this chapter. Our recommendation is that DBR scholars
publish the messy findings resulting from their work and pull back the curtain to show how
they balanced competing concerns to arrive at their results. We believe it would help if DBR
scholars adopted more common frameworks for publishing studies. In our review of the
literature, we identified the following characteristics, which are the most frequently used to
identify DBR:
One recommendation is that DBR scholars adopt these as the characteristics of their work
that they will make explicit in every published paper so that DBR articles can be recognized
by readers and better aggregated together to show the value of DBR over time. One
suggestion is that DBR scholars could adopt these characteristics as subheadings in their
methodology sections. So in addition to discussing data collection and data analysis, they
would also discuss Design Research Type (research into, for, or through design), Description
of the Design Process and Product, Design and Learning Context, Design Collaborations,
and an explicit discussion of the Design Iterations, perhaps by listing each iteration and
then the data collection and analysis for each. Also in
the concluding sections, in addition to discussing research results, scholars would discuss
Applications to Theory (perhaps dividing into Local Theory and Outcomes and
Transferable Theory and Findings) and Applications for Practice. Studies too large for a
single paper could be broken up, with different papers reporting on different iterations but
using this same language and formatting to make it easier to connect the ideas across
papers.
Not all papers would have both local and transferable theory (the latter being more evident
in later iterations), so it would be sufficient to indicate in a paper that local theory and
outcomes were developed and met with some ideas for transferable theory that would be
developed in future iterations. The important thing would be to refer to each of these main
characteristics in each paper so that scholars can recognize the work as DBR, situate it
appropriately, and know what to look for in terms of quality during the review process.
Application Exercises
          According to the authors, what are the major issues facing DBR, and what are
          some things that can be done to address them?
          Imagine you have designed a new learning app for use in public schools. How
          would you go about testing it using design-based research?
References
Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in
education research? Educational Researcher, 41(1), 16–25.
Archer, L.B. (1965). Systematic method for designers. In N. Cross (ed.), Developments in
design methodology. London, England: John Wiley, 1984, pp. 57–82.
Archer, L. B. (1981). A view of the nature of design research. In R. Jacques & J.A. Powell
(Eds.), Design: Science: Method (pp. 36-39). Guilford, England: Westbury House.
Bannan-Ritland, B. (2003). The role of design in research: The integrative learning design
framework. Educational Researcher, 32(1), 21–24. doi:10.3102/0013189X032001021
Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The
Journal of the Learning Sciences, 13(1), 1–14.
Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in
educational research. Educational Researcher, 32(1), 9–13.
doi:10.3102/0013189X032001009
Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.),
New directions in educational technology. Berlin, Germany: Springer-Verlag.
Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and
methodological issues. The Journal of the Learning Sciences, 13(1), 15–42.
Cross, N. (1999). Design research: A disciplined conversation. Design Issues, 15(2), 5–10.
doi:10.2307/1511837
Cross, N. (2007). Forty years of design research. Design Studies, 28(1), 1–4.
doi:10.1016/j.destud.2006.11.004
Findeli, A. (1998). A quest for credibility: Doctoral education and research in design at the
University of Montreal. Doctoral Education in Design, Ohio, 8–11 October 1998.
Jonas, W. (2007). Design research and its meaning to the methodological development of the
discipline. In R. Michel (Ed.), Design research now (pp. 187–206). Basel, Switzerland:
Birkhäuser Verlag AG.
Jones, J. C. (1970). Design methods: Seeds of human futures. New York, NY: John Wiley &
Sons Ltd.
Joseph, D. (2004). The practice of design-based research: uncovering the interplay between
design, research, and the real-world context. Educational Psychologist, 39(4), 235–242.
Kelly, A. E. (2003). Theme issue: The role of design in educational research. Educational
Researcher, 32(1), 3–4. doi:10.3102/0013189X032001003
Margolin, V. (2010). Design research: Towards a history. Presented at the Design Research
Society Annual Conference on Design & Complexity, Montreal, Canada. Retrieved from
http://www.drs2010.umontreal.ca/data/PDF/080.pdf
McCandliss, B. D., Kalchman, M., & Bryant, P. (2003). Design experiments and laboratory
approaches to learning: Steps toward collaborative exchange. Educational Researcher,
32(1), 14–16. doi:10.3102/0013189X032001014
Michel, R. (Ed.). (2007). Design research now. Basel, Switzerland: Birkhäuser Verlag AG.
Oh, E., & Reeves, T. C. (2010). The implications of the differences between design research
and instructional systems design for educational technology researchers and practitioners.
Educational Media International, 47(4), 263–275.
Reeves, T. C. (2006). Design research from a technology perspective. In J. van den Akker, K.
Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (Vol. 1, pp.
52–66). London, England: Routledge.
Reigeluth, C. M., & Frick, T. W. (1999). Formative research: A methodology for creating and
improving design theories. In C. Reigeluth (Ed.), Instructional-design theories and models. A
new paradigm of instructional theory (Vol. 2) (pp. 633–651), Mahwah, NJ: Lawrence
Erlbaum Associates.
Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in
context: Introduction. Educational Psychologist, 39(4), 199–201.
Schneider, B. (2007). Design as practice, science and research. In R. Michel (Ed.), Design
research now (pp. 207–218). Basel, Switzerland: Birkhäuser Verlag AG.
Shavelson, R. J., Phillips, D. C., Towne, L., & Feuer, M. J. (2003). On the science of
education design studies. Educational Researcher, 32(1), 25–28.
doi:10.3102/0013189X032001025
Simon, H. A. (1969). The sciences of the artificial. Cambridge, MA: The MIT Press.
Tabak, I. (2004). Reconstructing context: Negotiating the tension between exogenous and
endogenous educational design. Educational Psychologist, 39(4), 225–233.
van den Akker, J. (1999). Principles and methods of development research. In J. van den
Akker, R. M. Branch, K. Gustafson, N. Nieveen, & T. Plomp (Eds.), Design approaches and
tools in education and training (pp. 1–14). Norwell, MA: Kluwer Academic Publishers.
van den Akker, J., & Plomp, T. (1993). Development research in curriculum: Propositions
and experiences. Paper presented at the annual meeting of the American Educational
Research Association, April 12–14, Atlanta, GA.
Walker, D., & Bresler, L. (1993). Development research: Definitions, methods, and criteria.
Paper presented at the annual meeting of the American Educational Research Association,
April 12–16, Atlanta, GA.
Visser, W. (2009). Design: One, but in different forms. Design Studies, 30(3), 187–223.
doi:10.1016/j.destud.2008.11.004
                     Kimberly Christensen
                           Richard E. West
He tweets @richardewest, and his research can be found on Google Scholar and
his website: http://richardewest.com.
26
A Survey of Educational Change Models
James B. Ellsworth
Editor’s Note
    The following was originally published in the public domain as an ERIC digest: “A
    Survey of Educational Change Models. ERIC Digest.”
    [https://edtechbooks.org/-XLS] It is based on Ellsworth’s excellent book, Surviving
    Change: A Survey of Educational Change Models [https://edtechbooks.org/-BI],
    which is available for free online through ERIC.
Change isn’t new, and neither is its study. We have a rich set of frameworks, solidly
grounded in empirical studies and practical applications. Most contributions may be
classified under a set of major perspectives, or “models” of change. These perspectives are
prevalent in the research and combine to yield a 360-degree view of the change process. In
each case, one author or group of authors is selected as the epitome of that perspective
(Ellsworth, 2000). A small group of studies from disciplines outside educational change (in
some cases outside education) also contribute to key concepts not found elsewhere in the
literature.
Everett Rogers, one of the “elder statesmen” of change research, notes that change is a
specialized instance of the general communication model (Rogers, 1995, pp. 5–6). Ellsworth
expands on this notion to create a framework that organizes these perspectives to make the
literature more accessible to the practitioner (Ellsworth, 2000).
innovation appears to the intended adopter (Ellsworth, 2000, p. 26). By uniting these tactics in
service to a systemic strategy, we improve our chances of effective, lasting change.
Anyone trying to improve schools (for example, teachers, principals, students, district
administrators, consultants, parents, community leaders, or government representatives)
may look to The New Meaning of Educational Change (Fullan & Stiegelbauer, 1991) to
decide where to start (or to stop an inappropriate change).
From there, read Systemic Change in Education (Reigeluth & Garfinkle, 1994), to consider
the system being changed. Consider all assumptions about the nature of that system (its
purpose, members, how it works, its governing constraints, and so forth). Question those
assumptions, to see whether they still hold true. Look inside the system to understand its
subsystems or stakeholders and how they relate to one another and to the system as a
whole. Look outside the system too, to know how other systems (like business or higher
education) are interrelated with it and how it (and these other systems) in turn relate to the
larger systems of community, nation, or human society. The new understanding may
illuminate current goals for the proposed innovation (or concerns for the change you are
resisting) and may indicate some specific issues that may emerge.
This understanding is crucial for diagnosing the system’s needs and how an innovation
serves or impedes them. Now, clearly embarked upon the change process, read a discussion
of that change process in The Change Agent’s Guide (Havelock & Zlotolow, 1995) to guide
and plan future efforts. The Guide serves as the outline for a checklist, to ensure that the
right resources are acquired at the proper time. The Guide will also help you conduct and
assess a trial of the innovation in a way that is relevant and understandable to stakeholders.
It will help extend implementation both in and around the system . . . and it will help to
prepare others within the system to recognize when it is time to change again.
At some point one must commit to a plan, and act. The Concerns-Based Adoption Model
(Hall & Hord, 1987) provides tools to “keep a finger on the pulse” of change and to collect
the information needed. The model’s guidelines help readers to understand the different
concerns stakeholders experience as change progresses. This, in turn, will help readers to
design and enact interventions when they will be most effective.
Even the most effective change effort usually encounters some resistance. Strategies for
Planned Change (Zaltman & Duncan, 1977) can help narrow down the cause(s) of
resistance. Perhaps some stakeholders see the innovation as eroding their status. Possibly
others would like to adopt the innovation but lack the knowledge or skills to do so.
Opposition may come from entrenched values and beliefs or from lack of confidence that the
system is capable of successful change.
One way to approach such obstacles is to modify or adapt the innovation’s attributes. Even if
the actual innovation cannot be altered, it may be possible to change the perceptions of the
innovation among stakeholders. For example, instead of being seen as competing with
stakeholders, the innovation may be more appropriately framed as a tool that will help them
achieve their goals. Whether
one modifies the attributes or merely their perceptions, Diffusion of Innovations (Rogers,
1995) identifies the ones that are generally most influential and will help readers select an
approach.
Other obstacles may arise from the environment in which change is implemented. The
“Conditions for Change” (Ely, 1990) can help you address those deficiencies. Possibly a
clearer statement of commitment by top leaders (or more evident leadership by example) is
needed. Or maybe more opportunity for professional development is required, to help the
stakeholders learn how to use their new tool(s).
Of course, this is not a fixed sequence. Involvement may start when resistance to an
innovation is noticed. If so, begin with Zaltman and Duncan (1977); then turn to Reigeluth
and Garfinkle (1994) to identify the systemic causes of that resistance. If you are an
innovation developer, begin with Rogers (1995), then use the systemic diagnosis in
Reigeluth and Garfinkle to guide selection of the attributes needed for your innovation. The
professional change agent may begin with Havelock and Zlotolow (1995), to plan an overall
change effort. The models are also frequently interrelated.
For example, when modifying innovation attributes pursuant to Rogers (1995), one might
make an IC Component Checklist (see Hall & Hord, 1987) to avoid accidental elimination of
a critical part of the innovation. When assessing the presence or absence of the conditions
for change (Ely, 1990), verify that the systemic conditions mentioned in Reigeluth and
Garfinkle (1994) are present as well. While using the Concerns-Based Adoption Model (Hall
& Hord, 1987) to design interventions aimed at stakeholders at a particular level of use or
stage of concern, consider the psychological barriers to change presented by Zaltman and
Duncan (1977).
Reach out to other disciplines to share experiences and to benefit from theirs. Reach across
to other stakeholders to build the sense of community and shared purpose necessary for the
changes that must lie ahead. The road won’t always be easy, and everyone won’t always
agree which path to take when the road forks . . . but with mutual respect, honest work, and
the understanding that we all have to live with the results, we can get where we need to go.
Succeeding Systematically
The lessons of the classical change models are as valid today—and just as essential for the
change agent to master—as they have ever been. Yet a single innovation (like a new
technology or teaching philosophy) that is foreign to the rest of the system may be rejected,
much as a living body rejects an incompatible organ. Success depends on a
coordinated “bundle” of innovations—generally affecting several groups of
stakeholders—that results in a coherent system after implementation.
These are exciting times to be a part of education. They are not without conflict . . . but
conflict is what we make of it. Its Chinese ideogram contains two characters: one is
“danger” and the other “hidden opportunity.” We choose which aspect of conflict—and of
change—we emphasize.
References
Craig, R. (1996). The ASTD training and development handbook: A guide to human
resource development. New York, NY: McGraw-Hill.
Ely, D. P. (1990). Conditions that facilitate the implementation of educational technology
innovations. Journal of Research on Computing in Education, 23(2), 298–305.
Fullan, M., & Stiegelbauer, S. (1991). The new meaning of educational change. New York,
NY: Teachers College Press. (ED 354 588)
Hall, G., & Hord, S. (1987). Change in schools: Facilitating the process. Albany, NY: State
University of New York Press. (ED 332 261)
Havelock, R., & Zlotolow, S. (1995). The change agent’s guide (2nd ed.). Englewood
Cliffs, NJ: Educational Technology Publications. (ED 381 886)
Reigeluth, C., & Garfinkle, R. (1994). Systemic change in education. Englewood Cliffs, NJ:
Educational Technology Publications. (ED 367 055)
Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York, NY: The Free Press.
Zaltman, G., & Duncan, R. (1977). Strategies for planned change. New York, NY: John
Wiley and Sons.
               CC0: This chapter is in the public domain, which means that you
               may print, share, or remix its contents as you please without
concern for copyright and without seeking permission.
                         James B. Ellsworth
Dr. James B. Ellsworth joined the faculty of the U.S. Naval War College in the
summer of 2000, following a variety of assignments in Armor and Intelligence
education and training. He completed cadet training with the Civil Air Patrol
(USAF Auxiliary), earning the rank of cadet colonel, and he is a distinguished
graduate of the Army’s Military Intelligence Officer Advanced Course and the
Navy’s College of Naval Warfare. He holds a bachelor’s degree from Clarkson
University, a master’s from the Naval War College, and master’s and doctoral
degrees from Syracuse University. His specialties include Military Intelligence,
Information Operations, and Defense Transformation, and he teaches the
resident elective on the Future of Armed Conflict.
                                            27
Performance Technology
Jill Stefaniak
The goal of all instructional designers is to facilitate learning and improve performance,
regardless of learning environment or assigned task. Particularly when working within
professional organizations, the goal is often to develop interventions that yield measurable
improvements in employee performance. This may be accomplished through
conducting needs assessments and learner analyses, designing and developing instructional
materials to address a gap in performance, validating instructional materials, developing
evaluation instruments to measure the impact of learning, and conducting evaluations to
determine to what extent the instructional materials have met their intended use.
ISPI Standards
Instructional designers should recognize that they apply many, if not all, of these standards
in their assigned projects. However, there are subtle but important differences between
performance technology/improvement and instructional design. This chapter presents an
overview of how instructional designers can use performance analysis and non-instructional
interventions. It also discusses how a relationship between instructional
design and human performance technology can leverage the impact of instructional design
activities. It concludes with an overview of professional resources available related to the
topic of human performance technology.
In contrast to instructional design, human performance technology (HPT) focuses on
applying systematic and systemic processes throughout a system to improve performance.
Emphasis is placed on analyzing performance at multiple levels within an
organization and understanding what processes are needed for the organization to work
most effectively. Systemic solutions take into account how the various functions of an
organization interact and align with one another. Through organizational analyses,
performance technologists are able to identify gaps in performance and create systematic
solutions (Burner, 2010).
While instruction may be one of the strategies created as a result of a performance analysis,
it is often coupled with other non-instructional strategies. Depending on an instructional
designer’s role in a project or organization, they may not be heavily involved in conducting
performance assessments. When given the opportunity, it is good practice to understand
how performance is being assessed within the organization in order to align the
instructional solutions with other solutions and strategies.
While human performance technology and instructional design have two different
emphases, they do share four commonalities: (1) evidence-based practices, (2) goals,
standards, and codes of ethics, (3) systemic and systematic processes, and (4) formative,
summative, and confirmative evaluations (Foshay, Villachica, & Stepich, 2014). Table 1 provides an
overview of how these four commonalities are applied in human performance technology
and instructional design.
Table 1

Four commonalities shared across human performance technology and instructional design

Evidence-based practices
    Human performance technology: Organizational analyses are conducted to collect data
    from multiple sources to evaluate workplace performance.
    Instructional design: Emphasis is placed on learner assessment to ensure instruction has
    been successful.

Goals, standards, and codes of ethics
    Human performance technology: ISPI and ATD are two professional organizations that
    have created workplace standards and professional certification programs.
    Instructional design: AECT and ATD are two professional organizations that have created
    standards for learning and performance.

Systematic and systemic processes
    Human performance technology: Systematic frameworks have been designed to conduct
    needs assessments and other performance analyses throughout various levels of an
    organization.
    Instructional design: Systematic instructional design models have been designed to guide
    the design of instruction for a variety of contexts.
In general terms, a “system is a set of objects together with relationships between the
objects and between their attributes” (Hall & Fagen, 1975, p. 52). Systems can be open or
closed (Bertalanffy, 1968). Open systems operate in a manner where they rely on other
systems or can be modified based on actions occurring outside of a system. Closed systems
are contained and can demonstrate resistance to changes or actions occurring outside the
system in order to keep their value (Richey et al., 2011). Examples of systems could include
the instructional design or training department within a larger organization. While the
department is a system, it is also viewed as a subsystem functioning within something much
larger. In addition, those receiving human performance training also work within systems.
For example, an instructional designer may be asked to provide training based on values
espoused by the CEO, but which may conflict with culture within an individual department
in the organization. Other times, they may be asked to identify other instructional solutions
to address performance gaps identified in a needs assessment. Or they may seek to improve
employees’ performance in one area, when that performance depends on the success of
another department in the organization—something outside of the employees’ control. Thus,
seeking to improve organizational performance requires a broader understanding of the
organization than is sometimes typical in instructional design practices.
Foshay et al. (2014) identify three characteristics of a systems view:
   1. “It is holistic.
   2. It focuses primarily on the interactions among the elements rather than the elements
      themselves.
   3. It views systems as ‘nested’ with larger systems made up of smaller ones” (Foshay et
      al., 2014, p. 42).
These characteristics affect instructional design practices in a variety of ways. Designers
must account for the holistic nature of the system and consider the effects on learning from
all elements within it. This includes not only the specific instructional tasks that learners
are currently completing, but also various layers of the organization
including the people, politics, organizational culture, and resources—in other words, the
inputs and outputs that are driving the development and implementation of a project
(Rummler & Brache, 2013). Regardless of their role on a project, the instructional designer
must be aware of the various components within their system and how they affect the
instruction they create. For example, an instructional designer may be asked by senior
leadership of an organization to develop health and safety training for employees working
on the frontline of a manufacturing plant. It would be advantageous to understand the
unique tasks and nuances associated with frontline work responsibilities to ensure the
designer is developing training that will benefit the employees. Another example where it
would be important for an instructional designer to be aware of an organization’s system or
subsystems would be if they were asked to design instruction for a company that has
multiple locations across the country or world. The instructional designer should clarify
whether or not there are distinct differences (e.g., organizational culture, politics, processes)
among these various locations and how these differences may impact the results of training.
In addition, considering that the fundamental goals of instructional design are to facilitate
learning and improve performance, the instructional designer working within organizations
should strive to create design solutions that promote sustainability. As stated by the second
systems characteristic, it is important to not only be aware of the various elements within a
system, but also develop an understanding of how they interact with each other. The
instructional designer should be aware of how their work may influence or affect, positively
or negatively, other aspects of the organization. For example, if an organization is preparing
to launch training on a new organizational philosophy, how will that be perceived by other
departments or divisions within the organization? If an organization is changing its
training methods from instructor-led formats to primarily online learning formats, what
considerations must the instructional design team be aware of to ensure a smooth
transition? Does the organization have the infrastructure to support online learning for the
entire organization? Is the information technology department equipped to upload
resources and manage any technological challenges that may arise over time? Does the
current face-to-face training provide opportunities for relationship-building that may not
seem critical to the learning, but are important to the health and performance of the
organization? If so, how can this be accounted for online? These are examples of some
questions an instructional designer may ask in order to take a broader view of their
instruction beyond whether it simply achieves learning outcomes.
               Foundations of Learning and Instructional Design Technology
Performance Analysis
Regardless of context or industry, all instructional design projects fulfill one of three needs
within organizations: (1) addressing a problem; (2) embracing quality improvement
initiatives; and (3) developing new opportunities for growth (Pershing, 2006). The
instructional designer must be able to validate project needs by effectively completing a
performance analysis to understand the contextual factors contributing to performance
problems. This allows the instructional designer to appropriately identify and design
solutions that will address the need in the organization—what is often called the
performance gap or opportunity.
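The idea of a performance gap can be sketched in a few lines of code. This is an illustrative example only; the function name, measures, and numbers are hypothetical, not drawn from the HPT literature:

```python
def performance_gaps(desired: dict, actual: dict) -> dict:
    """Return desired minus actual for each measure present in both dicts.

    A positive gap signals an opportunity for improvement (assuming
    higher values are better for the chosen measure).
    """
    return {m: desired[m] - actual[m] for m in desired if m in actual}

# Hypothetical frontline metrics for one business unit.
desired = {"orders_per_hour": 12, "satisfaction_score": 4.5}
actual = {"orders_per_hour": 9, "satisfaction_score": 4.1}
gaps = performance_gaps(desired, actual)
# gaps["orders_per_hour"] == 3
```

In practice the analyst would gather these figures through the organizational analysis described below rather than assume they are already known.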
Performance analysis can occur in multiple ways, focusing on the organization as a whole or
one specific unit or function. Organizational analysis consists of “an examination of the
components that strategic plans are made of. This phase analyzes the organization’s vision,
mission, values, goals, strategies and critical business issues” (Van Tiem et al., 2012, p.
133). Items that are examined in close detail when conducting an organizational analysis
include organizational structure, centrally controlled systems, corporate strategies, key
policies, business values, and corporate culture (Tosti & Jackson, 1997). All of these can
impact the sustainability of instructional design projects either positively or negatively.
Different types of performance analyses may influence the instructional designer's work. Whether an analysis is limited to
individual performance, organizational performance, or environmental performance, they all
seek to understand the degree to which elements within the system are interacting with one
another. These analyses vary in terms of scale and goals. Interactions may involve
elements of one subsystem of an organization or multiple subsystems (layers) within an
organization. For example, an instructional design program would be considered a
subsystem of a department with multiple programs or majors. The department would be
another system that would fall under a college, and a university would be comprised of
multiple colleges, each representing a subsystem within a larger system.
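The nesting described above can be modeled as a simple recursive data structure. The sketch below is our own illustration; the class and method names are invented for this example:

```python
from dataclasses import dataclass, field

@dataclass
class System:
    """A system that may contain subsystems, which are systems themselves."""
    name: str
    subsystems: list["System"] = field(default_factory=list)

    def layers(self, depth: int = 0):
        """Yield (depth, name) for this system and every nested subsystem."""
        yield depth, self.name
        for sub in self.subsystems:
            yield from sub.layers(depth + 1)

# The university example from the text: each layer is a subsystem of the next.
university = System("University", [
    System("College of Education", [
        System("Department", [
            System("Instructional Design Program"),
        ]),
    ]),
])

for depth, name in university.layers():
    print("  " * depth + name)
```

Walking the structure this way mirrors how an analyst might trace which subsystem a training initiative actually sits in before assessing its interactions with the layers above it.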
Cause Analysis
A large part of human performance technology is analyzing organizational systems and work
environments to improve performance. While performance analysis helps to identify
performance gaps occurring in an organization, it is important to identify the causes that
are contributing to those performance gaps. The goal of cause analysis is to identify the root
causes of performance gaps and identify appropriate sustainable solutions.
While conducting a cause analysis, a performance technologist considers the severity of
the problems or performance gaps, examines the environmental supports currently in
place (e.g., training, resources for employees), and assesses employees' skill sets (Gilbert,
1978). The performance technologist engages in troubleshooting by examining the problem
from multiple viewpoints to determine what is contributing to the performance deficiencies
(Chevalier, 2003).
Non-instructional Interventions
Once a performance technologist has identified the performance gaps and opportunities,
they create interventions to improve performance. “Interventions are deliberate, conscious
acts that facilitate change in performance” (Van Tiem, Moseley, & Dessinger, 2012, p. 195).
Interventions can be classified as either instructional or non-instructional. Table 2 provides
an overview of the various types of interventions common to instructional design practice.
Table 2
Table 3
Non-instructional strategies and their benefits to the instructional design process

Job analysis: Up-to-date job descriptions with complete task analyses provide a detailed account of how to perform the tasks conveyed in training.

Organizational design: A plan that outlines the organizational infrastructure of a company. Details are provided to demonstrate how different units interact and function with one another in the organization.

Communication planning: Plans that detail how new initiatives or information are communicated to employees. Examples may include listservs, company newsletters, training announcements, performance reviews, and employee feedback.

Feedback systems: Detailed plans to provide employees feedback on their work performance. This information may be used to identify individual training needs and opportunities for promotion.

Knowledge management: Installation of learning management systems to track learning initiatives throughout the organization. Electronic performance support systems are used to provide just-in-time resources to employees.
Organizational design and job analysis are two non-instructional interventions that
instructional designers should be especially familiar with, particularly if they are involved
with projects that will result in large-scale changes within an organization. They should have
a solid understanding of the various functions and departments within the organization and
the interactions that take place among them. Organizational design involves the process of
identifying the necessary organizational structure to support workflow processes and
procedures (Burton, Obel, & Hakonsson, 2015). Examples include distinguishing the roles
and responsibilities to be carried out by individual departments or work units, determining
whether an organization will have multiple levels of management or a more decentralized
approach to leadership, and determining how these departments work together in the larger system.
Job analyses are another area that can affect the long-term impact of instructional
interventions. A job analysis is the process of dissecting the knowledge, skills, and abilities
required to carry out job functions listed under a job description (Fine & Getkate, 2014).
Oftentimes, a task analysis is conducted to gain a better understanding of the minute details
of the job in order to identify what needs to be conveyed through training (Jonassen,
Tessmer, & Hannum, 1999). If job analyses are outdated or have never been conducted,
there is a very good chance that there will be a misalignment between the instructional
materials and performance expectations, thus defeating the purpose of training.
Feedback systems are often put in place by organizations to provide employees with a frame
of reference with regard to how they are performing in their respective roles (Schartel, 2012).
Feedback, when given properly, can “invoke performance improvement by providing
performers the necessary information to modify performance accordingly” (Ross &
Stefaniak, 2018, p. 8). Gilbert’s (1978) Behavioral Engineering Model is a commonly
referenced feedback analysis tool used by practitioners to assess performance and provide
feedback as it captures data not only at the performer level but also at the organizational
level. This helps managers and supervisors determine the degree of alignment between
various elements in the organization impacting performance (Marker, 2007).
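As a rough illustration, cause-analysis findings can be organized by the six cells of Gilbert's (1978) Behavioral Engineering Model: three environmental supports (data, instruments, incentives) and three personal repertory factors (knowledge, capacity, motives). The sketch below and its example findings are hypothetical:

```python
# The six BEM cells, grouped by level (Gilbert, 1978).
BEM_CELLS = {
    "environment": ("data", "instruments", "incentives"),
    "individual": ("knowledge", "capacity", "motives"),
}

def categorize(findings):
    """Group (cell, note) pairs by BEM cell, rejecting unknown cells."""
    valid = {cell for group in BEM_CELLS.values() for cell in group}
    grouped = {cell: [] for cell in valid}
    for cell, note in findings:
        if cell not in valid:
            raise ValueError(f"not a BEM cell: {cell}")
        grouped[cell].append(note)
    return grouped

# Hypothetical findings from a cause analysis.
findings = [
    ("data", "Performance expectations are not clearly communicated"),
    ("instruments", "Point-of-sale terminals are outdated"),
    ("knowledge", "New hires have not completed onboarding"),
]
grouped = categorize(findings)
```

Sorting findings this way makes it easier to see whether gaps stem mostly from the environment or from the performers, which in turn suggests whether instructional or non-instructional interventions are warranted.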
Other examples of supportive systems could also include communities of practice and social
forums where employees can seek out resources on an as needed basis. Communities of
practice are used to bring employees or individuals together who perform similar tasks or
have shared common interests (Davies et al., 2017; Wenger, 2000; Wenger, McDermott, &
Snyder, 2002).

When selecting an intervention, it is important to choose one that will actually
solve the problem or address a particular need of the organization. Gathering
commitment from leadership to implement the intervention and securing buy-in from other
members of the organization that the intervention will work is also very important (Rummler
& Brache, 2013; Spitzer, 1992; Van Tiem et al., 2012).
By recognizing the systemic implications of their actions, instructional designers may be more
inclined to implement needs assessment and evaluation processes to ensure they are
addressing organizational constraints while adding value. With the growing emphasis on
design thinking in the field of instructional design, we, as a field, are becoming more open
to learning about how other design fields (e.g., graphic design, architecture, and
engineering) can influence our practice, and human performance technology, as a design
field in its own right, is one more discipline that can improve how we do our work as
instructional designers.
Professional Resources
There are a variety of resources available for instructional designers who are interested in
learning more about how they can utilize concepts of human performance technology in
their daily practice. This section provides an overview of professional associations, journals,
and important books related to the field.
Professional Associations
Founded in 1943, the Association for Talent Development (ATD), formerly known as the
American Society for Training and Development (ASTD), is the largest professional
organization for workplace learning and performance. Similar to ISPI, it has local chapters
across most of the United States. Its members include instructional designers, performance
consultants, talent development managers, and workplace learning professionals (ATD,
n.d.), representing more than 120 countries and industries of all sizes. ATD also offers a
certification for individuals interested in workplace learning and performance through its
Certified Professional in Learning and Performance (CPLP) designation.
All of the abovementioned organizations host annual conferences that offer workshops,
presentations, and discussions on a variety of topics related to workplace performance,
performance improvement, and instructional design. More information about each of the
professional organizations discussed in this section can be found online at:
Association for Talent Development (ATD) http://atd.org
Association for Educational Communications and Technology (AECT) http://aect.org
International Society for Performance Improvement (ISPI) http://ispi.org
Books
      Gilbert, T.F. (1978). Human competence: Engineering worthy performance. New York,
      NY: McGraw-Hill.
      Moseley, J.L., & Dessinger, J.C. (2010). Handbook of improving performance in the
      workplace. Volume 3: Measurement and evaluation. San Francisco, CA: Pfeiffer.
      Pershing, J.A. (2006). Handbook of human performance technology (3rd ed.). San
      Francisco, CA: Pfeiffer.
      Rossett, A. (1999). First things fast: A handbook for performance analysis. San
      Francisco, CA: Pfeiffer.
      Rummler, G. A., & Brache, A. P. (2013). Improving performance: How to manage the
      white space on the organization chart (3rd ed.). San Francisco, CA: Jossey-Bass.
      Silber, K.H., & Foshay, W.R. (2010). Handbook of improving performance in the
      workplace. Volume 1: Instructional design and training delivery. San Francisco, CA:
      Pfeiffer.
      Stefaniak, J. (Ed.). (2015). Cases on human performance improvement technologies.
      Hershey, PA: IGI Global.
      Van Tiem, D., Moseley, J.L., & Dessinger, J.C. (2012). Fundamentals of performance
      improvement: A guide to improving people, process, and performance (3rd ed.). San
      Francisco, CA: Pfeiffer.
      Watkins, R., & Leigh, D. (2010). Handbook of improving performance in the
      workplace. Volume 2: Selecting and implementing performance interventions. San
      Francisco, CA: Pfeiffer.
Journals
While a number of instructional design journals publish articles on trends related to
performance improvement, the following academic journals focus specifically
on the mission of human performance technology:
Additional Reading
References
Association for Educational Communications and Technology (n.d.). Organizational training
and performance. Retrieved from https://aect.org/ on August 1, 2018.
Association for Talent Development (n.d.). Retrieved from https://atd.org/ on August 1, 2018.
Burner, K.J. (2010). From performance analysis to training needs assessment. In K.H. Silber,
W.R. Foshay (Eds.), Handbook of improving performance in the workplace: Instructional
design and training delivery (vol. 1, pp. 144-183). San Francisco: Pfeiffer.
Burton, R. M., Obel, B., & Håkonsson, D. D. (2015). Organizational design: A step-by-step
approach. Cambridge, UK: Cambridge University Press.
Davies, C., Hart, A., Eryigit-Madzwamuse, S., Stubbs, C., Aumann, K., & Aranda, K. (2017).
Communities of practice in community-university engagement: Supporting co-productive
resilience research and practice. In J. McDonald, A. Cater-Steel (Eds.), Communities of
practice: Facilitating social learning in higher education (pp. 175-198). New York, NY:
Springer.
Dick, W., Carey, L., & Carey, J.O. (2009). The systematic design of instruction (7th ed.).
Upper Saddle River, NJ: Pearson.
Fine, S. A., & Getkate, M. (2014). Benchmark tasks for job analysis: A guide for functional
job analysis (FJA) scales. New York, NY: Psychology Press.
Foshay, W.R., Villachica, S.W., & Stepich, D.A. (2014). Cousins but not twins: Instructional
design and human performance technology in the workplace. In J.M. Spector, M.D. Merrill,
J. Elen, & M.J. Bishop (Eds.), Handbook of research on educational communications and
technology (4th ed., pp. 39-49). New York, NY: Springer.
Gilbert, T.F. (1978). Human competence: Engineering worthy performance. New York, NY:
McGraw-Hill.
Hall, A.D., & Fagen, R.E. (1975). Definition of system. In B.D. Ruben & J.Y. Kim (Eds.),
General systems theory and human communications (pp. 52-65). Rochelle Park, NJ: Hayden
Book Company, Inc.
Jonassen, D.H., Tessmer, M., & Hannum, W.H. (1999). Task analysis methods for instructional
design. New York, NY: Routledge.
Mager, R.F., & Pipe, P. (1970). Analyzing performance problems: Or you really oughta
wanna. Belmont, CA: Fearon.
Marker, A. (2007). Synchronized analysis model: Linking Gilbert's behavioral engineering
model with environmental analysis models. Performance Improvement, 46(1), 26-32.
Pershing, J.A. (2006). Human performance technology fundamentals. In J.A. Pershing (Ed.),
Handbook of human performance technology (3rd ed., pp. 5-26). San Francisco: Pfeiffer.
Richey, R.C., Klein, J.D., & Tracey, M.W. (2011). The instructional design knowledge base:
Theory, research, and practice. New York, NY: Routledge.
Ross, M., & Stefaniak, J. (2018). The use of the behavioral engineering model to examine the
training and delivery of feedback. Performance Improvement, 57(8), 7-20.
Rummler, G.A. (1972). Human performance problems and their solutions. Human Resource
Management, 11(4), 2-10.
Rummler, G.A. (2006). The anatomy of performance: A framework for consultants. In J.A.
Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 986-1007). San
Francisco, CA: Pfeiffer.
Rummler, G. A., & Brache, A. P. (2013). Improving performance: How to manage the white
space on the organization chart (3rd ed.). San Francisco, CA: Jossey-Bass.
Schartel, S.A. (2012). Giving feedback—An integral part of education. Best Practice &
Research Clinical Anaesthesiology, 26(1), 77-87.
Smith, P.L., & Ragan, T.J. (2005). Instructional design (3rd ed.). Hoboken, NJ: Wiley.
Spitzer, D.R. (1992). The design and development of effective interventions. In H.D.
Stolovitch & E.J. Keeps (Eds.), Handbook of human performance technology (pp. 114-129).
San Francisco, CA: Pfeiffer.
Surry, D. W., & Stanfield, A. K. (2008). Performance technology. In M. K. Barbour & M. Orey
(Eds.), The Foundations of Instructional Technology. Retrieved
from https://edtechbooks.org/-cx
Van Tiem, D., Moseley, J.L., & Dessinger, J.C. (2012). Fundamentals of performance
improvement: A guide to improving people, process, and performance (3rd ed.). San
Francisco, CA: Pfeiffer.
Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2),
225-246.
Wenger, E., McDermott, R. A., & Snyder, W. (2002). Cultivating communities of practice: A
guide to managing knowledge. Boston, MA: Harvard Business Press.
Jill Stefaniak

28
Tonia A. Dousay
Making, hacking, or tinkering with others creates opportunities for collaborative
learning through problem- and/or project-based means. This booming informal
learning phenomenon also provides a prime opportunity for PK-12 schools and
universities to explore the implications for formal learning.
While the maker movement itself seems broad and general, the physical spaces that host
making activities are even more varied. “Makerspaces are informal sites for creative
production in art, science, and engineering where people of all ages blend digital and
physical technologies to explore ideas, learn technical skills, and create new products”
(Sheridan et al., 2014, p. 505). The emphasis on the informal and exploration translates to a
wide variety of spaces including public libraries, community centers, PK-12 classrooms, and
even university facilities. From the Chicago Public Library’s Maker Lab to the Creation
Station at the St. Helena branch of Beaufort County Library in South Carolina (Ginsberg,
2015), public libraries large and small see the potential of and value in making. One of the
oldest community spaces, The Geek Group out of Grand Rapids, MI, began in the mid-1990s
as a collaboration between community members and Grand Valley State University to
facilitate innovation, play, and learning. Now a global group with more than 25,000
members, the primary facility provides space for members “to build projects and prototypes,
collaborate with other members, take classes in their areas of interest, and teach classes in
their areas of expertise” (The Geek Group, n.d., para. 4). Each of these spaces, regardless of
host, includes diverse approaches to equipment, space, and activities that operate in the
spirit of the maker movement.
Expertise is almost universally provided on a volunteer basis, but some spaces also call upon volunteers
to manage daily operations. Access to spaces also takes on different forms. Some spaces
allow open access to a community, accepting donations and/or charging for specific services
like 3D printing or use of equipment. Other spaces are open on a membership basis,
meaning that an individual or family pays a monthly or yearly fee for open access to all
expertise, tools, and materials. Age of the visitors may also be a consideration, ranging from
inviting members of all ages to focusing on specific age groups. Still other spaces are
completely closed, only allowing access to a specific group of individuals. Spaces in PK-12
schools often fall in this latter classification, only allowing currently enrolled students to
access the makerspace. Some spaces focus efforts at specific points along a virtual spectrum
while others work at multiple points on the spectrum. Figure 1 illustrates this
multidimensional framework for profiling these characteristics.
Each line represents a spectrum along which a space may operate, either by initial setup
and design or through evolving changes. The spiral that swirls around the axis represents the
multidimensional, fluid quality of a space, which may shift along each spectrum over time.
Using this framework to define a particular makerspace may help stakeholders evaluate
immediate and long-term needs and capabilities. Whether a space is in the early conception
phase or assessing continued operation, the framework guides decision-making questions
that inform budget, infrastructure, personnel, and more. Some of the questions may be
easier to answer than others, depending upon particular circumstances. Many PK-12 schools
may prefer to operate a closed facility that only serves enrolled students while college and
university makerspaces must decide between opening a space for the community or
restricting access. Even community spaces must consider their purpose and mission in
conjunction with sponsoring agencies to make a similar decision. The access decision
informs budgetary concerns when assessing the cost of consumable materials (filament for a
3D printer) and equipment maintenance (blades for silhouette cutters). When considering
the funding question, sponsorships must be taken into consideration. Makerspaces that
receive funding from a parent organization such as municipal or taxpayer funds may have
guidelines on how money can be spent, but still operate in an open fashion. Other examples
of sponsorships include competitions such as the CTE Challenge from the U.S. Department
of Education (2016) or even corporations like MakerBot. Alternatively, a community space
with open access and no primary budget provider would be well served by charging a
membership fee or offering specific access or services on a fee-for-use basis. Staffing
decisions in a space involve questions such as “do individuals need a particular credential to
work here?” A closed-access space built into the curriculum of a PK-12 school likely requires
staffers to be faculty employed by the school with an endorsement in a particular subject
area; whereas an open access university space may allow anyone in the community to work
there. Lastly, and likely the most fluid of all decisions, is that of what tools and technology to
provide in expertise and/or access. Some makerspaces have found success starting with
digital tools more readily available as they draft growth plans and seek the funding or other
means to acquire resources to expand. The decision related to the tools and activities
available in a space should also take into consideration the expertise of paid and volunteer
staff. Thus, working through each of the primary elements of the framework will inform
multiple operating decisions.
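One way to make these decisions concrete is to record where a space sits along each spectrum of the framework. The following sketch is our own illustration; the dimension names summarize the decisions discussed above (access, funding, staffing, tools), and the numeric encoding is invented for the example, since the chapter's framework is qualitative:

```python
from dataclasses import dataclass

@dataclass
class MakerspaceProfile:
    """Position of a space along four decision spectra, each 0.0 to 1.0.

    0.0 = fully closed / unsponsored / all-volunteer / low-tech;
    1.0 = fully open / fully sponsored / credentialed paid staff / high-tech.
    """
    access: float
    funding: float
    staffing: float
    tools: float

    def most_open_dimension(self) -> str:
        """Name of the spectrum on which the space scores highest."""
        dims = vars(self)
        return max(dims, key=dims.get)

# A closed PK-12 school space vs. an open community space (hypothetical values).
school = MakerspaceProfile(access=0.1, funding=0.8, staffing=0.9, tools=0.5)
community = MakerspaceProfile(access=0.9, funding=0.2, staffing=0.3, tools=0.6)
```

Profiling two spaces side by side in this way can make budget, infrastructure, and staffing trade-offs easier for stakeholders to compare.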
Learning in Makerspaces
Regardless of how the space is designed or how it shifts over time, the learning that occurs
in a makerspace takes on both formal and informal elements. Just as the spiral in Figure 1
illustrates the multidimensional quality, it also embodies the learning that occurs in the
space with the assistance of each spectrum. Guidelines for makerspaces include accepting
and learning from failure, encouraging experimentation, tolerating unintended
damage to equipment, and facilitating collaboration (Kurti, Kurti, &
Fleming, 2014). Those familiar with the various iterations and implementations of what
many refer to as career and technical education (CTE) programs will likely see resemblances
between some makerspaces and the shop and fabrication spaces often encountered in
workshop-based learning environments that specialize in skilled trades and equipment
operation (Great Schools Partnership, 2014). Perhaps the most significant distinction rests
in a makerspace's ability to shift from completely closed access to inviting external
expertise, as well as its entrepreneurial learn-from-failure spirit and fluid, evolving nature.
Communities and schools that attempt to create a
makerspace or rebrand an existing facility under the assumption that a space can only exist
in their context if it contains a specific list of equipment aligned directly to scripted
curriculum suffer from a narrow view of the maker movement and its embodied character.
With the rise in attention to and popularity of the maker movement, scholars and
practitioners have rushed to capitalize on this momentum under the guise of everything
from educational reform to salvaging educational facilities and programs. Economic
downturns and renewed interest in the DIY approach and hands-on construction helped
introduce and even foster maker movement growth (Lahart, 2009). While leaders have
argued about the role of creativity in the classroom and grappled with reconciling
educational policies and mandates with curricular strategies and assessment, communities
embraced centers of informal learning engagement. Teachers and schools then began to see
the makerspace as a means to reinvent curriculum through a constructionist paradigm
(Donaldson, 2014) with the ability to promote learning and innovation skills such as the
Partnership for 21st Century Learning’s (2011) 4Cs — critical thinking, communication,
collaboration, and creativity. However, given the movement’s deep connection to informal
learning, school-based makerspaces must now consider how to blend this approach with
formal learning.
Makerspaces inherently hold potential for learning through varied drop-in or scheduled
activities and clubs, but the informal emphasis poses challenges to educators. As Kurti,
Kurti, and Fleming (2014) noted, “Learning may occur, but it is not the primary objective”
(p. 8). An individual’s primary objective or purpose in a makerspace varies as widely as the
different types of spaces that exist. The driving informal learning factor that occurs rests in
a learner-centered approach. In other words, the learner determines what activity he or she
wants to undertake, triggering a self-regulated learning phenomenon wherein the
individual serves as the primary driver of all actions based upon intrinsic motivation
(Zimmerman, 1986). All knowledge and skill necessary to complete that activity then
become the responsibility of the learner, and he or she must seek out the expertise and
resources to help complete the activity. This expertise takes the form of volunteered and
paid staffers, both in residence and invited. The makerspace also facilitates access to other
resources such as online videos or tutorials, consumable materials necessary for the
activity, and safety support. Education researchers easily look at this specific learning
environment and draw parallels between informal makerspace learning and problem-based
learning (PBL). At the very core of a makerspace lies an ill-structured problem, a learner
wanting to learn a new skill or create something he or she has never attempted before, with
many ways to approach and solve the problem, and this aligns perfectly with Jonassen’s
(2000) definition of PBL. Even the individual variable of engaging in self-directed learning
sits as a cornerstone to PBL design (Scott, 2014). Although many educators and researchers
have attempted to encourage and foster PBL curriculum in PK-12 schools (Gallagher, 1997;
Hmelo-Silver, 2004; Ward & Lee, 2002), actual implementation has been difficult (Brush &
Saye, 2000; Ertmer & Simons, 2006; Frykholm, 2004). Among the issues recognized by
teachers as a barrier to adopting PBL, Ertmer et al. (2009) found that allowing students to
take responsibility for their learning and effectively integrating technology tools arose as a
common theme. Perhaps then, the makerspace approach provides a means for clearing this
challenge and making PBL easier to adopt while simultaneously bridging the formal and
informal.
Referring back to the proposed framework in Figure 1 and previous makerspace profiles
discussed, consider the relationship between a closed-access space tied to a specific course
or set of courses/curriculum. The earlier comparison with CTE facilities takes on an old
perspective. Makerspaces in a community thrive due to their fluid and evolving nature that
invites interdisciplinary collaboration. CTE facilities, arguably, fell out of favor with formal
education through budget cuts and policies or practices that only allowed certain subgroups
of students to use the space. Example policies and programs include college-bound or
career-ready tracks of courses prescribed for students. In some cases, these practices
segregated students, emphasizing post-secondary education over vocational opportunities
(Baxter, 2012). If schools adopt a proactive stance of applying open philosophies,
collaborative management and use, and integrated project facilitation, they may be able to
sustain the maker movement beyond initial hype and implementation. How a PK-12 school
approaches these characteristics likely depends largely on breaking down subject-based
silos and inviting teachers and staff to experiment with team-teaching or guest teaching in
an interdisciplinary PBL approach, which is not uncommon in some post-secondary
classrooms. However, adopting such an approach forces school administrations to also shift
from a low-risk mindset to one that encourages risk-taking and nontraditional systems
thinking. A school that considers creating a makerspace or rejuvenating an existing curricular
program would do well to heed this guidance and incorporate sustainability into
each characteristic of the framework in Figure 1, or risk watching its
space quickly become obsolete. To truly attain sustainability and not be considered the next
generation of obsolete computer labs or workshops, school-based makerspaces must
continuously evaluate and evolve.
References
3dprintler. (2014). 3Dponics – open-source collaboration – Building the ultimate space farm.
Retrieved from http://www.thingiverse.com/thing:376158
Brush, T., & Saye, J. (2000). Implementation and evaluation of a student-centered learning
unit: A case study. Educational Technology Research and Development, 48(3), 79–100.
doi:10.1007/BF02319859
Donaldson, J. (2014). The maker movement and the rebirth of constructionism. Hybrid
Pedagogy. Retrieved from https://edtechbooks.org/-kk
Ertmer, P. A., Glazewski, K. D., Jones, D., Ottenbreit-Leftwich, A. T., Collins, K., & Kocaman,
A. (2009). Facilitating technology-enhanced Problem-Based Learning (PBL) in the middle
school classroom: An examination of how and why teachers adapt. Journal of Interactive
Learning Research, 20(1), 35–54.
Ertmer, P., & Simons, K. D. (2006). Jumping the PBL implementation hurdle: Supporting the
efforts of K–12 teachers. Interdisciplinary Journal of Problem-Based Learning, 1(1), 40–54.
doi:10.7771/1541-5015.1005
Frykholm, J. (2004). Teachers’ tolerance for discomfort: Implications for curricular reform in
mathematics. Journal of Curriculum and Supervision.
Gallagher, S. A. (1997). Problem-based learning: Where did it come from, what does it do,
and where is it going? Journal for the Education of the Gifted, 20(4), 332–362.
Great Schools Partnership. (2014). Career and technical education. The Glossary of
Education Reform. Retrieved from http://edglossary.org/career-and-technical-education/
[https://edtechbooks.org/-bP]
Kurti, R. S., Kurti, D., & Fleming, L. (2014). The philosophy of educational makerspaces:
Part 1 of making an educational makerspace. Teacher Librarian, 41(5), 8–11.
Lahart, J. (2009, November 13). Tinkering makes comeback amid crisis. The Wall Street
Journal. Retrieved from https://edtechbooks.org/-Vn
Minsker, E. (2009, March 9). Hacking Chicago. The Columbia Chronicle. Retrieved from
https://edtechbooks.org/-Ik
Morin, B. (2013, May 2). What is the maker movement and why should you care? Huffington
Post. Retrieved from https://edtechbooks.org/-yu
Papert, S., & Harel, I. (1991). Situating constructionism. In Constructionism (pp. 1–11). New
York: Ablex Publishing Corporation.
Partnership for 21st Century Skills. (2011). Framework for 21st century learning.
Washington, D.C. Retrieved from http://www.p21.org
               Foundations of Learning and Instructional Design Technology
Sheridan, K. M., Halverson, E. R., Litts, B. K., Brahms, L., Jacobs-Priebe, L., & Owens, T.
(2014). Learning in the making: A comparative case study of three makerspaces. Harvard
Educational Review, 84(4), 505–532. doi:10.17763/haer.84.4.brr34733723j648u
Ward, J. D., & Lee, C. L. (2002). A review of problem-based learning. Journal of Family and
Consumer Science Education, 20(1), 16–26.
Suggested Citation
                          Tonia A. Dousay
                                            29
Educators and learners are increasingly reliant on digital tools to facilitate learning.
However, educators and learners often fail to adopt technology as originally intended
(Straub, 2017). For instance, educators may be faced with challenges trying to determine
how to assess student learning in their learning management system or they might spend
time determining workarounds to administer lesson plans. Learners might experience
challenges navigating an interface or finding homework details. When an interface is not
easy to use, a user must develop alternative paths to complete a task and thereby
accomplish a learning goal. Such challenges are the result of design flaws, which create
barriers for effective instruction (Jou, Tennyson, Wang, & Huang, 2016; Rodríguez, Pérez,
Cueva, & Torres, 2017).
Understanding how educators and learners interact with learning technologies is key to
avoiding and remediating design flaws. An area of research that seeks to understand the
interaction between technology and the people who use it is known as human-computer
interaction (HCI; Rogers, 2012). HCI considers interaction from many perspectives, two of
which are usability and user experience (UX). Usability describes how easily an interface can be used as intended (Nielsen, 2012). For example, a usable interface is designed to anticipate errors, support efficiency, and strategically use design cues so that cognitive resources remain focused on learning.
UX describes the broader context of usage in terms of “a person’s perceptions and
responses that result from the use or anticipated use of a product, system, or service”
(International Organization for Standardization [ISO], 2010, Terms and Definitions section,
para 2.15). Emphasizing HCI corresponds with a more user-centered approach to design.
Such user-centered design (UCD) emphasizes understanding users’ needs and expectations
throughout all phases of design (Norman, 1986).
The principles of HCI and UCD have implications for the design of learning environments.
While the LIDT field has historically focused on learning theories to guide design (e.g.,
scaffolding, sociocultural theory, etc.), less emphasis has been placed on HCI and UCD
(Okumuş, Lewis, Wiebe, & Hollebrands, 2016). This chapter attempts to address this
issue. We begin with a discussion of the learning theories that are foundational to HCI. We
then discuss the importance of iteration in design cycles. We conclude with details of UCD-specific methodologies that allow the designer to approach design from both a learning and a usability perspective.
Theoretical Foundations
Usability and HCI are closely related with established learning theories such as cognitive
load theory, distributed cognition, and activity theory. In the following sections, we discuss
each theory and how it is important for conceptualizing usability and UX from an
instructional design perspective.
Cognitive load theory. Cognitive load theory (CLT) contends that meaningful learning is
predicated on effective cognitive processing; however, an individual only has a limited
number of resources needed to process the information (Mayer & Moreno, 2003; Paas &
Ayres, 2014). The three categories of CLT include: (1) intrinsic load, (2) extraneous load,
and (3) germane load (Sweller, van Merriënboer, & Paas, 1998). Intrinsic load describes the
active processing or holding of verbal and visual representations within working memory.
Extraneous load includes the elements that are not essential for learning, but are still
present for learners to process (Korbach, Brünken, & Park, 2017). Germane load describes
the relevant load imposed by the effective instructional design of learning materials.
Germane cognitive load is relevant to schema construction in long-term memory (Paas,
Renkl, & Sweller, 2003; Sweller et al., 1998; van Merriënboer & Ayres, 2005). It is
important to note that the elements of CLT are additive, meaning that if learning is to occur,
the total load cannot exceed available working memory resources (Paas et al., 2003).
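The additivity claim can be stated compactly. In the sketch below, the symbols are illustrative (they do not come from the cited sources): the three loads sum to a total that must stay within working memory capacity if learning is to occur.

```latex
\underbrace{L_{\text{intrinsic}} + L_{\text{extraneous}} + L_{\text{germane}}}_{\text{total cognitive load}} \;\le\; C_{\text{working memory}}
```

Because the designer can directly manipulate only the extraneous term, reducing it is the main lever for freeing capacity for germane processing.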
Extraneous load is of particular importance for HCI and usability. Extraneous cognitive load
can be directly manipulated by the designer (van Merriënboer & Ayres, 2005). When an
interface is not designed with usability in mind, the extraneous cognitive load is increased,
which impedes meaningful learning. From an interface design perspective, poor usability
might result in extraneous cognitive load in many forms. For instance, a poor navigation
structure might require the learner to extend extra effort to click through an interface to
find relevant information. Further, when the interface uses unfamiliar terms that do not
align with a user’s mental models or the interface is not consistently designed, the user
must exert additional effort toward understanding the interface. Another example of
extraneous cognitive load is when a learner does not know how to proceed, so the learner is
taken out of their learning flow. Although there are many other examples, each depicts how
poor usability taxes cognitive resources. After extraneous cognitive load is controlled for,
then mental resources can be shifted to focus on germane cognitive load for building
schemas (Sweller et al., 1998).
Distributed cognition and activity theory. While cognitive load theory helps describe the
individual interaction of a user experience, other theories and models focus on broader
conceptualizations of HCI. The most prominent ones include distributed cognition and
activity theory, which take into account the broader context of learning and introduce the
role of collaboration between various individuals. Distributed cognition postulates that
knowledge is present both within the mind of an individual and across artifacts (Hollan,
Hutchins, & Kirsh, 2000). The theory emphasizes “understanding the coordination among
individuals and artifacts, that is, to understand how individual agents align and share within
a distributed process” (Nardi, 1996, p. 39). From a learning technology perspective, tools
are deemed important because they help facilitate cognition through communication across
various entities; that is, technology facilitates the flow of knowledge in pursuit of a goal
(Boland, Tenkasi, & Te’eni, 1994; Vasiliou, Ioannou, & Zaphiris, 2014). In doing so, the unit
of analysis is focused on the function of the system within the broader context (Michaelian &
Sutton, 2013). Therefore, user experience is conceived much more broadly, and more collaboratively, than under cognitive load theory.
Activity theory is a similar framework to distributed cognition, but focuses on the activity
and specific roles within an interconnected system. Activity theory describes workgroup
behavior in terms of a goal-directed hierarchy: activities, actions, and operations (Jonassen
& Rohrer-Murphy, 1999). Activities describe the top-level objectives and fulfillment of
motives (Kaptelinin, Nardi, & Macaulay, 1999). Within a learning context, these are often
technology implementations that subgroups must embrace. An example is the integration of
a new LMS or new training approach. Actions are the more specific goal-directed processes
and smaller tasks that must be completed in order to complete the overarching activity.
Operations describe the automatic cognitive processes that group members complete
(Engeström, 2000). Operations have no goals of their own; rather, they are unconscious adjustments of actions to the situation at hand (Kaptelinin et al., 1999). In terms
of HCI, an implemented technology will be designed to support learning contexts on any or
all of these levels for a given objective.
User-centered Design
Given the theoretical implications of usability and the design of learning environments, the
question arises as to how one designs and develops highly usable learning environments.
The field of instructional design (ID) has recently begun to shift its focus to more iterative
design and user-driven development models, and a number of existing instructional design
methods can be used or adapted to fit iterative approaches. Identifying learning needs has
long been the focus of front-end analysis. Ideation and prototyping are frequently used
methods from UX design and rapid prototyping. Testing in instructional design has a rich
history in the form of formative and summative evaluation. By applying these specific
methods within iterative processes, instructional designers can advance their designs in
such a way that they can focus not only on intended learning outcomes but also on the
usability of their designs. In the following sections, UCD is considered with a specific focus
on techniques for incorporating UCD into one’s instructional design processes through (1)
identifying user needs, (2) requirements gathering, (3) prototyping, and (4) wireframing.
Identifying User Needs
As in instructional design, where the design process begins with assessing learner needs (Sleezer, Russ-Eft, & Gupta, 2014), UCD processes begin by identifying user needs. Needs assessment in instructional design often focuses on identifying a gap (the need) between actual and optimal performance (Rossett, 1987; Rossett & Sheldon, 2001).
analyzed and instructional interventions designed to address those needs. Assessing user
(and learner) needs can yield important information about performance gaps and other
problems. However, knowledge of needs alone is insufficient to design highly usable
learning environments. Once needs have been identified, the first phase of the UCD process
centers around determining the specific context of use for a given artifact. Context is
defined by users (who will use the artifact), tasks (what will users do with the artifact), and
environment (the local context in which users use the artifact). A variety of methods are
used to gain insight into these areas.
Personas. A persona is a fictional character whose characteristics represent a specific user group. Personas serve as a methodological tool
that helps designers approach design based on the perspective of the user rather than
(often biased) assumptions. A persona typically includes information about user
demographics, goals, needs, typical day, and experiences. In order to create a persona,
interviews or observations should take place to gather information from individual users and
then place them into specific user categories. Personas should be updated if there are
changes to technology, business needs, or other factors. Personas help designers obtain a
deep understanding of the types of users for the system. Because personas are developed
based on data that have been gathered about users, bias is reduced. An effective way to
start creating personas is to use a template; a simple web search will yield many. Table 1
provides an example of a persona that was created by novice designers in an introductory
instructional design course using a template.
User Goals: What users are trying to achieve by using your site, such as tasks they want to
perform
1. Parents seek advice on improving teacher/parent interactions
2. Parents seek to build and foster a positive partnership between teacher and parents to
contribute to child’s school success
3. Parents wish to find new ways or improve ways of parent-teacher communication
Behavior: Online and offline behavior patterns, helping to identify users’ goals
1. Online behavior: “Googling” ways to improve teacher communication with parent or
parent communication with teacher; parent searching parent/teacher communication sites
for types of technology to improve communication; navigating through site to reach
information
2. Offline behavior: Had ineffective or negative parent-teacher communication over
multiple occurrences; parents seeking out other parents for advice or teachers asking
colleagues for suggestions to improve communication with parents
3. Online/Offline behavior: Taking notes, practicing strategies or tips suggested, discussing
with a colleague or friend.
Attitudes: Relevant attitudes that predict how users will behave
1. Looking for answers
2. Reflective
3. Curiosity-driven
Motivations: Why users want to achieve these goals
1. Wishing to avoid past unpleasant experiences of dealing with parent-teacher interaction
2. Looking to improve current or future parent-teacher relationships
3. Looking to avoid negative perceptions of their child by teacher
Design team objectives: What you ideally want users to accomplish in order to ensure
your website is successful?
1. Have an interface that is easy to navigate
2. Inclusion of both parent and teacher in the page (no portal/splash page)
3. Grab interest and engage users to continue reading and exploring the site
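A persona is essentially a structured record about a representative user. The sketch below models the template fields from Table 1 as a simple data structure; the class name, field names, and sample values are illustrative assumptions, not a standard persona format.

```python
from dataclasses import dataclass, field

# A minimal sketch of a persona record, loosely following the template
# fields shown in Table 1. Real persona templates vary by team and project.
@dataclass
class Persona:
    name: str
    user_goals: list = field(default_factory=list)
    behaviors: list = field(default_factory=list)
    attitudes: list = field(default_factory=list)
    motivations: list = field(default_factory=list)

# Hypothetical persona based on the parent-teacher communication example.
parent = Persona(
    name="Concerned Parent",
    user_goals=["Improve parent-teacher communication"],
    behaviors=["Searches online for communication strategies"],
    attitudes=["Reflective", "Curiosity-driven"],
    motivations=["Avoid past unpleasant parent-teacher interactions"],
)
print(parent.user_goals[0])
```

Keeping personas in a structured form like this makes it easier to update them when technology or business needs change, as the chapter recommends.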
Identifying Requirements
One potential pitfall of design is when developers create systems based on assumptions of
what users want. After designers have begun to understand the user, they begin to identify
what capabilities or conditions a system must be able to support to meet the identified user
needs. These capabilities or conditions are known as “requirements.” The process a
designer undertakes to identify these requirements is known as “requirements gathering.”
Generally, requirements gathering involves: (1) gathering user data (e.g., user surveys,
focus groups, interviews, etc.), (2) data analysis, and (3) interpretation of user needs. Based
on interpretation of user needs, a set of requirements is generated to define what system
capabilities must be developed to meet those needs. Requirements are not just obtained for
one set of users, but for all user-types and personas that might utilize the system.
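One practical way to keep requirements honest is to record each one with an explicit link back to the user need that motivated it. The sketch below assumes hypothetical need and requirement identifiers; it simply checks that no requirement exists without a documented source need.

```python
# Hypothetical user needs identified during front-end analysis.
needs = {
    "N1": "Parents need a single place to find teacher contact options",
    "N2": "Teachers need to post updates without technical training",
}

# Each requirement records which need it traces back to.
requirements = [
    {"id": "R1", "need": "N1",
     "capability": "Provide a contact page listing all communication channels"},
    {"id": "R2", "need": "N2",
     "capability": "Offer a simple editor for posting class updates"},
]

# Flag any requirement that does not trace to a documented user need,
# i.e., a capability invented from designer assumptions alone.
untraced = [r["id"] for r in requirements if r["need"] not in needs]
print(untraced)
```

An empty `untraced` list means every proposed capability is grounded in gathered user data rather than assumption.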
Rapid prototyping. Rapid prototyping is an approach to design that emerged in the 1980s
in engineering fields and began to gain traction in ID in the early 1990s (Desrosier, 2011;
Tripp & Bichelmeyer, 1990; Wilson, Jonassen, & Cole, 1993). Instead of traditional ID
approaches with lengthy design and development phases, rapid prototyping focuses on fast,
or “rapid,” iterations. This allows instructional designers to quickly gather evaluative
feedback on their early designs. Considered a feedback-driven approach to ID, rapid
prototyping is seen by many as a powerful tool for the early stages of an ID project. The
rapid prototyping approach relies on multiple, rapid cycles in which an artifact is designed,
developed, tested, and revised. Actual users of the system participate during the testing
phase. This cycle repeats until the artifact is deemed to be acceptable to users. An example
of rapid prototyping applied in an instructional design context is the successive
approximation model, or SAM (Allen, 2014). The SAM (version 2) process model is provided
in Figure 2.
Figure 3. Example of a paper prototype that has been scanned and annotated using digital tools.
               Foundations of Learning and Instructional Design Technology
During evaluation, functional prototypes allow for a user to experience a mockup interface
in a way that is very similar to the experience of using an actual interface. However,
because functionality is limited, development time can be reduced substantially. Functional
prototypes provide a powerful way to generate feedback from users in later stages of the
design process, allowing for tweaks and refinements to be incorporated before time and
effort are expended on development.
To reiterate, the goal of UCD is to approach systems development from the perspective of
the end-user. Through tools such as personas and prototypes, the design process becomes
iterative and dynamic. Learning designers also use these tools in conjunction with
evaluation methods to better align prototype interface designs with users' mental models,
thereby reducing cognitive load and improving usability.
Ethnography. A method that is used early in the front-end analysis phase, especially for
requirements gathering, is ethnography. Ethnography is a qualitative research method in
which a researcher studies people in their native setting (not in a lab or controlled setting).
During data collection, the researcher observes the group, gathers artifacts, records notes,
and performs interviews. In this phase, the researcher is focused on unobtrusive
observations to fully understand the phenomenon in situ. For example, in an ethnographic
interview, the researcher might ask open-ended questions but would ensure that the
questions were not leading. The researcher would note the difference between what the
user is doing versus what the user is saying and take care not to introduce his or her own bias.
Although this method has its roots in the field of cultural anthropology, ethnography in UX
design can support thinking about design from activity theory and distributed cognition
perspectives (Nardi, 1996). It is useful in UX evaluations because the researcher can gather
information about the users, their work environment, their culture, and how they interact
with the device or website in context (Nardi, 1997). This information is particularly valuable
when writing user personas. Ethnography is also useful if the researcher cannot conduct
user testing on systems or larger equipment due to size or security restrictions.
Focus groups. Focus groups are often used during the front-end analysis phase. Rather
than the researcher going into the field to study a larger group as in ethnography, a small
group of participants (5-10) are recruited based on shared characteristics. Focus group
sessions are led by a skilled moderator who has a semi-structured set of questions or plan.
For instance, a moderator might ask what challenges a user faces in a work context (i.e.,
actuals vs. optimals gap), suggestions for how to resolve it, and feedback on present
technologies. The participants are then asked to discuss their thoughts on products or
concepts. The moderator may also present a paper prototype and ask for feedback. The role
of the researcher in a focus group is to ensure that no single person dominates the
conversation in order to hear everyone’s opinions, preferences, and reactions. This helps to
determine what users want and keeps the conversation on track. Multiple focus group sessions are preferred so that various perspectives are heard even if one conversation gets sidetracked.
Analyzing data from a focus group can be as simple as providing a short summary with a
few illustrative quotes for each session. The length of the sessions (typically 1-2 hours) may
include some extraneous information, so it is best to keep the report simple.
Card sorting. Aligning designs with users' mental models is important for effective UX
design. A method used to achieve this is card sorting. Card sorting is used during front-end
analysis and paper prototyping. Card sorting is commonly used in psychology to identify
how people organize and categorize information (Hudson, 2012). In the early 1980s, card
sorting was applied to organizing menuing systems (Tullis, 1985) and information spaces
(Nielsen & Sano, 1995).
Card sorting can be conducted physically using tools like index cards and sticky notes or
electronically using tools like Lloyd Rieber’s Q Sort [https://edtechbooks.org/-PTS]
(http://lrieber.coe.uga.edu/qsort/index.html). It can involve a single participant or a group of
participants. With a single participant, he or she groups content (individual index cards) into
categories, allowing the researcher to evaluate the information architecture or navigation
structure of a website. For example, a participant might organize “Phone Number” and
“Address” cards together. When a set of cards is placed together by multiple participants,
this suggests to the designer distinct pages that can be created (e.g., “Contact Us”). When
focusing on a group, the same method is employed, but the group negotiates how they will
group content into categories. How participants arrange cards provides insight into mental
models and how they group content.
In an open card sort, a participant will first group content (menu labels on separate
notecards) into piles and then name the category. Participants can also place the notecards
in an “I don’t know” pile if the menu label is not clear or may not belong to a designated pile
of cards. In a closed card sort, the categories will be pre-defined by the researcher. It is
recommended to start with an open card sort and then follow-up with a closed card sort
(Wood & Wood, 2008). As the arrangements from different participants are compared, the designer iterates on the early prototypes so that menu labels and other features align with how participants organize the information in their minds. For card sorting best practices, refer to the Righi et al. (2013) [https://edtechbooks.org/-hEK] article.
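Comparing arrangements across participants amounts to counting how often pairs of cards land in the same pile. The sketch below assumes each participant's open sort is recorded as a list of piles (each pile a list of card labels); the card names echo the chapter's "Phone Number"/"Address" example, and the data are invented for illustration.

```python
from itertools import combinations
from collections import Counter

# Hypothetical open-sort results from three participants.
# Each participant's sort is a list of piles of card labels.
sorts = [
    [["Phone Number", "Address"], ["Shipping", "Returns"]],
    [["Phone Number", "Address", "Returns"], ["Shipping"]],
    [["Phone Number", "Address"], ["Shipping", "Returns"]],
]

# Count how many participants placed each pair of cards in the same pile.
co_occurrence = Counter()
for piles in sorts:
    for pile in piles:
        for pair in combinations(sorted(pile), 2):
            co_occurrence[pair] += 1

# "Address" and "Phone Number" co-occur in every sort, suggesting they
# belong on one page (e.g., "Contact Us").
print(co_occurrence[("Address", "Phone Number")])  # -> 3
```

Pairs with high counts suggest content that should share a page or menu group; low or zero counts suggest separation in the information architecture.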
Cognitive walkthroughs. Cognitive walkthroughs (CW) can be used during all prototyping
phases. CW is a hands-on inspection method in which an evaluator (not a user) evaluates the
interface by walking through a series of realistic tasks (Lewis & Wharton, 1997). CW is not a
user test based on data from users, but instead is based on the evaluator’s judgments.
During a CW, the evaluator evaluates specific tasks and considers the user’s mental
processes while completing those tasks. For example, an evaluator might be given the
following task: Recently you have been experiencing a technical problem with software on
your laptop and you have been unable to find a solution to your problem online. Locate the
place where you would go to send a request for assistance to the Customer Service Center.
The evaluator identifies the correct paths to complete the task, but does not make a
prediction as to what a user will actually do. To assist designers, the evaluator also notes likely reasons a user might make errors (Wharton, Rieman, Lewis, & Polson, 1994). The
feedback received during the course of the CW provides insight into various aspects of the
user experience including:
      how easy it is for the user to determine the correct course of action,
      whether the organization of the tools or functions matches the ways that users think
      of their work,
      how well the application flow matches user expectations,
      whether the terminology used in the application is familiar to users, and
      whether all data needed for a task is present on screen.
For information on how to conduct a CW, view the Interaction Design Foundation’s article
[https://edtechbooks.org/-Xc], available at https://www.interaction-design.org.
Heuristic evaluation. Heuristic evaluation is an inspection method that does not involve
directly working with the user. In a heuristic evaluation, usability experts work
independently to review the design of an interface against a pre-determined set of usability
principles (heuristics) before communicating their findings. Ideally, each usability expert
will work through the interface at least twice: once for an overview of the interface and the
second time to focus on specific interface elements (Nielsen, 1994). The experts then meet
and reconcile their findings. This method can be used during any phase of the prototyping
cycle.
Many heuristic lists exist that are commonly used in heuristic testing. The most well-known
heuristic checklist was developed over 25 years ago by Jakob Nielsen and Rolf Molich
(1990). This list was later simplified and reduced to 10 heuristics which were derived from
249 identified usability problems (Nielsen, 1994). In the field of instructional design, others
have embraced and extended Nielsen’s 10 heuristics to make them more applicable to the
evaluation of eLearning systems (Mehlenbacher et al., 2005; Reeves et al., 2002). Not all
heuristics are applicable in all evaluation scenarios, so UX designers tend to pull from
existing lists to create customized heuristic lists that are most applicable and appropriate to
their local context.
Nielsen’s 10 Heuristics
1. Visibility of system status
2. Match between the system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
An approach that bears similarities with a heuristic review is the expert review. This
approach is similar in that an expert usability evaluator reviews a prototype but differs in
that the expert does not use a set of heuristics. The review is less formal and the expert
typically refers to personas to become informed about the users. Regardless of whether
heuristic or expert review is selected as an evaluation method, data from a single expert
evaluator is insufficient for making design inferences. Multiple experts should be involved,
and data from all experts should be aggregated. Different experts will have different
perspectives and will uncover different issues. This helps ensure that problems are not
overlooked.
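Aggregating findings across experts can be sketched as grouping issue reports and ranking them. The example below assumes each expert reports (heuristic, issue, severity) tuples, with severity on Nielsen's common 0-4 scale; the expert names, issues, and ratings are invented for illustration.

```python
from collections import defaultdict

# Hypothetical independent findings from two usability experts.
reviews = {
    "expert_a": [("Consistency and standards",
                  "Two different labels for the same menu", 3)],
    "expert_b": [("Consistency and standards",
                  "Two different labels for the same menu", 2),
                 ("Visibility of system status",
                  "No progress indicator on upload", 3)],
}

# Group identical findings so issues reported by multiple experts merge.
aggregated = defaultdict(list)
for expert, findings in reviews.items():
    for heuristic, issue, severity in findings:
        aggregated[(heuristic, issue)].append(severity)

# Rank issues: found by more experts first, then by mean severity.
ranked = sorted(aggregated.items(),
                key=lambda kv: (-len(kv[1]), -sum(kv[1]) / len(kv[1])))
top_issue = ranked[0][0][1]
print(top_issue)
```

Here the issue flagged by both experts surfaces first, illustrating why data from a single evaluator is insufficient: convergent findings carry more weight than any one expert's judgment.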
A/B testing. A/B testing, or split testing, compares two versions of a user interface; the method can be employed in all three prototyping phases. The different interface
versions might vary individual screen elements (such as the color or size of a button),
typeface used, placement of a text box, or overall general layout. During A/B testing, it is
important that the two versions are tested at the same time by the same user. For instance,
Version A can be a control and Version B should only have one variable that is different
(e.g., navigation structure). A randomized assignment, in which some participants receive
Version A first and then Version B (versus receiving Version B and then Version A), should
be used.
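The counterbalanced assignment described above can be sketched as follows. This is a minimal illustration assuming a within-subjects design in which every participant sees both versions; the function name, participant IDs, and fixed seed are hypothetical.

```python
import random

def assign_orders(participants, seed=42):
    """Randomly assign half the participants to see Version A first
    and half to see Version B first (counterbalancing)."""
    rng = random.Random(seed)  # fixed seed for a reproducible example
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    orders = {p: ("A", "B") for p in shuffled[:half]}
    orders.update({p: ("B", "A") for p in shuffled[half:]})
    return orders

orders = assign_orders(["p1", "p2", "p3", "p4"])
print(orders)
```

Randomizing presentation order guards against order effects: any difference between versions cannot be attributed simply to which one participants encountered first.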
Think-aloud user study. Unlike A/B testing, a think-aloud user study is only used during
the functional prototyping phase. According to Jakob Nielsen (1993), “thinking aloud may be
the single most valuable usability engineering method” (p. 195). In a think-aloud user study,
a single participant is tested at any given time. The participant narrates what he or she is
doing, feeling, and thinking while looking at a prototype (or fully functional system) or
completing a task. This method can seem unnatural for participants, so it is important for
the researcher to encourage the participant to continue verbalizing throughout a study
session. To view an example of a think-aloud user study, please watch Steve Krug’s “Rocket
Surgery Made Easy” video [https://edtechbooks.org/-zhU].
A great deal of valuable data can come from a think-aloud user study (Krug, 2010).
Sometimes participants will mention things they liked or disliked about a user interface.
This is important to capture because it may not be discovered in other methods. However,
the researcher needs to also be cautious about changing an interface based on a single
comment.
Users do not necessarily have to think aloud while they are using the system. The
retrospective think aloud is an alternative approach that allows a participant to review the
recorded testing session and talk to the researcher about what he or she was thinking
during the process. This approach can provide additional helpful information, although it
may be difficult for some participants to remember what they were thinking after some
time. Hence, it is important to conduct retrospective think aloud user testing as soon after a
recorded testing session as possible.
Figure 5. Heat map of an interface in which users with autism must identify facial
expressions; here, eye fixations are shown with red indicating longer dwell time
and blue indicating shorter dwell time. Adapted from “3D Virtual Worlds: Assessing
the Experience and Informing Design,” by S. Goggins, M. Schmidt, J. Guajardo, and
J. Moore, 2011, International Journal of Social and Organizational Dynamics in
Information Technology, 1(1), p. 41. Reprinted with permission.
    Figure 7. Gaze plot of an interface in which users with autism must identify facial
    expressions; here, markers plot gaze location every 5 milliseconds. Adapted from
    “3D Virtual Worlds: Assessing the Experience and Informing Design,” by S.
    Goggins, M. Schmidt, J. Guajardo, and J. Moore, 2011, International Journal of
    Social and Organizational Dynamics in Information Technology, 1(1), p. 39.
    Reprinted with permission.
This type of user testing serves as a way to understand when users find something
important or distracting, thereby informing designers of extraneous cognitive load. A
disadvantage of this type of data is that it might not be clear why a user was fixated on a
particular element on the screen. This is a situation in which a retrospective think-aloud can
be beneficial. After the eye-tracking data have been collected, the researcher can sit down
with the user and review the eye-tracking data while asking about eye movements and
particular focus areas.
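As a rough illustration of how gaze data like that shown in Figures 5 and 7 can be summarized, the sketch below totals dwell time per screen region from gaze samples recorded every 5 milliseconds. This is an illustrative sketch only; the region names, coordinates, and sample format are hypothetical, not drawn from the study cited above.

```python
from collections import defaultdict

SAMPLE_INTERVAL_MS = 5  # one gaze sample every 5 milliseconds

def dwell_time_by_region(gaze_samples, regions):
    """Sum dwell time (in ms) per named screen region.

    gaze_samples: iterable of (x, y) gaze coordinates.
    regions: dict mapping region name -> (x0, y0, x1, y1) bounding box.
    """
    totals = defaultdict(int)
    for x, y in gaze_samples:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += SAMPLE_INTERVAL_MS
                break  # each sample counts toward at most one region
    return dict(totals)

# Hypothetical data: two regions of a face-identification interface.
regions = {"eyes": (100, 50, 300, 120), "mouth": (150, 200, 250, 260)}
samples = [(120, 60), (130, 70), (200, 210), (400, 400)]
dwell = dwell_time_by_region(samples, regions)  # longer totals = longer dwell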
                                             402
               Foundations of Learning and Instructional Design Technology
captured while browsing the web or using a software application (see Figure 8). This
information can be beneficial because it can show the researcher the path the participant
was taking while navigating a system. Typically, these data need to be triangulated with
other data sources to paint a broader picture.
Conclusion
As digital tools have gained in popularity, there is a rich body of literature that has focused
on interface design. Indeed, a variety of principles and theories (e.g. cognitive load theory,
distributed cognition, activity theory, etc.) have provided valuable insight about the design
process. While the design of learning technologies is not new, issues of how users interact
with the technology can sometimes become secondary to pedagogical concerns. In this
chapter, we have illustrated how the field of HCI intersects with the field of instructional
design and explored how to approach interface design from the perspectives of usability,
UX, and UCD. Moreover, we have provided examples of iterative design techniques and
evaluation methodologies that can be employed to advance usable designs. The concepts of
HCI, UX, and UCD provide insight into how learning technologies are used by educators and
learners. A design approach approach that balances these principles and learning theories
helps ensure that digital tools are designed in a way that best supports learning.
                                             403
               Foundations of Learning and Instructional Design Technology
References
Ackerman, M. S. (2000). The intellectual challenge of CSCW: The gap between social
requirements and technical feasibility. Human-Computer Interaction, 15, 179–203.
Allen, M. (2014). Leaving ADDIE for SAM: An agile model for developing the best learning
experiences. American Society for Training and Development: Alexandria, VA.
Barab, S., Barnett, M., Yamagata-Lynch, L., Squire, K., & Keating, T. (2002). Using activity
theory to understand the systemic tensions characterizing a technology-rich introductory
astronomy course. Mind, Culture, and Activity, 9(2), 76-107.
Boland, R. J., Tenkasi, R. V., & Te’eni, D. (1994). Designing information technology to
support distributed cognition. Organization Science, 5(3), 456–475.
Cooper, A. (2004). The inmates are running the asylum: Why high-tech products drive us
crazy and how to restore the sanity. Indianapolis, IN: Sams Publishing.
Engeström, Y. (2000). Activity theory as a framework for analyzing and redesigning work.
Ergonomics, 43(7), 960–974.
Goggins, S., Schmidt, M., Guajardo, J., & Moore, J. (2011). 3D virtual worlds: Assessing the
experience and informing design. International Journal of Social and Organizational
Dynamics in Information Technology, 1(1), 30-48.
Hollan, J., Hutchins, E., & Kirsh, D. (2000). Distributed cognition: toward a new foundation
for human-computer interaction research. ACM Transactions on Computer Human
Interaction, 7(2), 174–196.
Hudson, W. (2012). Card sorting. In M. Soegaard, & R. F. Dam (Eds.), The encyclopedia of
human-computer interaction (2nd ed.). Retrieved from
https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-int
eraction-2nd-ed/card-sorting
Jonassen, D., & Rohrer-Murphy, L. (1999). Activity theory as a framework for designing
constructivist learning environments. Educational Technology Research and Development,
47(1), 61–79.
                                             404
               Foundations of Learning and Instructional Design Technology
Jou, M., Tennyson, R. D., Wang, J., & Huang, S.-Y. (2016). A study on the usability of E-books
and APP in engineering courses: A case study on mechanical drawing. Computers &
Education, 92(Supplement C), 181–193.
Kaptelinin, V., Nardi, B., & Macaulay, C. (1999). The activity checklist: a tool for
representing the “space” of context. Interactions, 6(4), 27–39.
Korbach, A., Brünken, R., & Park, B. (2017). Measurement of cognitive load in multimedia
learning: a comparison of different objective measures. Instructional Science, 45(4),
515–536.
Krug, S. (2010). Rocket surgery made easy: The do-it-yourself guide to finding and fixing
usability problems. New Riders: Berkeley, CA.
Lewis, C., & Wharton, C. (1997). Cognitive walkthroughs. In M. Helander, T. K. Landauer, &
P. Prabhu (Eds.), Handbook of human-computer interaction (2nd ed., pp. 717–732).
Amsterdam: Elsevier.
Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia
learning. Educational Psychologist, 38(1), 43–52.
Mehlenbacher, B., Bennett, L., Bird, T., Ivey, I., Lucas, J., Morton, J., & Whitman, L. (2005).
Usable e-learning: A conceptual model for evaluation and design. Proceedings of HCI
International 2005: 11th International Conference on Human-Computer Interaction, Volume
4 — Theories, Models, and Processes in HCI. Las Vegas, NV.
Michaelian, K., & Sutton, J. (2013). Distributed cognition and memory research: History and
current directions. Review of Philosophy and Psychology, 4(1), 1–24.
methods (pp. 25-62). New York, NY: John Wiley & Sons.
                                              405
                Foundations of Learning and Instructional Design Technology
https://www.nngroup.com/articles/usability-101-introduction-to-usability/
Nielsen, J., & Molich, R. (1990). Heuristics evaluation of user interfaces. Proceedings of
ACM CHI’90 Conference. Seattle, WA.
Nielsen, J., & Sano, D. (1995). SunWeb: user interface design for Sun Microsystem’s
internal Web. Computer Networks and ISDN Systems, 28(1), 179-188.
OkumuÅŸ, S., Lewis, L., Wiebe, E., & Hollebrands, K. (2016). Utility and usability as factors
influencing teacher decisions about software integration. Educational Technology Research
and Development, 64(6), 1227–1249.
Paas, F., & Ayres, P. (2014). Cognitive load theory: A broader view on the role of memory in
learning and education. Educational Psychology Review, 26(2), 191–195.
Paas, F., & Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design:
Recent developments. Educational Psychologist, 38, 1-4.
Reeves, T. C., Benson, L., Elliott, D., Grant, M., Holschuh, D., Kim, B., . . . Loh, S. (2002).
Usability and instructional design heuristics for e-learning evaluation. Proceedings of the
World Conference on Educational Multimedia, Hypermedia & Telecommunications. Denver,
CO.
Righi, C., James, J., Beasley, M., Day, D. L., Fox, J. E., Gieber, J., . . . Ruby, L. (2013). Card
sort analysis best practices. Journal of Usability Studies, 8(3), 69-89. Retrieved from
http://uxpajournal.org/card-sortanalysis-best-practices-2/
Rodríguez, G., Pérez, J., Cueva, S., & Torres, R. (2017). A framework for improving web
accessibility and usability of Open Course Ware sites. Computers & Education,
109(Supplement C), 197–215.
Rogers, Y. (2012). HCI theory: Classical, modern, and contemporary. Synthesis Lectures on
Human-Centered Informatics, 5(2), 1-129. doi: 10.2200/S00418ED1V01Y201205HCI014
Romano Bergstrom, J. C., Duda, S., Hawkins, D., & McGill, M. (2014). Physiological
response measurements. In J. Romano Bergstrom & A. Schall (Eds.), Eye tracking in user
experience design (pp. 81-110). San Francisco, CA: Morgan Kaufmann.
Rossett, A., & Sheldon, K. (2001). Beyond the podium: Delivering training and performance
                                                406
               Foundations of Learning and Instructional Design Technology
Schmidt, M., Kevan, J., McKimmy, P., & Fabel, S. (2013). The best way to predict the future
is to create it: Introducing the Holodeck mixed-reality teaching and learning environment.
Proceedings of the 2013 International Convention of the Association for Educational
Communications and Technology, Anaheim, CA.
Schmidt, M. & Tawfik, A. (in press). Transforming a problem-based case library through
learning analytics and gaming principles: An educational design research approach.
Interdisciplinary Journal of Problem-Based Learning.
Sleezer, C. M., Russ-Eft, D. F., & Gupta, K. (2014). A practical guide to needs assessment
(3rd ed.). San Francisco, CA: Pfeiffer.
Snyder, C. (2003). Paper prototyping: The fast and easy way to design and refine user
interfaces. San Francisco, CA: Morgan Kaufmann.
Straub, E. T. (2017). Understanding technology adoption: Theory and future directions for
informal learning. Review of Educational Research, 79(2), 625–649.
Sweller, J., van Merriënboer, J. J. G., & Paas, F. G. W. C. (1998). Cognitive architecture and
instructional design. Educational Psychology Review, 10, 251-296.
van Merriënboer, J. J. G., & Ayres, P. (2005). Research on cognitive load theory and its
design implications for e-learning. Educational Technology Research and Development,
53(3), 5-13.
Vasiliou, C., Ioannou, A., & Zaphiris, P. (2014). Understanding collaborative learning
activities in an information ecology: A distributed cognition account. Computers in Human
Behavior, 41(Supplement C), 544–553.
Walker, M., Takayama, L. & Landay, J.A. (2002). High-fidelity or low-fidelity, paper or
computer? Choosing attributes when testing web prototypes. Proceedings of the Human
Factors and Ergonomics Society 46th Annual Meeting, Baltimore, MD.
Wharton, C., Rieman, J., Lewis, C., & Polson, P. (1994). The cognitive walkthrough method:
                                             407
               Foundations of Learning and Instructional Design Technology
A practitioner’s guide. In J. Nielsen & R. L. Mack (Eds.), Usability inspection methods (pp.
105-140). New York, NY: John Wiley & Sons.
Wilson, B. G., Jonassen, D. H., & Cole, P. (1993). Cognitive approaches to instructional
design. In G. M. Piskurich (Ed.), The ASTD handbook of instructional technology (pp.
11.1-21.22). New York: McGraw-Hill.
Wood, J. R., & Wood, L. E. (2008). Card sorting: Current practice and beyond. Journal of
Usability Studies, 4(1), 1-6.
Suggested Citation
    Earnshaw, Y., Tawfik, A. A., & Schmidt, M. (2018). User Experience Design. In R.
    E. West, Foundations of Learning and Instructional Design Technology: The Past,
    Present, and Future of Learning and Instructional Design Technology. EdTech
    Books. Retrieved from
    https://edtechbooks.org/lidtfoundations/user_experience_design
                                             408
                          Yvonne Earnshaw
                                       409
                         Andrew A. Tawfik
                                      410
                         Matthew Schmidt
                                      411
                 IV. Technology and Media
Technology and media has always been central to the field of LIDT. While many in the field
have argued that technology can represent any tool, even conceptual ones, this section will
discuss the more popular digital technologies. You will read about current research trends
in technology integration, and frameworks for understanding what effective technology
integration is. There is also a summary of the essential topic of distance education, and an
older, but classic, cautionary article about how much of the research into distance education
(and any new educational technology) falls into the classic "media comparison" trap that has
plagued our field since the classic Clark versus Kozma debates in Educational Technology
Research and Development and Review of Educational Research in the 1980s and 1990s
(see this summary [https://edtechbooks.org/-HMN]). A few of the many current trends are
also represented in this section, with articles on open educational resources, gamified
learning, data mining, learning analytics, and open badges. While reading these articles, I
refer you back to Andrew Gibbons' article in the design section, where he defines the
various "centrisms" he has observed in our field. While it is common for many students to
begin their careers media-centric, as you develop wisdom and expertise, you should come to
see technology and media as a means, instead of an end, to your instructional design goals.
                                            412
                                            31
Editor’s Note
Abstract
It is commonly believed that learning is enhanced through the use of technology and that
students need to develop technology skills in order to be productive members of society. For
this reason, providing a high-quality education includes the expectation that teachers use
educational technologies effectively in their classroom and that they teach their students to
use technology. In this chapter, we have organized our review of technology integration
research around a framework based on three areas of focus: (1) increasing access to
educational technologies, (2) increasing the use of technology for instructional purposes,
and (3) improving the effectiveness of technology use to facilitate learning. Within these
categories, we describe findings related to one-to-one computing initiatives, integration of
open educational resources, various methods of teacher professional development, ethical
issues affecting technology use, emerging approaches to technology integration that
emphasize pedagogical perspectives and personalized instruction, technology-enabled
assessment practices, and the need for systemic educational change to fully realize
technology’s potential for improving learning. From our analysis of the scholarship in this
area, we conclude that the primary benefit of current technology use in education has been
to increase information access and communication. Students primarily use technology to
                                             413
               Foundations of Learning and Instructional Design Technology
gather, organize, analyze, and report information, but this has not dramatically improved
student performance on standardized tests. These findings lead to the conclusion that future
efforts should focus on providing students and teachers with increased access to technology
along with training in pedagogically sound best practices, including more advanced
approaches for technology-based assessment and adaptive instruction.
Introduction
The Elementary and Secondary Education Act of 2001 mandated an emphasis on technology
integration in all areas of K–12 education, from reading and mathematics to science and
special education (U.S. Department of Education, 2002). This mandate was reinforced in the
U.S. Department of Education’s (2010) National Education Technology Plan. Under current
legislation, education leaders at the state and local levels are expected to develop plans to
effectively utilize educational technologies in the classroom. The primary goal of federal
education legislation is to improve student academic achievement, measured primarily by
student performance on state standardized tests. Secondary goals include the expectation
that every student become technologically literate, that research-based technology-
enhanced instructional methods and best practices be established, and that teachers be
encouraged and trained to effectively integrate technology into the instruction they provide.
The directive to integrate instructional technology into the teaching and learning equation
results from the following fundamental beliefs: (1) that learning can be enhanced through
the use of technology and (2) that students need to develop technology skills in order to
become productive members of society in a competitive global economy (McMillan-Culp,
Honey, & Mandinach, 2005; U.S. Department of Education, 2010).
By most measures, the quality and availability of educational technology in schools, along
with the technological literacy of teachers and students, have increased significantly in the
past decade (Center for Digital Education, 2008; Gray, Thomas, & Lewis, 2010; McMillan-
Culp, Honey, & Mandinach, 2005; Nagel, 2010; Russell, Bebell, O’Dwyer, & O’Connor,
2003). In addition, educators are generally committed to technology use. Most educational
practitioners value technology to some degree, yet many researchers and policy analysts
have suggested that technology is not being used to its full advantage (Bauer & Kenton,
2005; Ertmer & Ottenbreit-Leftwich, 2010; Overbaugh & Lu, 2008; Woolfe, 2010). Even at
technology-rich schools, effective integration of technology into the instructional process is
rare (Shapley, Sheehan, Maloney, & Caranikas-Walker, 2010). To fully understand this
criticism requires in-depth consideration of the goals and criteria used for evaluating
technology integration.
Most efforts to integrate technology into schools have the stated goal of appropriate and
effective use of technology (Center for Digital Education, 2008; International Society for
Technology in Education [ISTE], 2008; Niederhauser & Lindstrom, 2006; Richey, Silber, &
Ely, 2008); however, many current efforts have focused predominantly on gaining access to
and increasing the extent of technology use. For example, in 1995 Moersch provided an
extremely useful framework describing levels of technology integration—a tool which is still
                                             414
               Foundations of Learning and Instructional Design Technology
being used (see http://loticonnection.com). Like other indicators, the Levels of Teaching
Innovation (LoTi) Framework tends to rely on access to and pervasive innovative use of
instructional technology as an indicator of the highest level of technology integration and
literacy. To some degree frameworks of this type assume that using technology will in itself
be beneficial and effective. Clearly, effective and appropriate use of technology does not
happen if students do not have access to learning technologies and do not use them for
educational purposes; however, pervasive technology use does not always mean that
technology is being used effectively or appropriately, nor does pervasive use of technology
necessarily lead to increased learning. The field of adaptive technologies is one area where
educational technology holds much promise. It is widely believed that intelligent tutoring
systems could be used to enhance a teacher’s ability to teach and test students, but
advances in this area have failed to produce the same kinds of formative and diagnostic
feedback that teachers provide (Woolfe, 2010). As a result, recent efforts to identify
appropriate and effective uses for technology have focused more on the pedagogically sound
use of technology to accomplish specific learning objectives (see for example, Koehler &
Mishra, 2008).
To better orient our understanding and evaluation of technology integration efforts at both
classroom and individual levels, integration might best be viewed as progressive steps
toward effective use of technology for the purposes of improving instruction and enhancing
learning. The current status of technology integration efforts could then be evaluated by the
degree to which teachers and students (1) have access to educational technologies, (2) use
technology for instructional purposes, and (3) implement technology effectively to facilitate
learning (Davies, 2011). After first defining technology and technology integration, this
chapter uses this framework for understanding and evaluating current technology
integration efforts in schools, along with the challenges associated with technology
integration.
                                            415
               Foundations of Learning and Instructional Design Technology
However, 60% of teachers providing data for this report also indicated that they and their
students did not often use computers in the classroom during instructional time. In fact,
29% of the teacher respondents reporting daily access to one or more computers also
reported that they rarely or never used computers for instructional purposes. A study
conducted by Shapley, Sheehan, Maloney, & Caranikas-Walker (2010) suggested that
teachers most frequently use the computer technology they had for administrative purposes
(e.g., record keeping), personal productivity (e.g., locating and creating resources), and
communicating with staff and parents. Students’ use of technology most often for
information gathering (i.e., internet searches) or for completing tasks more efficiently by
using a specific technology (e.g., word processing, cloud-based computing) (Bebell & Kay,
2010; Davies, Sprague, & New, 2008; Stucker, 2005).
Thus while the availability of technology in schools may have increased in recent years,
measures of access likely provide an overoptimistic indicator of technology integration. In
fact, some feel that for a variety of reasons the current level of technology access in schools
is far too uneven and generally inadequate to make much of an impact (Bebell & Kay, 2010;
Toch & Tyre, 2010). While some question the wisdom and value of doing so (Cuban, 2006b;
Warschauer & Ames, 2010), many believe we must strengthen our commitment to
improving access to technology by making it an educational funding priority (O’Hanlon,
2009; Livingston, 2008).
                                              416
               Foundations of Learning and Instructional Design Technology
Evidence of academic impact that can be attributed to one-to-one computing initiatives has
been mixed. A few studies have provided evidence that infusing technology into the
classroom has closed the achievement gap and increased academic performance (Shapley,
Sheehan, Maloney, & Caranikas-Walker, 2010; Zucker & Light, 2009); however, Cuban
(2006b) reported that most studies have shown little academic benefit in these areas, and
Vigdor and Ladd (2010) suggested that providing ubiquitous computer access to all students
may actually widen the achievement gap.
Other studies have suggested that additional benefits derived from technology integration
might include increased access to information, increased motivation of students to complete
their studies, and better communication between teachers and students (Bebell & Kay,
2010; Zucker; 2005). However, such studies often referred to the “potential” technology has
for increasing learning, acknowledging that any scholastic benefit technology might produce
depends on factors other than simply having access to technology (Center for Digital
Education, 2008; McMillan-Culp, Honey, & Mandinach, 2005; Woolfe, 2010).
An important factor associated with access is the issue of educational resource availability:
i.e., having access to technological tools without access to the educational resources needed
to utilize those tools effectively. Much of the current work in this area has focused on
                                            417
               Foundations of Learning and Instructional Design Technology
The Open Educational Resource (OER) movement is a worldwide initiative providing free
educational resources intended to facilitate teaching and learning processes (Atkins, Brown,
& Hammond, 2007). A few examples of OER initiatives include the OpenCourseWare
Consortium (www.ocwconsortium.org), the Open Educational Resources Commons
(www.oercommons.org), and the Open Learning Initiative (oli.web.cmu.edu/openlearning),
along with Creative Commons (creativecommons.org), which provides the legal mechanism
for sharing resources. Since one of the largest impediments to technology integration has
been cost (Greaves & Hayes, 2008), some policy analysts have identified the need to provide
free educational resources as essential to the success of any technology integration
mandate; but this idea has been controversial because it means individuals must be willing
to create and provide quality educational resources without compensation. Wiley (2007) has
pointed out that as the OER movement is currently an altruistic endeavor with no proven
cost recovery mechanism, the real costs associated with producing, storing, and distributing
resources in a format that operates equally well across various hardware and operating
system platforms constitute a sustainability challenge for the OER movement. The topic of
open education is discussed more completely in another chapter of this handbook.
Much of the research on increasing technology use in schools has focused on training those
preparing to become teachers, although discussions regarding professional development for
current classroom teachers are becoming more common. Harris, Mishra, and Koehler
(2009) suggested that most professional development in technology for teachers uses one of
five models: (a) software-focused initiatives, (b) demonstrations of sample resources,
lessons, and projects, (c) technology-based educational reform efforts, (d)
structured/standardized professional development workshops or courses, or (e) technology-
focused teacher education courses. According to these authors, there is, as yet, very little
conclusive evidence that any of these models has been successful in substantially increasing
the effective use of technology as measured by increased learning outcomes. Research on
technology integration training for teachers has typically focused on either (a) the
effectiveness of the professional development training methods or (b) the desired objectives
of the professional development.
                                            418
               Foundations of Learning and Instructional Design Technology
Collaborative Environments
Other scholars have found that increasing collaboration among teachers learning to
integrate technology can improve professional development outcomes. In an article on
technology integration, MacDonald (2008) wrote that “to effect lasting educational change”
collaboration for teachers needs to be facilitated in “authentic teacher contexts” (p. 431).
Hur and Brush (2009) added that professional development needs to emphasize the ability
of teachers to share their emotions as well as knowledge. Most collaborative environments
typically only emphasize knowledge sharing when emotion sharing may be linked to
effective professional development.
Mentoring
Similar to research on teacher collaboration, some scholars have discussed the important
role of mentoring in helping teachers gain technology integration skills. Kopcha (2010)
described a systems approach to professional development emphasizing communities of
practice and shifting mentoring responsibilities throughout various stages of the technology
integration adoption process. Kopcha’s model was designed to reduce some of the costs
associated with teacher mentoring—a common criticism of the method. In addition, Gentry,
                                            419
               Foundations of Learning and Instructional Design Technology
Denton, and Kurtz (2008) found in their review of the literature on technology-based
mentoring that while these approaches were not highly used, technology can support
mentoring and improve teachers’ technology integration attitudes and practices. The
authors noted however that many of these effects were self-reported, and not substantiated
through direct observation, nor was there any evidence of subsequent effect on student
learning outcomes.
Because education is a human, and thus a moral, endeavor (Osguthorpe, Osguthorpe, Jacob,
& Davies, 2003), ethical issues frequently surface. Technology integration has caused major
shifts in administrative and pedagogical strategies, thus creating a need for new definitions
and ideas about ethical teaching and learning (Turner, 2005). Although some have
cautioned that ethical issues should be considered before implementing technology-based
assignments (Oliver, 2007), the pressure to increase access to and ubiquitous use of
technology has often outpaced the necessary development of policies and procedures for its
ethical use (Baum, 2005), creating challenges for administrators and teachers who are
integrating it in schools. In many cases unintended negative consequences and ethical
dilemmas have resulted from inappropriate use of technology, and addressing these issues
has required that restrictions be applied. Scholars have specifically mentioned the issues
related to technology-based academic dishonesty, the challenges of technology accessibility
for all students, and the necessity for developing standards for ethical technology use.
According to Akbulut et al. (2008), the most common examples of academic dishonesty
include fraudulence, plagiarism, falsification, delinquency, and unauthorized help. Lin
                                            420
                Foundations of Learning and Instructional Design Technology
(2007) adds copyright infringement and learner privacy issues to the list of unethical
behaviors. Many researchers have discussed the potential for technology to increase these
kinds of academic dishonesty and unethical behaviors. Of concern to many teachers is that
technology provides easy access to information, giving students more opportunities to cheat
(Akbulut, et al., 2008; Chiesl, 2007). King, Guyette, and Piotrowski (2009) found that the
vast majority of undergraduate business students in their study considered it easier to cheat
online than in a traditional classroom setting. Scholars also believed that the increasingly
social and collaborative nature of the Web creates a greater acceptance of cheating by
students (Ma, Lu, Turner, & Wan, 2007). Baum (2005) reported, “Many computer-savvy kids
as well as educators, administrators and parents are unclear about what is and what is not
ethical when dealing with the World Wide Web” (p. 54). Greater opportunities and relaxed
attitudes about cheating have led to issues of plagiarism, among other challenges (de Jagar
& Brown, 2010; Samuels & Blast, 2006). However, other research has contradicted these
conclusions, arguing that online learning does not necessarily facilitate greater dishonesty.
For example, Stuber-McEwen, Wiseley, and Hoggatt (2009) surveyed 225 students and
found that students enrolled in online classes were less likely to cheat than those in regular
classes, leaving the question of whether the online medium facilitates greater cheating still
unanswered.
Accessibility
Accessibility of educational technologies has been recognized as one of the most prominent
ethical concerns facing schools (Lin, 2007). In support of this notion, Garland (2010)
suggested that one of the school principal’s most important roles is ensuring ethical
technology use and guarding against inequities in technology access between groups of
students. However, scholars are not consistent on how accessibility might be a problem.
Traxler (2010), for example, has suggested that unequal access to technology creates a
digital divide that can impede the social progress of some student groups, contributing to a
potential nightmare for institutions. In contrast, Vigdor and Ladd (2010) pointed out that
providing all students with ubiquitous access to educational technology would increase, not
decrease, the achievement gap. In addition to enabling all student groups to have access to
the same educational technologies, institutions must also increase access to assistive
technologies for students with disabilities (Dyal, Carpenter, & Wright, 2009).
A quick search of the internet using the keywords “appropriate technology use policy”
reveals a plethora of documents from schools stipulating the expectation that students use
technology for appropriate educational purposes only. Although technology has the potential
to benefit students in their educational pursuits, making technology ubiquitously available
to students and teachers has the obvious risk that technology will be used inappropriately
on occasion. Thus, most K-12 schools find it necessary, as a moral imperative, to monitor
internet use and restrict student access to this technology and the information the
technology may provide.
Researchers have suggested several possible methods for developing students’ ability to use
technologies more ethically. Bennett (2005) suggested using the National Education
Technology Standards (NETS•S) as a guide (see ISTE 2008b); however, while instructive,
these standards are not specific enough to inform direct strategies. Including ethical
training in teacher professional development has also been explored (Ben-Jacob, 2005;
Duncan & Barnett, 2010). Some academics feel it is the teacher’s responsibility to create a
safe and ethical learning environment with and without technology (Bennett, 2005; Milson,
2002). Several researchers have suggested classroom strategies for teachers. For example,
Kruger (2003) recommended teaching by example and working cyber ethics into
assignments and discussions. Baum (2005) echoed these ideas, adding that teachers should
create acceptable use policies with students and involve them in making pledges concerning
their ethical behavior. Ma, Lu, Turner, and Wan (2007) added that effectively designed
activities that are engaging and relevant to students’ interests encourage more ethical
technology use. Still other scholars have suggested using technology to combat
technology-based dishonesty through anti-plagiarism software (Jocoy & DiBiase, 2006) or
the use of webcams to verify that online students who complete the work are the same
students enrolled in the courses (Saunders, Wenzel, & Stivason, 2008). In addition,
instructors can make it a personal goal to stay abreast of technological developments and
their potential ethical implications (Howell, Sorensen, & Tippets, 2009). Finally, some
researchers have suggested building a supportive social community characterized by a
culture of academic honesty (Ma, Lu, Turner, & Wan, 2007; Wang, 2008) because “students
who feel disconnected from others may be prone to engage in deceptive behaviors such as
academic dishonesty” (Stuber-McEwen et al., 2009, p. 1).
Despite the concern expressed and implied in these suggestions, it is apparent that as a
society we have been slow in developing the ethics, norms, and cultural practices needed to
keep pace with technological advances (Traxler, 2010), leaving many teachers unaware of
proper “technoethics” (Pascual, 2005, p. 73). As we continue to increase access to and use
of technologies, it will become paramount to address these and other ethical considerations
if we are to succeed in promoting effective and sustainable technology integration.
The complex and dynamic nature of the teaching and learning process contributes to the
difficulty of effective technology integration. For example, experts and stakeholders do not
always agree on what to teach and how to teach it (Woolfe, 2010). Also, given the complexity
of most educational tasks, the certainty of accomplishing specific learning goals with or
without technology is often low (Patton, 2011). Thus establishing research-based
technology-enhanced instructional methods and best practices is challenging. However,
emerging research into the effective use of technology has identified some best practices by
considering issues such as (1) the need to focus on pedagogically-sound technology use, (2)
ways to use technology to personalize instruction, and (3) benefits of technology-enabled
assessment. An additional area of concern is the need for systemic changes at the
organizational level.
A major criticism of current teacher professional development efforts is that many of them
have emphasized improving teachers’ attitudes toward technology integration and
increasing their self-efficacy without a strong enough emphasis on pedagogically sound
practice. Some scholars have indicated that professional development goals must shift to
emphasize understanding and utilizing pedagogically sound technology practices (Inan &
Lowther, 2010). For example, Palak and Walls (2009) explained that “future technology
professional development efforts need to focus on integration of technology into curriculum
via student-centered pedagogy while attending to multiple contextual conditions under
which teacher practice takes place” (p. 417). Similarly, Ertmer and Ottenbreit-Leftwich
(2010) argued that “we need to help teachers understand how to use technology to facilitate
meaningful learning, defined as that which enables students to construct deep and
connected knowledge, which can be applied to real situations” (p. 257). According to
Cennamo, Ross, and Ertmer (2010), to achieve technology integration that targets student
learning, teachers need to identify which technologies support specific curricular goals.
Doing so would require understanding the technological tools themselves, as well as the
specific affordances of each tool that would enable students to learn difficult concepts more
readily, hopefully resulting in greater and more meaningful student outcomes (Ertmer &
Ottenbreit-Leftwich, 2010).
Under the TPACK framework (Koehler & Mishra, 2008), teachers are expected to have both
pedagogical knowledge (PK) and content knowledge (CK). In addition, they are expected to
have technological knowledge in general (TK), along
with an understanding of how specific technologies might facilitate student learning of
specific content in a pedagogically sound way (TPCK). TPACK proponents argue that
teachers must understand the connections between these knowledge areas so that
instructional decisions regarding technology integration are pedagogically sound and
content driven.
Since TPACK emerged as a theoretical framework, researchers have explored its potential
professional development applications (Cavin, 2008), as well as ways to assess teachers’
abilities and skills in this area (Kang, Wu, Ni, & Li, 2010; Schmidt et al., 2009). However,
work in this area is still ongoing, and methods and principles for creating effective TPACK-
related professional development and measurement should continue to develop as an area
of research.
Most educators hope to personalize instruction for their students, which generally includes
identifying the needs and capabilities of individual learners; providing flexibility in
scheduling, assignments, and pacing; and making instruction relevant and meaningful for
the individual student (Keefe, 2007). Personalizing instruction usually means rejecting
the “one size fits all” model of education and replacing it with customized
instruction. The idea of personalized or differentiated instruction is not new (Keefe &
Jenkins, 2002; Tomlinson, 2003); however, the potential for technology to facilitate
differentiation is appealing to many educators (Woolfe, 2010).
Much of the educational software currently being used in schools focuses on content
delivery (with some pacing flexibility and assessment) or on knowledge management
systems using information communication technology, but not necessarily customization
that tailors instruction to the individual needs of the learner. Computer software used in
K-12 education has primarily involved drill and practice for developing reading and
mathematics skills (i.e., computer-based instructional products). Improving basic word
processing skills (i.e., typing) is also a prevalent technology-facilitated instructional activity
taking place in schools (Ross, Morrison, & Lowther, 2010). These educational software
programs are intended to supplement the work of teachers rather than replace them and
are typically not integrated directly into classroom instruction.
Since 2002 the cost of testing in schools has increased significantly (U.S. Government
Accountability Office, 2009). Testing costs result primarily from accountability mandates
that emphasize increased achievement on state standardized tests. With the current
imperative to adopt common core standards and establish national online standardized
testing in the U.S., the need for technology-enabled assessment will only increase (Toch &
Tyre, 2010), including the use of computer-adaptive testing techniques and technologies.
The major concern with these initiatives is that schools are not equipped, now or in the
immediate future, to handle the requirements of large-scale online testing in terms of
access to computers and the internet, as well as the necessary networking infrastructure
(Deubel, 2010; Toch & Tyre, 2010).
One of the greatest benefits of online testing is the potential for teachers and individual
students to get immediate results (Deubel, 2010; Toch & Tyre, 2010). State standardized
testing in its current form does little to improve learning for individual students, as the lag
time between taking a test and receiving the results prevents the information from being
useful. In addition, most standardized assessments are not designed to help individual
students (Marzano, 2009). Embedding assessment into the learning activities for both
formative and diagnostic purposes can be facilitated by using technology, but the ability to
do this is at the emergent stage. Critics of technology-enabled assessment have pointed out
that the tools required to accomplish this type of testing are far from adequate.
The desire to benefit from having computerized assessment systems in schools may be
compromised by a lack of quality. For example, while assessment vendors claim high
correlations between the results of computer-scored and human-scored writing tests (Elliot,
2003), critics have described serious flaws in the process (McCurry, 2010; Miller, 2009).
Computer-scoring software can be programmed to identify language patterns,
basic writing conventions, and usage issues; the software cannot, however, read for
meaning, creativity, or logical argument (McCurry, 2010), which are more important
aspects of literacy development. Thus the accuracy and validity of computer-scored writing
assessments are suspect. At this time, schools using these technologies are forced into a
tradeoff between quality assessment and practicality (Miller, 2009). However, computer-
scored writing assessment is an area of great interest in schools.
Another criticism of current assessment trends relates to how tests are developed and used.
Diagnostic formative assessments should be narrower in focus, more specific in content
coverage, and more frequent than the summative standardized testing currently being
mandated for accountability purposes (Cizek, 2010b; Marzano, 2009). For this type of
testing to become a reality, students would need better access to personal computers or
mobile devices, school networks, and the internet (Toch & Tyre, 2010). In addition,
instructional software would have to be aligned with approved learning objectives (Cizek,
2010b). Assessment would need to be integrated into the learning process more thoroughly,
with instructional software designed to monitor and test the progress of students and then
provide prompt feedback to each individual learner (Marzano, 2009). We expect teachers to
provide formative assessment and feedback to their students, but teachers are often
overwhelmed by the task. Technology has the potential to facilitate learning by enabling this
process, but greater advancements in this area are needed to make this a workable reality
(Woolfe, 2010).
While TPACK and other pedagogically driven technology integration efforts are an
improvement in the drive towards more effective use of educational technologies, to focus
on pedagogically sound technology use alone would be insufficient for lasting change. Many
teachers and educational technologists have learned that even when teachers adopt
technologies and learn how to use them in pedagogically appropriate ways, they are
hampered in their integration efforts by the educational system itself.
We find it surprising that scholars appear to be lagging in this effort to understand systemic
influences on technology integration. As Tondeur, van Keer, van Braak, and Valcke (2008)
reported, research on technology in schools is focused mostly on classroom rather than
organizational variables. Additionally, there seems to be a major gap in the literature
regarding the development of a technology integration framework that, like TPACK, is
pedagogically driven but sensitive to systemic variables. We are unsure what an
“organizational TPACK” model would look like, but we believe this to be a potentially fruitful
research endeavor for the next decade.
Conclusions
Legislative mandates for schools to utilize educational technologies in classrooms are based
on the belief that technology can improve instruction and facilitate learning. Another widely
held belief is that students need to develop technology literacy and skills in order to become
productive members of society in a competitive global economy. This chapter explored
school technology integration efforts as progressive steps: increasing access to educational
technologies, increasing ubiquitous technology use, and improving effective technology
implementation.
Over the past decade, one-to-one computing programs have been the most prominent
initiatives used to increase access to technology in schools. These initiatives are designed to
increase the availability of primarily digital technologies and related software for teachers
and students. The biggest access obstacle has been the cost of obtaining and maintaining
technology resources. The Open Educational Resource (OER) movement is attempting to
alleviate some of the cost associated with providing quality educational resources, but OER
programs struggle with sustainability issues. The cost of providing and maintaining
technology as well as the way federal programs fund technology initiatives have often
resulted in uneven levels of access, creating pockets of technology-rich schools.
While technology availability in schools has increased significantly over the past decade,
measures of access likely provide an overenthusiastic impression of progress in effective
technology integration and use. Having greater access to and improved use of technology
(i.e., computer and internet availability) has not always led to substantial increases in
learning. Typically, studies refer to technology’s potential for increasing learning but
acknowledge that any scholastic benefit depends on factors other than simply having
technology access.
Once schools have access to educational technologies, the focus of technology integration
often turns to increasing technology use. Researchers have reported that even when
teachers and students have sufficient access, they do not always use technology for
instructional purposes. Issues that hinder technology use in schools include social and
ethical concerns, such as inequitable access to technology for all students, which leads
some teachers to avoid requiring students to use technology for assignments at home.
Many schools also find it necessary to restrict the use of various technologies due to
potential negative consequences and ethical dilemmas, considering it a moral imperative to
monitor internet use and limit student access to this technology.
Finally, improving effective technology implementation requires that teachers understand
the connections between the specific affordances of various technologies and the ways each
tool might best be used to facilitate specific content learning.
Key Takeaways
Application Exercises
          After reading the chapter, what do you believe to be the number one barrier
          to having technology used in the classroom? Share how you would overcome
          this barrier.
          Think about how you currently use technology in your formal education
          settings. How is it being used effectively? How could it be integrated more
          effectively?
          If you were to hold a professional development for teachers to help increase
          skills and self-efficacy in their use of technology in the classroom, what would
          that training look like? Use research from the chapter to support your plan.
References
Akbulut, Y., Sendag, S., Birinci, G., Kilicer, K., Sahin, M. C., & Odabasi, H. F. (2008).
Exploring the types and reasons of internet-triggered academic dishonesty among Turkish
undergraduate students: Development of internet-triggered academic dishonesty scale
(ITADS). Computers & Education, 51(1), 463–473.
Annetta, L., Murray, M., Gull Laird, S., Bohr, S., & Park, J. (2008). Investigating student
attitudes toward a synchronous, online graduate course in a multi-user virtual learning
environment. Journal of Technology and Teacher Education, 16(1), 5–34.
Atkins, D. E., Seely Brown, J., & Hammond, A. L. (2007). A review of the Open Educational
Resources (OER) movement: Achievements, challenges, and new opportunities. A Report to
The William and Flora Hewlett Foundation. Retrieved from
http://www.oerderves.org/wp-content/uploads/2007/03/a-review-of-the-open-educational-res
ources-oer-movement_final.pdf
Bahrampour, T. (2006, December 9). For some, laptops don’t compute: Virginia school
pushes wireless learning. Washington Post, p. 1.
Bai, H., & Ertmer, P. (2008). Teacher educators’ beliefs and technology uses as predictors of
preservice teachers’ beliefs and technology attitudes. Journal of Technology and Teacher
Education, 16(1), 93–112.
Bauer, J., & Kenton, J. (2005). Toward technology integration in the schools: Why it isn’t
happening. Journal of Technology and Teacher Education, 13(4), 519–546.
Baum, J. J. (2005). CyberEthics: The new frontier. TechTrends: Linking Research & Practice
to Improve Learning, 49(6), 54–56.
Bausell, C.V. (2008). Tracking U.S. trends. Education Week: Technology counts, 27(30),
39–42.
Bebell, D., & Kay, R. (2010). One to one computing: A summary of the quantitative results
from the Berkshire Wireless Learning Initiative. Journal of Technology, Learning, and
Assessment, 9(2), 4–58.
Bennett, L. (2005). Guidelines for using technology in the social studies classroom. Social
Studies, 96(1), 38.
Ben-Jacob, M. (2005). Integrating computer ethics across the curriculum: A case study.
Educational Technology & Society, 8(4), 198–204.
Borup, J., West, R. E., & Graham, C. R. (2012). Improving online social presence through
asynchronous video. Internet and Higher Education. doi: 10.1016/j.iheduc.2011.11.001
Calandra, B., Brantley-Dias, L., Lee, J. K., & Fox, D. L. (2009). Using video editing to
cultivate novice teachers’ practice. Journal of Research on Technology in Education, 42(1),
73–94.
Cheesman, E., Winograd, G., & Wehrman, J. (2010). Clickers in teacher education: Student
perceptions by age and gender. Journal of Technology and Teacher Education, 18(1), 35–55.
Choy, D., & Wong, A. F. L. (2009). Student teachers’ intentions and actions on integrating
technology into their classrooms during student teaching: A Singapore study. Journal of
Research on Technology in Education, 42(2), 175–195.
Cennamo, K. S., Ross, J. D., & Ertmer, P. A. (2010). Technology integration for meaningful
classroom use: A standards-based approach. Belmont, CA: Wadsworth, Cengage Learning.
Center for Digital Education (2008). A complete guide to one-to-one computing. Retrieved
from http://www.one-to-oneinstitute.org/files/CDE07_Book_MPC_K12.pdf
Cooper, M. (2010). Charting a course for software licensing and distribution. Proceedings of
the 38th Annual Fall Conference on SIGUCCS 2010. doi:10.1145/1878335.1878375
Cuban, L. (2006a, October 18). Commentary: The laptop revolution has no clothes.
Education Week, p. 29. Retrieved from
http://www.edweek.org/ew/articles/2006/10/18/08cuban.h26.html
Cuban, L. (2006b, October 31). 1:1 laptops transforming classrooms: Yeah, sure. New York,
NY: Teachers College Record. Retrieved from
http://www.tcrecord.org/Content.asp?ContentId=12818
Davies, R. (2003). Learner intent and online courses. Journal of Interactive Online Learning,
2(1), 1–10. Retrieved from http://www.ncolr.org/jiol/issues/pdf/2.1.4.pdf
Davies, R., Sprague, C., & New, C. (2008). Integrating technology into a science classroom:
An evaluation of inquiry-based technology integration. In D.W. Sunal, E.L. Wright, & C.
Sundberg (Eds.), The impact of technology and the laboratory on K–16 science learning
series: Research in science education (pp. 207–237). Charlotte, NC: Information Age
Publishing, Inc.
Derham, C., & DiPerna, J. (2007). Digital professional portfolios of preservice teaching: An
initial study of score reliability and validity. Journal of Technology and Teacher Education,
15(3), 363–381.
Deubel, P. (2010). Are we ready for testing under common core state standards? The
Journal. Retrieved from
http://thejournal.com/articles/2010/09/15/are-we-ready-for-testing-under-common-core-state
-standards.aspx
Dyal, A., Carpenter, L. B., & Wright, J. V. (2009). Assistive technology: What every school
leader should know. Education, 129(3), 556–560.
Elliot, S. (2003). Intellimetric: From here to validity. In M. D. Shermis & J. Burstein (Eds.),
Automated essay scoring: A cross-disciplinary perspective (pp. 71–86). Mahwah, NJ:
Lawrence Erlbaum Associates, Inc.
* Ertmer, P.A., & Ottenbreit-Leftwich, A.T. (2010). Teacher technology change: How
knowledge, confidence, beliefs, and culture intersect. Journal of Research on
Technology in Education, 42(3), 255–284.
Facer, K., & Sandford, R. (2010). The next 25 years? Future scenarios and future directions
for education and technology. Journal of Computer Assisted Learning, 26(1), 74–93.
Fletcher, G., & Lu, J. (2009). Human computing skills: Rethinking the K–12 experience.
Communications of the ACM, 52(2), 23–25. doi:10.1145/1461928.1461938
Garland, V. E. (2010). Emerging technology trends and ethical practices for the school
principal. Journal of Educational Technology Systems, 38(1), 39–50.
Gentry, L. B., Denton, C. A., & Kurz, T. (2008). Technologically-based mentoring provided to
teachers: A synthesis of the literature. Journal of Technology and Teacher Education, 16(3),
339–373.
Gibson, S., & Kelland, J. (2009). Connecting preservice teachers with children using blogs.
Journal of Technology and Teacher Education, 17(3), 299–314.
* Gray, L., Thomas, N., & Lewis, L. (2010). Teachers’ use of educational technology in U.S.
public schools: 2009 (NCES 2010-040). National Center for Education Statistics, Institute of
Education Sciences, U.S. Department of Education. Washington, DC.
Greaves, T. W., & Hayes, J. (2008). America’s digital schools 2008: The six trends to watch.
Encinitas, CA: Greaves Group and Hayes Connection.
Gunn, C. (2010). Sustainability factors for e-learning initiatives. ALT-J: Research in Learning
Technology, 18(2), 89–103.
Harris, J., Mishra, P., & Koehler, M. (2009). Teachers’ technological pedagogical content
knowledge and learning activity types: Curriculum-based technology Integration Reframed.
Journal of Research on Technology in Education, 41(4), 393–416.
Hohlfeld, T. N., Ritzhaupt, A. D., Barron, A. E., & Kemker, K. (2008). Examining the digital
divide in P-12 public schools: Four-year trends for supporting ICT literacy in Florida.
Computers & Education, 51(4), 1648–1663.
Howell, S. L., Sorensen, D., & Tippets, H. R. (2009). The new (and old) news about cheating
for distance educators. Online Journal of Distance Learning Administration, 12(3).
Hur, J. W., & Brush, T. A. (2009). Teacher participation in online communities: Why do
teachers want to participate in self-generated online communities of K-12 teachers? Journal
of Research on Technology in Education, 41(3), 279–304.
Huysman, M., Steinfield, C., David, K., Poot, J., & Mulder, I. (2003). Virtual teams and
the appropriation of communication technology: Exploring the concept of media stickiness.
Computer Supported Cooperative Work, 12, 411–436.
Inan, F.A., & Lowther, D.L. (2010). Factors affecting technology integration in K-12
classrooms: A path model. Educational Technology Research and Development, 58(2),
137–154.
Jager, K. de, & Brown, C. (2010). The tangled web: Investigating academics’ views of
plagiarism at the University of Cape Town. Studies in Higher Education, 35(5), 513–528.
Jocoy, C., & DiBiase, D. (2006). Plagiarism by adult learners online: A case study in
detection and remediation. International Review of Research in Open and Distance
Learning, 7(1), 1–15.
Jones, N. B., & Graham, C. (2010). Improving hybrid and online course delivery emerging
technologies. In Y. Kats (Ed.), Learning management system technologies and software
solutions for online teaching: Tools and applications (pp. 239-258).
doi:10.4018/978-1-61520-853-1.ch014
Kang, M., Heo, H., Jo, I., Shin, J., & Seo, J. (2010-2011). Developing an educational
performance indicator for new millennium learners. Journal of Research on Technology in
Education, 43(2), 157–170.
Kang, J.J., Wu, M.L., Ni, X., & Li, G. (2010). Developing a TPACK assessment framework for
evaluating teachers’ knowledge and practice to provide ongoing feedback. In Proceedings of
World Conference on Educational Multimedia, Hypermedia and Telecommunications 2010
(pp. 1980–1983). Chesapeake, VA: AACE.
Keefe, J., & Jenkins, J. (2002). Personalized instruction. Phi Delta Kappan, 83(6),
440–448.
King, C. G., Jr., Guyette, R. W., Jr., & Piotrowski, C. (2009). Online exams and cheating: An
empirical analysis of business students’ views. Journal of Educators Online, 6(1), 1–11.
Koehler, M.J., Mishra, P., & Yahya, K. (2007). Tracing the development of teacher
knowledge in a design seminar: Integrating content, pedagogy, & technology. Computers
and Education, 49(3), 740–762.
* Koehler, M.J., & Mishra, P. (2008). Introducing TPCK. In AACTE Committee on Innovation
and Technology (Ed.), The handbook of technological pedagogical content knowledge
(TPCK) for educators. New York, NY: American Association of Colleges of Teacher
Education and Routledge.
Kruger, R. (2003). Discussing cyber ethics with students is critical. Social Studies, 94(4),
188–189.
Lambert, J., Gong, Y., & Cuper, P. (2008). Technology, transfer, and teaching: The impact of
a single technology course on preservice teachers’ computer attitudes and ability. Journal of
Technology and Teacher Education, 16(4), 385–410.
Li, S.C. (2010). Social capital, empowerment and educational change: A scenario of
permeation of one-to-one technology in school. Journal of Computer Assisted Learning,
26(4), 284–295.
Lin, H. (2007). The ethics of instructional technology: Issues and coping strategies
experienced by professional technologists in design and training situations in higher
education. Educational Technology Research and Development, 55(5), 411–437.
Lowther, D.L., & Ross, S.M. (2012). Instructional designers and P-12 technology integration.
In R.A. Reiser & J.V. Dempsey (Eds.), Trends and issues in instructional design and
technology (pp. 208–217). Boston, MA: Pearson Education, Inc.
Ma, H., Wan, G., & Lu, E. Y. (2008). Digital cheating and plagiarism in schools. Theory Into
Practice, 47(3), 197–203.
Marshall, S. (2010). Change, technology and higher education: Are universities capable of
organizational change? ALT-J: Research in Learning Technology, 18(3), 179–192.
McCaughtry, N., & Dillon, S. R. (2008). Learning to use PDAs to enhance teaching: The
perspectives of preservice physical educators. Journal of Technology and Teacher
Education, 16(4), 483–508.
McCurry, D. (2010). Can machine scoring deal with broad and open writing tests as well as
human readers? Assessing Writing, 15(2), 118–129. doi:10.1016/j.asw.2010.04.002
McMillan-Culp, K., Honey, M., & Mandinach, E. (2005). A retrospective on twenty years of
educational technology policy. Journal of Educational Computing Research, 32(3), 279–307.
Retrieved from http://courses.ceit.metu.edu.tr/ceit626/week12/JECR.pdf
Milson, A. J., & Chu, B.W. (2002). Character education for cyberspace: Developing good
netizens. Social Studies, 93(3), 117–119.
Nagel, D. (2010, May 5). Report: Mobile and classroom technologies surge in schools. The
Journal. Retrieved from
http://thejournal.com/articles/2010/05/05/report-mobile-and-classroom-technologies-surge-in
-schools.aspx
Niederhauser, D. S., & Lindstrom, D. L. (2006). Addressing the NETS for students through
constructivist technology use in K-12 classrooms. Journal of Educational Computing
Research, 34(1), 91–128.
Osguthorpe, R.T., Osguthorpe, R.D., Jacob, J., & Davies, R. (2003). The moral design of
instruction. Educational Technology, 43(2), 19–23.
Overbaugh, R., & Lu, R. (2008). The impact of a NCLB-EETT funded professional
development program on teacher self-efficacy and resultant implementation. Journal of
Research on Technology in Education, 41(1), 43–61.
Owusu, K.A., Monney, K.A., Appiah, J.Y., & Wilmot, E.M. (2010). Effects of computer-
assisted instruction on performance of senior high school biology students in Ghana.
Computers & Education, 55(2), 904–910. doi:10.1016/j.compedu.2010.04.001
Palak, D., & Walls, R. T. (2009). Teachers’ beliefs and technology practices: A mixed-
methods approach. Journal of Research on Technology in Education, 41(4), 417–441.
Pascual, P.C. (2005). Educational technoethics: As a means to an end. AACE Journal, 13(1),
73–90.
Rickard, A., McAvinia, C., & Quirke-Bolt, N. (2009). The challenge of change: Digital video-
analysis and constructivist teaching approaches on a one year preservice teacher education
program in Ireland. Journal of Technology and Teacher Education, 17(3), 349–367.
Richey, R. C., Silber, K. H., & Ely, D. P. (2008). Reflections on the 2008 AECT definitions of
the field. TechTrends, 52(1), 24–25.
Ross, S.M., & Lowther, D.L. (2009). Effectively using technology in education. Better
Evidence-Based Education, 2(1), 20–21.
Ross, S.M., Morrison, G., & Lowther, D.L. (2010). Educational technology research past and
present: Balancing rigor and relevance to impact school learning. Contemporary
Educational Technology, 1(1), 17–35.
Russell, M., Bebell, D., O’Dwyer, L., & O’Connor, K. (2003). Examining teacher technology
use: Implications for preservice and inservice teacher preparation. Journal of Teacher
Education, 54(4), 297–310.
Russell, M., & Douglas, J. (2009). Comparing self-paced and cohort-based online courses for
Samuels, L.B., & Bast, C.M. (2006). Strategies to help legal studies students avoid
plagiarism. Journal of Legal Studies Education, 23(2), 151–167.
Sangra, A., & Gonzalez-Sanmamed, M. (2010). The role of information and communication
technologies in improving teaching and learning processes in primary and secondary
schools. ALT-J: Research in Learning Technology, 18(3), 207–220.
Saunders, G., Wenzel, L., & Stivason, C.T. (2008). Internet courses: Who is doing the work?
Journal of College Teaching & Learning, 5(6), 25–35.
Shapley, K.S., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2010). Evaluating the
implementation fidelity of technology immersion and its relationship with student
achievement. Journal of Technology, Learning, and Assessment, 9(4), 6–10.
Schmidt, D.A., Baran, E., Thompson, A.D., Mishra, P., Koehler, M.J., & Shin, T.S. (2009).
Technological pedagogical content knowledge (TPACK): The development and validation of
an assessment instrument for preservice teachers. Journal of Research on Technology in
Education, 42(2), 123–149.
Stuber-McEwen, D., Wiseley, P., & Hoggatt, S. (2009). Point, click, and cheat: Frequency
and type of academic dishonesty in the virtual classroom. Online Journal of Distance
Learning Administration, 12(3), 1–10.
Stucker, H. (2005). Digital “natives” are growing restless. School Library Journal, 51(6),
9–10.
Traxler, J. (2010). Students and mobile devices. ALT-J: Research in Learning Technology,
18(2), 149–160.
Toch, T., & Tyre, P. (2010). How will the common core initiative impact the testing
industry? Washington, DC: Thomas B. Fordham Institute. Retrieved from
http://spencer.jrn.columbia.edu/wp-content/uploads/2010/03/Tyre_Fordham.pdf
Tomlinson, C. (2003). Fulfilling the promise of the differentiated classroom: Strategies and
tools for responsive teaching. Alexandria, VA: Association for Supervision and Curriculum
Development.
Tondeur, J., van Keer, H., van Braak, J., & Valcke, M. (2008). ICT integration in the
classroom: Challenging the potential of a school policy. Computers & Education, 51(1),
212–223.
Topol, B., Olson, J., & Roeber, E. (2010). The cost of new higher quality assessments: A
comprehensive analysis of the potential costs for future state assessments. Stanford, CA:
Stanford University, Stanford Center for Opportunity Policy in Education. Retrieved from
http://edpolicy.stanford.edu/pages/pubs/pub_docs/assessment/scope_pa_topol.pdf
Turner, C.C. (2005). A new honesty for a new game: Distinguishing cheating from learning
in a web-based testing environment. Journal of Political Science Education, 1(2), 163–174.
Van Dam, A., Becker, S., & Simpson, R.M. (2007). Next-generation educational software:
Why we need it & a research agenda for getting it. SIGGRAPH ’07: ACM SIGGRAPH 2007
Courses (p. 32). New York, NY: ACM.
Vandewaetere, M., Desmet, P., & Clarebout, G. (2011). Review: The contribution of learner
characteristics in the development of computer-based adaptive learning environments.
Computers in Human Behavior, 27(1), 118–130. doi:10.1016/j.chb.2010.07.038
Vigdor, J.L., & Ladd, H.F. (2010). Scaling the digital divide: Home computer technology and
student achievement. Working Paper no. 16078, National Bureau of Economic Research,
Cambridge. Retrieved from http://www.nber.org/papers/w16078
Warschauer, M., & Matuchniak, T. (2010). New technology and digital worlds: Analyzing
evidence of equity in access, use, and outcomes. Review of Research in Education, 34(1),
179–225. doi: 10.3102/0091732X09349791
Warschauer, M., & Ames, M. (2010). Can one laptop per child save the world’s poor? Journal
of International Affairs, 64(1), 33–51.
Warschauer, M. (2010, May). Netbooks and open source software in one-to-one programs.
West, R.E., Rich, P.J., Shepherd, C.E., Recesso, A., & Hannafin, M.J. (2009). Supporting
induction teachers’ development using performance-based video evidence. Journal of
Technology and Teacher Education, 17(3), 369–391.
West, R.E., Waddoups, G., & Graham, C.R. (2007). Understanding the experiences of
instructors as they adopt a course management system. Educational Technology, Research,
and Development, 55(1), 1–26. doi: 10.1007/s11423-006-9018-1
Weston, M.E., & Bain, A. (2010). The end of techno-critique: The naked truth about 1:1
laptop initiatives and educational change. Journal of Technology, Learning, and Assessment,
9(6), 4–19. Retrieved from http://www.jtla.org
Yang, F. (2010). The ideology of intelligent tutoring systems. ACM Inroads, 1(4), 63–65.
doi:10.1145/1869746.1869765
Zucker, A.A., & Light, D. (2009). Laptop programs for students. Science, 323(5910), 82–85.
Retrieved from http://www.sciencemag.org/content/323/5910/82.full
                         Randall S. Davies
                           Richard E. West
He tweets @richardewest, and his research can be found on Google Scholar and
his website: http://richardewest.com.
                                           32
Royce Kimmons
Editor’s Note
    The following is excerpted and adapted from Dr. Royce Kimmons’s open textbook,
    K-12 Technology Integration [https://edtechbooks.org/-UeB]. It is licensed CC BY-
    SA 3.0 [https://edtechbooks.org/-AYY].
TPACK
TPACK is the most commonly used technology integration model amongst educational
researchers. The goal of TPACK is to provide educators with a framework that is useful for
understanding technology’s role in the educational process. At its heart, TPACK holds that
educators deal with three types of core knowledge on a daily basis: content knowledge,
pedagogical knowledge, and technological knowledge. Content knowledge is knowledge of
one’s content area, such as science, math, or social studies. Pedagogical knowledge is
knowledge of how to teach. And technological knowledge is knowledge of how to use
technology tools.
These core knowledge domains, however, interact with and build on each other in important
and complicated ways. For instance, if you are going to teach kindergarten mathematics,
you must understand both mathematics (i.e., content knowledge) and how to teach (i.e.,
pedagogical knowledge), but you must also understand the relationship between pedagogy
and the content area. That is, you must understand how to teach mathematics, which is very
different from teaching other subject areas, because the pedagogical strategies you use to
teach mathematics will be specific to that content domain. When we merge content
knowledge and pedagogical knowledge together, a hybrid domain emerges called
pedagogical content knowledge. Pedagogical content knowledge includes knowledge about
content and pedagogy, but it also includes the specific knowledge necessary to teach the
specified content in a meaningful way.
TPACK goes on to explain that when we try to integrate technology into a classroom setting,
we are not merely using technological knowledge, but rather, we are merging technological
knowledge with pedagogical content knowledge to produce something new. TPACK, or
technological pedagogical content knowledge, is the domain of knowledge wherein
technology, pedagogy, and content meet to create a meaningful learning experience. From
this, educators need to recognize that merely using technology in a classroom is not
sufficient to produce truly meaningful technology integration. Rather, teachers must
understand how technology, pedagogy, and content knowledge interact with one another to
produce a learning experience that is meaningful for students in specific situations.
RAT and SAMR
RAT is an acronym for replace, amplify, and transform, and the model holds that when
technology is used in a teaching setting, technology is used either to replace a traditional
approach to teaching (without any discernible difference on student outcomes), to amplify
the learning that was occurring, or to transform learning in ways that were not possible
without the technology (Hughes, Thomas, & Scharber, 2006). Similarly, SAMR is an
acronym for substitution, augmentation, modification, and redefinition (Puentedura, 2003).
To compare it to RAT, substitution and replacement both deal with technology use that
merely substitutes or replaces previous practice with no functional improvement.
Redefinition and transformation both deal with technology use that empowers teachers and
students to learn in new, previously impossible ways.
The difference between these two models rests in the center letters, wherein RAT’s
amplification is separated into two levels as SAMR’s augmentation and modification. All of
these levels deal with technology use that functionally improves what is happening in the
classroom, but in the SAMR model, augmentation represents a small improvement, and
modification represents a large improvement.
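The correspondence between the two models can be sketched as a simple lookup. This is only a hypothetical illustration for this comparison; neither RAT nor SAMR defines any code, and the names below are invented:

```python
# Hypothetical sketch: aligning SAMR levels with RAT categories as
# described above. Neither model prescribes an implementation; this
# mapping simply encodes the comparison in the text.
SAMR_TO_RAT = {
    "substitution": "replace",    # no functional improvement
    "augmentation": "amplify",    # small functional improvement
    "modification": "amplify",    # large functional improvement
    "redefinition": "transform",  # previously impossible practice
}

def rat_category(samr_level: str) -> str:
    """Return the RAT category corresponding to a SAMR level."""
    return SAMR_TO_RAT[samr_level.lower()]
```

Note how both of SAMR's middle levels collapse into RAT's single amplification category, which is exactly the difference described above.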
Both of these models are helpful for leading educators to consider the question: What effect
is using the technology having on my practice? If the technology is merely replacing or
substituting previous practice, then it is a less meaningful use of technology, whereas
technology use that transforms or redefines classroom practice is considered to be more
valuable.
PICRAT
Building on the ideas presented in the models above, we will now provide one final model
that may serve as a helpful starting point for teachers to begin thinking about technology
integration. PIC-RAT assumes that there are two foundational questions that teachers must
ask about any technology use in their classrooms:
1. What is the students’ relationship to the technology? (Passive, Interactive, Creative)
2. How is the teacher’s use of technology influencing traditional practice? (Replace, Amplify, Transform)
The provided illustration maps these two questions on a two-dimensional grid, and by
answering these two questions, teachers can get a sense for where any particular practice
falls.
Figure 3. PIC-RAT
For instance, if a history teacher shifts from writing class notes on a chalkboard to providing
these notes in a PowerPoint presentation, this would likely be categorized in the bottom-left
(PR) section of the grid, because the teacher is using the technology to merely replace a
traditional practice, and the students are passively taking notes on what they see. In
contrast, if an English teacher guides students in developing a creative writing blog, which
they use to elicit feedback from peers, parents, and the online community on their short
stories, this would likely be categorized in the top-right (CT) section, because the teacher is
using the technology to transform the practice to do something that would have been
impossible without the technology, and the students are using the technology as a tool for
creation.
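The grid itself can be thought of as the product of the two dimensions. As a rough illustration (the model is a reflective tool, not an algorithm, and the function below is invented for this sketch), each practice is located by answering the two questions:

```python
# Hypothetical sketch of the PIC-RAT grid. Each classroom practice is
# located by answering the two foundational questions; the cell label
# is simply the first letter of each answer.
STUDENT_RELATIONSHIP = ("passive", "interactive", "creative")  # PIC axis
TEACHER_USE = ("replace", "amplify", "transform")              # RAT axis

def picrat_cell(student_relationship: str, teacher_use: str) -> str:
    """Return the grid cell label, e.g. ('creative', 'transform') -> 'CT'."""
    if student_relationship not in STUDENT_RELATIONSHIP:
        raise ValueError(f"unknown student relationship: {student_relationship}")
    if teacher_use not in TEACHER_USE:
        raise ValueError(f"unknown teacher use: {teacher_use}")
    return (student_relationship[0] + teacher_use[0]).upper()
```

On this sketch, the chalkboard-to-PowerPoint example lands at PR (passive, replace), and the creative writing blog lands at CT (creative, transform).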
Experience has shown that as teachers begin using technologies in their classrooms, they
will typically begin doing so in a manner that falls closer to the bottom-left of the grid.
However, many of the most exciting and valuable uses of technology for teaching rest firmly
in the top-most and right-most sections of this grid. For this reason, teachers need to be
encouraged to evolve their practice to continually move from the bottom-left (PR) to the top-
right (CT) of the grid.
Further Resource
For more information on the PIC-RAT model, please view this video
[https://youtu.be/bfvuG620Bto], scripted by Dr. Kimmons and Dr. Richard E. West
of Brigham Young University.
                           Royce Kimmons
                                            34
Editor’s Note
     This article, the third in a series of four installments, begins by discussing the
     need for paradigm change in education and for a critical systems approach to
     paradigm change, and examines current progress toward paradigm change.
     Then it explores what a learner-centered, Information-Age educational system
     should be like, including the APA learner-centered psychological principles, the
     National Research Council’s findings on how people learn, the work of
     McCombs and colleagues on learner-centered schools and classrooms,
     personalized learning, differentiated instruction, and brain-based instruction.
     Finally, one possible vision of a learner-centered school is described.
how states can help their school districts to engage in paradigm change. This article
describes the nature of the learner-centered paradigm of education, and it addresses why
this paradigm is needed. The final article (November–December) will explore a full range of
roles that technology might play in this new paradigm of education.
Introduction
The dissatisfaction with and loss of trust in schools that we are experiencing these days are
clear hallmarks of the need for change in our school systems. The strong push for a learner-
centered paradigm of instruction in today’s schools reflects our society’s changing
educational needs. We educators must help our schools to move into the new learner-
centered paradigm of instruction that better meets the needs of individual learners, of their
workplaces and communities, and of society in general. It is also important that we
educators help the transformation occur as effectively and painlessly as possible. This
article begins by addressing the need for transforming our educational systems to the
learner-centered paradigm. Then it describes the nature of the learner-centered paradigm.
Whereas society has shifted from the Industrial Age into what many call the ‘Information
Age’ (Toffler, 1984; Reigeluth, 1994; Senge, Cambron-McCabe, Lucas, Smith, Dutton, &
Kleiner, 2000), current schools were established to fit the needs of an Industrial-Age society
(see Table 1). This factory-model, Industrial-Age school system has highly
compartmentalized learning into subject areas, and students are expected to learn the same
content in the same amount of time (Reigeluth, 1994). The current school system strives for
standardization and was not designed to meet individual learners’ needs. Rather, it was
designed to sort students into laborers and managers (see Table 2). Students are forced
to move on with the rest of the class regardless of whether they have learned the
material, and thus many accumulate learning deficits and eventually drop out.
Table 1. Key markers of Industrial vs. Information Age education (Reigeluth, 1994).
Industrial Age                                       Information Age
Compliance                                           Initiative
Conformity                                           Diversity
One-way communications                               Networking
Compartmentalization (division of labor)             Holism (integration of tasks)
Systemic educational change draws heavily from the work on critical systems theory (CST)
(Flood & Jackson, 1991; Jackson, 1991a, 1991b; Watson, Watson, & Reigeluth, 2008). CST
has its roots in systems theory, which was established in the mid-twentieth century by a
multi-disciplinary group of researchers who shared the view that science had become
increasingly reductionist and the various disciplines isolated. While the term system has
been defined in a variety of ways by different systems scholars, the central notion of systems
theory is the importance of relationships among elements comprising a whole.
CST draws heavily on the philosophy of Habermas (1973, 1984, 1987). The critical systems
approach to social systems is of particular importance when considering systems wherein
inequality of power exists in relation to opportunity, authority, and control. In the 1980s,
CST came to the forefront (Jackson, 1985; Ulrich, 1983), influencing systems theory into the
1990s (Flood & Jackson, 1991; Jackson, 1991a, 1991b). Liberating Systems Theory uses a
post-positivist approach to analyze social conditions in order to liberate the oppressed, while
also seeking to liberate systems theory itself from self-imposed insularity, from
internal localized subjugations in discourse, and from
the inadequacies of objectivist and subjectivist approaches (Flood, 1990). Jackson (1991b)
explains that CST embraces five key commitments.
Banathy (1991) and Senge et al. (2000) apply systems theory to the design of educational
systems. Banathy (1992) suggests examining systems through three lenses: a “still picture
lens” to appreciate the components comprising the system and their relationships; a
“motion picture lens” to recognize the processes and dynamics of the system; and a “bird’s
eye view lens” to be aware of the relationships between the system and its peers and
suprasystems. Senge et al. (2000) apply systems theory specifically to organizational
learning, stating that the organization can learn to work as an interrelated, holistic learning
community, rather than functioning as isolated departments.
There are also stories of school districts making fundamental changes in schools based on
the application of systemic change ideas. One of the best examples of systemic
transformation is the Chugach School District (CSD). The students in CSD are scattered
throughout 22,000 square miles of remote area in South-central Alaska. The district was in
crisis twelve years ago due to low student reading ability, and the school district committed
to a systemic transformation effort. Battino and Clem (2006) explain how the CSD’s use of
individual learning plans, student assessment binders, student learning profiles, and student
life-skills portfolios support and document progress toward mastery in all standards for each
learner. The students are given the flexibility to achieve levels at their own pace, not having
to wait for the rest of the class or being pushed into learning beyond their developmental
level. Graduation standards exceed state requirements; students may take extra time to
reach that level if necessary but must meet its high rigor. Student academic performance
skyrocketed as a result of these systemic changes (Battino & Clem, 2006).
Caine (2006) also found strong positive changes through systemic educational change in
With significant research showing that instruction should be learner-centered to meet all
students’ needs, there have been several efforts to synthesize the knowledge on learner-
centered instruction. First, the American Psychological Association conducted wide-ranging
research to identify learner-centered psychological principles based on educational research
(American Psychological Association’s Board of Educational Affairs, 1997; Lambert &
McCombs, 1998). The report presents 12 principles and provides the research evidence that
supports each principle. It categorizes the psychological principles into four areas: (1)
cognitive and metacognitive, (2) motivational and affective, (3) developmental and social,
and (4) individual difference factors that influence learners and learning (see Table 3).
Another important line of research was carried out by the National Research Council to
synthesize knowledge about how people learn (Bransford et al., 1999). A two-year study was
conducted to develop a synthesis of new approaches to instruction that “make it possible for
the majority of individuals to develop a deep understanding of important subject matter” (p.
6). Their analysis of a wide range of research on learning emphasizes the importance of
customization and personalization in instruction for each individual learner, self-regulated
learners taking more control of their own learning, and facilitating deep understanding of
the subject matter. They describe the crucial need for, and characteristics of, learning
environments that are learner-centered and learning-community centered.
McCombs and colleagues (Baker, 1973; Lambert & McCombs, 1998; McCombs & Whisler,
1997) also address these new needs and ideas for instruction that supports all students.
They identify two important features of learner-centered instruction: a focus on
individual learners (their experiences, perspectives, backgrounds, interests, and needs)
and a focus on learning (the best available knowledge about how learning occurs and what
promotes it).
This twofold focus on learners and learning informs and drives educational decision-making
processes. In learner-centered instruction, learners are included in these educational
decision-making processes, the diverse perspectives of individuals are respected, and
learners are treated as co-creators of the learning process (McCombs & Whisler, 1997).
Personalized Learning
Personalized learning focuses on helping each child engage in the learning process in the
most productive and meaningful way so as to optimize each child’s learning and success.
Personalized learning was cultivated in
the 1970s by the National Association of Secondary School Principals (NASSP) and the
Learning Environments Consortium (LEC) International, and was adopted by the special
education movement. It is based upon a solid foundation of the NASSP’s educational
research findings and reports as to how students learn most successfully (Keefe, 2007;
Keefe & Jenkins, 2002), including a strong emphasis on parental involvement, more teacher
and student interaction, attention to differences in personal learning styles, smaller class
sizes, choices in personal goals and instructional methods, student ownership in setting
goals and designing the learning process, and technology use (Clarke, 2003). Leaders in
other fields, such as businessman Wayne Hodgins, have presented the idea that learning
will soon become personalized, where the learner both activates and controls her or his own
learning environment (Duval, Hodgins, Rehak, & Robson, 2004).
Differentiated Instruction
The recent movement in differentiated instruction is also a response to the need for a
learning-focused (as opposed to a sorting-focused) approach to instruction and education in
schools. Differentiated instruction is an approach that enables teachers to plan strategically
to meet the needs of every student. It is deeply grounded in the principle that there is
diversity within any group of learners and that teachers should adjust students’ learning
experiences accordingly (Tomlinson, 1999, 2001, 2003). This draws from the work of
Vygotsky (1986), especially his “zone of proximal development” (ZPD), and from classroom
researchers. Researchers found that with differentiated instruction students learned more
and felt better about themselves and the subject area being studied (Tomlinson, 2001).
Evidence further indicates that students are more successful and motivated in schools if
they learn in ways that are responsive to their readiness levels (Vygotsky, 1986), personal
interests, and learning profiles (Csikszentmihalyi, 1990; Sternberg, Torff, & Grigorenko,
1998). The goal of differentiated instruction is to address these three characteristics for
each student (Tomlinson, 2001, 2003).
Brain-Based Learning
Another area of study that deepens our understanding of how people learn is brain
research, which describes how the brain functions. Caine and colleagues (1997, 2005,
2006) provide a useful summary of work on how the brain functions in the process of
learning through the 12 principles of brain-based learning. Brain-based learning begins
when learners are encouraged to actively immerse themselves in their world and their
learning experiences. In a school or classroom where brain-based learning is being
practiced, the significance of diverse individual learning styles is taken for granted by
teachers and administrators (Caine & Caine, 1997). In these classrooms and schools,
learning is facilitated for each individual student’s purposes and meaning, and the concept
of learning is approached in a completely different way from the current classrooms that are
set up for sorting and standardization.
A Vision of a Learner-Centered School
Imagine a school with no grade levels. Instead, each student strives
to master and check off their attainments in a personal “inventory of attainments”
(Reigeluth, 1994) that details the individual student’s progress through the district’s
required and optional learning standards, kind of like merit badges in Scouting. Each
student has different levels of progress in every attainment, according to his or her
interests, talents, and pace. The student moves to the next topic as soon as she or he
masters the current one. While each student must reach mastery level before moving on,
students also do not need to wait for others who are not yet at that level of learning. In
essence, today’s schools hold time constant, and student learning is thereby forced to vary.
In this new paradigm of the learner-centered school, it is the pace (learning time) that
varies rather than student learning. All students work at their own maximum pace to reach
mastery in each attainment. This individualized, customized, and self-paced learning process
allows the school district to realize high standards for its students.
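The mechanics of such an inventory can be sketched in a few lines of code. This is a hypothetical illustration only; the chapter describes an educational practice, not software, and all names below are invented:

```python
# Hypothetical sketch of an "inventory of attainments": each student
# progresses through standards at their own pace, moving on only after
# reaching mastery of the current one.
class AttainmentInventory:
    def __init__(self, standards):
        # Every standard starts unmastered; order gives the default sequence.
        self.standards = list(standards)
        self.mastered = set()

    def record_mastery(self, standard):
        """Check off a standard once the student demonstrates mastery."""
        if standard not in self.standards:
            raise ValueError(f"unknown standard: {standard}")
        self.mastered.add(standard)

    def next_standard(self):
        """The student moves on as soon as the current standard is mastered."""
        for standard in self.standards:
            if standard not in self.mastered:
                return standard
        return None  # all attainments mastered

# Example: the student advances to "decimals" the moment "fractions" is
# mastered, without waiting for the rest of the class.
inventory = AttainmentInventory(["fractions", "decimals", "ratios"])
inventory.record_mastery("fractions")
```

The point of the sketch is the gating logic: mastery is held constant and time varies, which is the reverse of the current paradigm.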
The teacher takes on a drastically different role in the learning process. She or he is a guide
or facilitator who works with the student for at least four years, building a long-term, caring
relationship (Reigeluth, 1994). The teacher’s role is to help the student and parents to
decide upon appropriate learning goals and to help identify and facilitate the best way for
the student to achieve those goals—and for the parents to support their student. Therefore,
each student has a personal learning plan in the form of a contract that is jointly developed
every two months by the student, parents, and teacher.
This system enhances motivation by placing greater responsibility and ownership on the
students, and by offering truly engaging, often collaborative work for students (Schlechty,
2002). Teachers help students to direct their own learning through the contract
development process and through facilitating real-world, independent or small-group
projects that focus on developing the contracted attainments. Students learn to set and
meet deadlines. The older the students get, the more leadership and assisting of younger
students they assume.
The community also works closely with schools, as the inventory of attainments includes
standards in service learning, career development, character development, interpersonal
skills, emotional development, technology skills, cultural awareness, and much more. Tasks
that are vehicles for such learning are authentic tasks, often in real community
environments that are rich for learning (Reigeluth, 1994). Most learning is interdisciplinary,
drawing from both specific and general knowledge and interpersonal and decision-making
skills. Much of the focus is on developing deep understandings and higher-order thinking
skills.
Teachers assess students’ learning progress through various methods, such as computer-
based assessment embedded in simulations, observation of student performances, and
analysis of student products of various kinds. Instead of grades, students receive ratings of
“emerging,” “developing,” “proficient” (the minimum required to pass), or “expert.”
Each teacher has a cadre of students with whom she or he works for several years—a
developmental stage of their lives. The teacher works with 3–10 other teachers in a small
learning community (SLC) in which the learners are multi-aged and get to know each other
well. Students get to choose which teacher they want (stating their first, second, and third
choice), and teacher bonuses are based on the amount of demand for them. Each SLC has
its own budget, based mainly on the number of students it has, and makes all its own
decisions about hiring and firing of its staff, including its principal (or lead teacher). Each
SLC also has a school board made up of teachers and parents who are elected by their
peers.
Conclusion
Our society needs learner-centered schools that focus on learning rather than on sorting
(McCombs & Whisler, 1997; Reigeluth, 1997; Senge et al., 2000; Toffler, 1984). New
approaches to instruction and education have increasingly been advocated to meet the
needs of all learners, and a large amount of research has been conducted to advance our
understanding of learning and how the educational system can be changed to better support
it (Alexander & Murphy, 1993; McCombs & Whisler, 1997; Reigeluth, 1997; Senge et al.,
2000). Nevertheless, transforming school culture and structure is not an easy task.
Isolated reforms, typically at the classroom and school levels, have been attempted over the
past several decades, and their impact on the school system has been negligible. Teachers,
administrators, parents, policy-makers, students, and all other stakeholder groups must
work together, as no single group can change such a complex culture and system alone. To
transform our schools to be truly learner-centered, a critical systems approach to
transformation is essential.
The first article in this series (Reigeluth & Duffy, 2008) described the FutureMinds
approach for state education departments to support this kind of change in their school
districts. The second article (Duffy & Reigeluth, 2008b) described the School System
Transformation (SST) Protocol, a synthesis of current knowledge about how to help school
districts use a critical systems approach to transform themselves to the learner-centered
paradigm of education. Hopefully, with state leadership through FutureMinds, the critical
systems approach to educational change in the SST Protocol, and the new knowledge about
learner-centered instruction, we will be able to create a better place for our children to
learn and grow. However, this task will not be easy. One essential ingredient for it to
succeed is the availability of powerful tools to help teachers and students in the learner-
centered paradigm. The fourth article in this series will address this need.
Application Exercises
          Review the authors’ theoretical learner-centered school. What do you see as
          the strengths of this format? What are its weaknesses?
          The authors of this article suggest giving students authentic tasks in the
          community to help them achieve their academic goals. What authentic,
          community project would you have designed for yourself as a high school
          student? Now?
          Do a little bit of research and share what tools are available to aid instructors
          in becoming more learner-centered. What limitations do these tools have? What
          do they do well? What factors of the learner environment must change to
          make these tools more effective?
          How would you design a learner-centered school that differs from the version
          discussed in this article?
References
Ackoff, R. L. (1981). Creating the corporate future. New York: John Wiley & Sons.
Alexander, P. A., & Murphy, P. K. (1993). The research base for APA’s learner-centered
psychological principles. In B. L. McCombs (Chair), Taking research on learning seriously:
Implications for teacher education. Invited symposium at the Annual Meeting of the
American Psychological Association, New Orleans.
Banathy, B. H. (1992). A systems view of education: Concepts and principles for effective
practice. Englewood Cliffs, NJ: Educational Technology Publications.
Battino, W., & Clem, J. (2006). Systemic changes in the Chugach School District.
TechTrends, 50(2), 51–52.
Bransford, J., Brown, A., & Cocking, R. (Eds.). (1999). How people learn: Brain, mind,
experience, and school. Washington, DC: National Academy Press.
Caine, R. N. (2005). 12 brain/mind learning principles in action: The fieldbook for making
connections, teaching, and the human brain. Thousand Oaks, CA: Corwin Press.
Caine, R. N., & Caine, G. (1997). Education on the edge of possibility. Alexandria, VA:
Association for Supervision & Curriculum Development.
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper
& Row.
Duffy, F. M., & Reigeluth, C. M. (2008). The school system transformation (SST) protocol.
Educational Technology, 48(4), 41–49.
Duval, E., Hodgins, W., Rehak, D., & Robson, R. (2004). Learning objects symposium special
issue guest editorial. Journal of Educational Multimedia and Hypermedia, 13(4), 331.
Flood, R. L. (1990). Liberating systems theory: Toward critical systems thinking. Human
Relations, 43(1), 49–75.
Flood, R. L., & Jackson M. C. (1991). Creative problem solving: Total systems intervention.
New York: John Wiley and Sons.
Habermas, J. (1973). Theory and practice (J. Viertel, Trans.). Boston: Beacon Press.
Habermas, J. (1984). The theory of communicative action: Reason and the rationalization of
society (T. McCarthy, Trans.). Boston: Beacon Press.
Habermas, J. (1987). The theory of communicative action: Lifeworld and system: A critique
of functional reason (T. McCarthy, Trans.). Boston: Beacon Press.
Hammer, M., & Champy, J. (1993). Reengineering the corporation: A manifesto for business
revolution. New York: Harper Business.
Hammer, M., & Champy, J. (2003). Reengineering the corporation: A manifesto for business
revolution (1st Harper Business Essentials pbk. ed.). New York: Harper Business Essentials.
Hannum, W. H., & McCombs, B. L. (2008). Enhancing distance learning for today’s youth
with learner-centered principles. Educational Technology, 48(3), 11–21.
Jackson, M. C. (1985). Social systems theory and practice: The need for a critical approach.
International Journal of General Systems, 10(3), 135–151.
Jackson, M. C. (1991a). The origins and nature of critical systems thinking. Systems
Practice, 4(2), 131–149.
Jenlink, P. M., Reigeluth, C. M., Carr, A. A., & Nelson, L. M. (1996). An expedition for
change. TechTrends, 41(1), 21–30.
Jenlink, P. M., Reigeluth, C. M., Carr, A. A., & Nelson, L. M. (1998). Guidelines for
facilitating systemic change in school districts. Systems Research and Behavioral Science,
15(3), 217–233.
Keefe, W., & Jenkins, J. (2002). A special section on personalized instruction. Phi Delta
Kappan, 83(6), 440–448.
Lambert, N. M., & McCombs, B. (Eds.). (1998). How students learn: Reforming schools
through learner-centered education. Washington, DC: American Psychological Association.
McCombs, B., & Whisler, J. (1997). The learner-centered classroom and school. San
Francisco: Jossey-Bass.
Reigeluth, C. M., & Duffy, F. M. (2008). The AECT FutureMinds initiative: Transforming
America’s school systems. Educational Technology, 48(3), 45–49.
Schlechty, P. C. (1997). Inventing better schools: An action plan for educational reform. San
Francisco: Jossey-Bass.
Schlechty, P. C. (2002). Working on the work: An action plan for teachers, principals, and
superintendents. San Francisco: Jossey-Bass.
Senge, P., Cambron-McCabe, N., Lucas, T., Smith, B., Dutton, J., & Kleiner, A. (2000).
Schools that learn: A fifth discipline fieldbook for educators, parents, and everyone who
cares about education. Toronto: Currency.
Sternberg, R., Torff, B., & Grigorenko, E. (1998). Teaching triarchically improves student
achievement. Journal of Educational Psychology, 90(3), 374–384.
Vygotsky, L. (1986). Thought and language (A. Kozulin, Trans.). Cambridge, MA: MIT Press
(original work published 1926).
Watson, S. L., Watson, W., & Reigeluth, C. M. (2008). Systems design for change in
education and training. In J. M. Spector, M. D. Merrill, J. van Merriënboer, & M. P.
Driscoll (Eds.), Handbook of research for educational communications and technology (3rd
ed.). New York: Routledge/Lawrence Erlbaum Associates.
                        Sunnie Lee Watson
Dr. Sunnie Lee Watson teaches and conducts scholarly work on the learner-centered
paradigm of education. Her areas of research focus on
attitudinal learning and mindset change for social justice in both formal and
informal educational settings, learner-centered online instruction and innovative
educational technologies, and critical systems thinking for educational change.
She is currently a faculty member at Purdue University. (e-mail:
sunnieleewatson@purdue.edu).
                      Charles M. Reigeluth
Distance Learning
Use of online and blended learning continues to grow in higher education. As of 2015,
approximately 70% of degree-granting institutions had some online offerings (Allen &
Seaman, 2015). Research in online learning has been conducted at micro and macro levels.
Micro level research has been conducted at the course or individual case study level,
investigating variables such as effective instructional strategies or demographic profiles of
successful learners in these environments. Macro level research has been conducted at the
national or global levels, investigating access to education via free online courses such as
Massive Open Online Courses, otherwise known as MOOCs, and examining global
standards for online learning.
This chapter explores several research trends in order to assess the state of online learning
and identify opportunities for future research. To frame these trends, definitions are
presented first, followed by a summary of the quality standards for online learning courses
and programs developed by professional organizations. Student,
faculty, and administrator perceptions of online learning are reviewed in addition to best
practices in design and implementation in online learning. Best practices regarding faculty
and learner support are also discussed. Finally, the chapter concludes with a list of
academic journals dedicated to online learning research, and a review of trends in online
learning to watch.
Distance education and online learning are terms that are often used interchangeably.
However, online learning and its components are encompassed within distance education,
which contains two components that are not representative of online learning:
correspondence courses and satellite campuses. Figure 1 is a visual representation of the
delivery methods of distance education.
Daniel and Uvalic-Trumbic (2013), in their review of quality online learning standards, list
institutional support (vision, planning, and infrastructure), course development, teaching
and learning (instruction), course structure, student support, faculty support, technology,
evaluation, student assessment, and examination security as elements essential for quality
online learning. They add that the most essential requirement for assuring quality online
learning in higher education is institutional vision, commitment, leadership, and sound
planning.
Martin, Polly, Jokiaho, and May (2017), in reviewing twelve different global standards for
online learning, found that the number of standards in these documents varied from 17 to
184 (Table 2). Instructional analysis, design, and development (N=164); student attributes,
support, and satisfaction (N=115); and institutional mission, structure, and support (N=102)
were the top categories. Course facilitation, implementation, and dissemination (N=40);
policies and planning (N=33); and faculty support and satisfaction (N=27) were the lowest
three.
Table 2. Standard Details (Name, Year, Sponsor, Number of Sections, and Number of
Standards). Used with permission from Martin, Polly, Jokiaho, & May (2017).

Standard Name | Year | Sponsor | Sections | Standards
Quality on the Line: Benchmarks for Success in Internet-Based Distance Education | 2000 | Institute for Higher Ed Policy, supported by NEA and Blackboard | 7 | 24
Open eQuality Learning Standards (Canada), http://www.eife-l.org/publications/quality/oeqls/intro | 2004 | Canada | 4 | 25
Online Learning Consortium (formerly Sloan-C) Quality Scorecard | 2005 | OLC Consortium | 8 | 75
Blackboard Exemplary Rubric | 2000 | Blackboard | 4 | 17
Quality Matters | 2015, 5th edition | Quality Matters | 8 | 45
CHEA Institute for Research and Study of Accreditation and Quality Assurance | 2002, revision 1 | Council for Higher Education Accreditation | 7 | 7
NADEOSA (South Africa) | 2005 revision of 1996 document | | 13 | 184
ACODE (The Australasian Council on Open, Distance and e-learning) | 2014 | Australasian Council on Open, Distance and e-learning | 8 | 64
AAOU (Asian Association of Open Universities) | no date | Asian Association of Open Universities | 10 | 54
ECBCheck | 2012 | | 13 | 46
UNIQUe | 2011 | | 10 | 71
International Organization for Standardization (ISO) | 2005 | | 7 | 38
These analyses of quality standards and frameworks over time point to similar conclusions:
institutional factors such as vision, support, and planning are important indicators of
quality online learning.
Student Perception
Table 3 summarizes the key perceptions of students on online learning, including benefits
and challenges.
Faculty Perception
Table 4 summarizes the key perceptions of faculty on online learning, including benefits and
challenges.
Several barriers to online learning from the student perspective have been identified: (1)
administrative issues, (2) social interaction, (3) academic skills, (4) technical
skills, (5) learner motivation, (6) time and support, (7) cost and internet access, and (8)
technical problems. Research in online course design and implementation has tried to
address these issues. One example is the development and research of the Community of
Inquiry framework (Garrison, Anderson, & Archer, 1999) which provides guidelines for
faculty and designers to create meaningful interactive learning experiences that increase
the level of social interaction.
Course Design
Instructors may have various levels of control over the design of the course structure,
depending on organizational philosophies. Lee, Dickerson, and Winslow (2012) defined
three approaches to faculty control of course structure: fully autonomous, basic guidelines,
and highly specified. When faculty have less control of their course design, the courses are
designed by the institution with instructors serving more as facilitators. Regardless of the
amount of faculty control, there are basic elements to course structure that research has
shown to be effective such as a having a consistent course structure throughout the course
(Swan, 2001).
Gamification and the use of games, virtual worlds, and simulations have also gained traction
in online learning research. Gamification is defined as the application of game design
elements, such as digital badges, in non-game contexts. Hamari et al. (2014) conducted a
literature review of gamification studies and found that gamification can have positive
effects, but those effects depend on the context in which the strategies are implemented
and on the audience.
Assessment affects how learners approach learning and the content as well as how learners
engage with one another and the instructor (Kolomitro & MacKenzie, 2017). Students
access course content based upon the belief that the course will help them learn and have
better outcomes (Murray, Perez, Geist, & Hedrick, 2012). Therefore, the design of online
assessments should promote active learning and ensure that success depends on retaining
course content. Martin and Ndoye (2016) examined learner-centered assessment in online
learning and how instructors can use learning analytics to improve the design and delivery
of instruction to make it more meaningful. They demonstrated several data analytic
techniques that instructors can apply to provide feedback to students and to make informed,
data-driven decisions during instruction rather than after it. Applying such techniques can
increase retention of online students.
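The idea of acting on analytics during instruction rather than after it can be sketched with a simple rule: compare each student's running assessment average against the class average and flag students trending well below it for early feedback. This is a hypothetical illustration, not one of Martin and Ndoye's actual techniques; the data shape and the 0.8 threshold are assumptions.

```python
# Hypothetical sketch of a mid-course learning-analytics check:
# flag students whose running quiz average falls well below the class
# average so the instructor can intervene before the course ends.
# The data shape and the 0.8 threshold are assumptions for illustration.

from statistics import mean

def flag_at_risk(scores_by_student, threshold=0.8):
    """Return students whose average score is below threshold * class average."""
    averages = {s: mean(scores) for s, scores in scores_by_student.items()}
    class_avg = mean(averages.values())
    return sorted(s for s, avg in averages.items() if avg < threshold * class_avg)

scores = {
    "ana":   [0.9, 0.85, 0.95],
    "ben":   [0.6, 0.5, 0.55],
    "carla": [0.8, 0.75, 0.9],
}
print(flag_at_risk(scores))  # ['ben']
```

Run midway through a course, a check like this turns assessment data into a prompt for feedback while there is still time to act on it, which is the "during instruction" point the paragraph above makes.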
Transactional distance theory describes the feeling of isolation or psychological distance
that online learners often experience (Moore, 1989). To lessen transactional distance, Moore
defined three types of interaction to guide faculty in creating quality distance education
experiences: (a) learner-to-learner, (b) learner-to-instructor, and (c) learner-to-content. Bernard
et al. (2009) conducted a meta-analysis on 74 distance education studies on the effects of
Moore’s three types of interaction and found support for their importance for achievement.
The Community of Inquiry framework built upon these types of interaction and defined a
quality education experience for an online learner in terms of three overlapping presences:
cognitive, social, and teaching (Garrison, Anderson, & Archer, 1999). However, the
Community of Inquiry framework’s ability to create deep and meaningful learning
experiences has come into question because much of the research used self-reporting,
achievement, and perception measures (Rourke & Kanuka, 2009; Annand, 2011).
Another research lens used to address online learner isolation is learner engagement.
Engagement matters in any learning environment, but it is especially important online,
where learners have fewer chances to interact with each other, the instructor, and the
institution. Chickering and Gamson (1987) proposed a framework of seven principles for
good practice, which Graham, Cagiltay, Lim, Craner, and Duffy (2001) adapted for
evaluating online courses:
Chickering and Gamson (1987) | Graham, Cagiltay, Lim, Craner, & Duffy (2001)
Increases the contact between student and faculty | Provides clear interaction expectations
Provides opportunities for students to work in cooperation | Facilitates meaningful cooperation through well-designed assignments
Encourages students to use active learning strategies | Requires course project presentations
Provides timely feedback on students’ academic progression | Provides information and acknowledgment feedback
Requires students to spend quality time on academic tasks | Uses deadlines and milestones to keep students on track
Communicates high expectations | Creates challenging tasks and case studies, and communicates positive feedback for quality work
Addresses different learner needs in the learning process | Allows students to choose topics for assessments in order to incorporate diverse views
More recently, Dixon (2010) created and validated a scale to measure online learner
engagement. The instrument was used to survey 186 online learners from six different
campuses. Results suggested that offering multiple, meaningful channels of interaction may
lead to higher learner engagement, though more research is needed to validate these
results.
Course Implementation
Muilenburg and Berge (2007) identified several issues related to online learning
implementation from the student perspective, including course materials that are not always
delivered on time, instructors not knowing how to teach online, lack of timely feedback, and
lack of access to the instructor. Three of these deal specifically with instructor immediacy or
responsiveness. Bodie and Michel (2014) conducted an experimental study manipulating
immediacy strategies for 576 participants in an introductory psychology course. Results
revealed that learners in the high immediacy group showed greater learning gains and
retention. Martin, Wang, and Sadaf (2017) investigated the effects of 12 different facilitation
strategies on instructor presence, connection, learning, and engagement. They found that
students perceived instructors’ timely responses to questions and timely feedback on
assignments as helpful. Instructors’ use of video also helped build a sense of
connection with the instructor. Timeliness and immediacy are common themes in the
research. Again, more experimental research should be conducted to identify specific
strategies for faculty.
In addition, Oncu and Cakir (2011) identified four main research goals for course design
and implementation to address achievement, engagement, and retention issues in online
learning: (1) learner engagement and collaboration, (2) effective facilitation, (3) assessment
techniques, and (4) designing faculty development. They further
recommended that experimental research be conducted to identify effective practices in
these areas. Thus, there are many frameworks and principles for effective design and
implementation of online learning, but there is still a lack of research validating many of
these ideas or providing effective cases.
Faculty Support
Many universities that offer online courses provide course planning and development
support and technology support to their faculty, along with institutional support.
Online teaching can be very demanding on faculty. One study found that online teaching
demanded 14% more time than traditional teaching, with the workload fluctuating
considerably during times of advising and assessment (Tomei, 2006). With the spread of online teaching
practices in higher education, many academic staff are faced with technological and
pedagogical demands that require skills they don’t necessarily possess (Weaver, Robbie, &
Borland, 2008). The quality of online programs depends upon the pedagogical practices of
online teachers; therefore, faculty support in online programs is very important (Baran &
Correia, 2014).
Some believe that the success of online teaching depends upon the support of faculty on
three main levels: teaching, community, and organization (Baran & Correia, 2014). The
teaching level includes assistance with technology, pedagogy, and content through
workshops, training programs, and one-on-one assistance. The challenge here is often the
fact that academic staff find it hard to adapt to changes in their teaching or to allow
someone else to tell them how to teach. Therefore individuals who design online programs
need to first establish themselves as experts and to be viewed as such by faculty (Weaver,
Robbie & Borland, 2008).
The community level includes collegial learning groups, peer support programs, peer
observation, peer evaluation, and mentoring programs. Some have highlighted the
importance of creating a supportive community for online instructors who often feel isolated
(Eib & Miller, 2006). Building learning communities and communities of practice for online
teachers, as well as providing opportunities for students and online faculty to interact, helps
combat feelings of isolation (Eib & Miller, 2006; Top, 2012).
The institutional level of support consists of rewards and recognition and the promotion of a
positive organizational culture towards online education (Baran & Correia, 2014, p. 97).
Institutional support is seen as supremely important (Baran & Correia, 2014; Weaver,
Robbie & Borland, 2008). On one hand, if the deans and department heads do not support
online teaching, faculty members who do may feel marginalized, unsupported within their
discipline, and isolated. On the other hand, if upper management adopts online teaching and
pushes for too many changes too quickly, planned implementation and adequate training
can be grossly neglected, resulting in dissatisfaction among academic staff (Weaver, Robbie
& Borland, 2008).
Learner Support
Community building in online classes has received more attention in recent years. Social
presence refers to “the strength of the social relationships and emotional connection among
the members of a class or learning community” (Rubin, 2013, p. 119). On an individual level,
social presence refers to how involved and engaged each individual student is in the
community, and his or her motivation and drive to share, interact, and learn from others. On
a community level, social presence refers to the shared sense of belonging of the students in
the classroom. Teachers can influence social presence by designing group assignments,
creating discussion forums, rewarding community building behaviors and modeling
openness and sharing (Rubin, 2013). Teacher presence refers to designing learning
experiences, guiding and leading students’ work, providing feedback, and facilitating
interaction and community building (Rubin, 2013).
Within the context of learner support, providing accommodations and support for students
with disabilities is also an important consideration in online education. In particular, for
students with cognitive impairments, navigating an online course can be particularly
challenging, as existing platforms typically do not support such learners (Grabinger, Aplin &
Ponnappa-Brenner, 2008).
Additional Resources
Table 7. Journals focusing on Online Learning
Application Exercises
References
Bodie, L. W., & Michel, M. B. (2014, June). An experimental study of instructor immediacy
and cognitive learning in an online classroom. In Intelligent Environments (IE), 2014
International Conference on (pp. 265-272). IEEE.
Bolliger, D. U., & Wasilik, O. (2009). Factors influencing faculty satisfaction with online
teaching and learning in higher education. Distance Education, 30(1), 103-116.
doi:10.1080/01587910902845949
Borokhovski, E., Tamim, R., Bernard, R. M., Abrami, P. C., & Sokolovskaya, A. (2012). Are
contextual and designed student–student interaction treatments equally effective in distance
education? Distance Education, 33(3), 311-329.
Clark, D. B., Tanner-Smith, E. E., & Killingsworth, S. S. (2016). Digital games, design, and
learning: A systematic review and meta-analysis. Review of Educational Research, 86(1),
79-122.
Eib, B. J., & Miller, P. (2006). Faculty development as community building: An approach to
professional development that supports communities of practice for online teaching.
International Review of Research in Open and Distance Learning, 7(2).
Friedman, J. (2017). Five Online Education Trends to Watch in 2017. Retrieved online from
https://edtechbooks.org/-IB
Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based
environment: Computer conferencing in higher education. The internet and Higher
Education, 2(2), 87-105.
Gaytan, J. (2015). Comparing faculty and student perceptions regarding factors that affect
student retention in online education. American Journal of Distance Education, 29(1), 56-66.
Grabinger, R. S., Aplin, C., & Ponnappa-Brenner, G. (2008). Supporting learners with
cognitive impairments in online environments. TechTrends, 52(1), 63-69.
Graham, C., Cagiltay, K., Lim, B. R., Craner, J., & Duffy, T. M. (2001). Seven principles of
effective teaching: A practical lens for evaluating online courses. The Technology Source,
30(5), 50.
Hamari, J., Koivisto, J., & Sarsa, H. (2014, January). Does gamification work? A literature
review of empirical studies on gamification. In System Sciences (HICSS), 2014 47th Hawaii
International Conference on (pp. 3025-3034). IEEE.
Hiltz, S. R., Shea, P., & Kim, E. (2007). Using focus groups to study ALN faculty motivation.
Journal of Asynchronous Learning Networks, 11(1), 110-124.
Hunt, D., Davis, K., Richardson, D., Hammock, G., Akins, M., & Russ, L. (2014). It is (more)
about the students: Faculty motivations and concerns regarding teaching online. Online
Journal of Distance Learning Administration, 17(2).
Ice, P., Curtis, R., Phillips, P., & Wells, J. (2007). Using Asynchronous Audio Feedback to
Enhance Teaching Presence and Students’ Sense of Community. Journal of Asynchronous
Learning Networks, 11(2), 3-25.
Kolomitro, K., & MacKenzie, L. W. (2017). Using Assessment to Promote Deep and Active
Learning in an Online Anatomy Course. The FASEB Journal, 31(1 Supplement), 584-2.
Lapointe, L., & Reisette, M. (2008). Belonging online: Students’ perceptions of the value and
efficacy of an online learning community. International Journal on E-Learning, 7, 641-665.
http://editlib.org/p/24419
Leasure, A., Davis, L., & Thievon, S. (2000). Comparison of student outcomes and
preferences in a traditional vs. world wide web-based baccalaureate nursing research
course. Journal of Nursing Education, 39, 149-154
Lee, C. Y., Dickerson, J., & Winslow, J. (2012). An analysis of organizational approaches to
online course structures. Online Journal of Distance Learning Administration, 15(1), n1.
Lieblein, E. (2000). Critical factors for successful delivery of online programs. Internet and
Higher Education, 3(3), 161-174.
Lister, M. (2014). Trends in the design of e-learning and online learning. Journal of Online
Learning and Teaching, 10(4), 671.
Lunt, T., & Curran, J. (2010). ‘Are you listening please?’ The advantages of electronic audio
feedback compared to written feedback. Assessment & Evaluation in Higher Education,
35(7), 759-769.
Mandernach, B. J., Hudson, S., & Wise, S. (2013). Where has the time gone? Faculty
activities and time commitments in the online classroom. Journal of Educators Online, 10(2),
n2.
                                              486
               Foundations of Learning and Instructional Design Technology
Martin, F. & Parker, M.A. (2014). Use of Synchronous Virtual Classrooms: Why, Who and
How? MERLOT Journal of Online Learning and Teaching, 10(2), 192-210.
Martin, F., Parker, M. A., & Deale, D. (2012). Examining the interactivity of synchronous
virtual classrooms. The International Review of Research in Open and Distance Learning,
13(3), 227-260
Martin, F., Polly, D., Jokiaho, A., & May, B. (accepted). Global standards for enhancing
quality in online learning. Quarterly Review of Distance Education.
Mayes, R., Luebeck, J., Ku, H. Y., Akarasriworn, C., & Korkmaz, Ö. (2011). Themes and
strategies for transformative online instruction: A review of literature and practice.
Quarterly Review of Distance Education, 12(3), 151.
Merchant, Z., Goetz, E. T., Cifuentes, L., Keeney-Kennicutt, W., & Davis, T. J. (2014).
Effectiveness of virtual reality-based instruction on students’ learning outcomes in K-12 and
higher education: A meta-analysis. Computers & Education, 70, 29-40.
Merry, S., & Orsmond, P. (2007, June). Students’ responses to academic feedback provided
via mp3 audio files. Science Learning and Teaching Conference (pp. 19-20).
Meyer, K. A. (2003). Face-to-face versus threaded discussions: The role of time and higher
order thinking. Journal of Asynchronous Learning Networks. 7, 55-65.
Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor
analytic study. Distance education, 26(1), 29-48.
Murray, M., Perez, J., Geist, D. & Hedrick, A. (2012). Student interaction with online course
content: Build it and they might come. Journal of Information Technology Education, 11,
125-139.
Oncu, S., & Cakir, H. (2011). Research in online learning environments: Priorities and
methodologies. Computers & Education, 57(1), 1098-1108.
Orr, R., Williams, M. R., & Pennington, K. (2009). Institutional efforts to support faculty in
online teaching. Innovative Higher Education, 34(4), 257.
Rockwell, S. K., Schauer, J., Fritz, S., & Marx, D. B. (1999). Incentives and obstacles
influencing higher education faculty and administrators to teach via distance. Faculty
                                              487
               Foundations of Learning and Instructional Design Technology
Rubin, B. (2013). Measuring the community in online classes. Online Learning, 17(3).
Schrum, L. (2002). Oh, What Wonders You Will See–Distance Education Past, Present, and
Future. Learning & Leading with Technology.
Schwartzman, R. (2007). Refining the question: How can online instruction maximize
opportunities for all students? Communication Education, 56, 113-117.
doi:10.1080/03634520601009728
Shelton, K. (2011). A review of paradigms for evaluating the quality of online education
programs. Online Journal of Distance Learning Administration, 14(1).
Smith, D.F. (2014). 10 Online Learning Trends to Watch in 2015. Retrieved online from
https://edtechbooks.org/-gG
Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and
perceived learning in asynchronous online courses. Distance Education, 22(2), 306-331.
Tomei, L. A. (2006). The impact of online teaching on faculty load: Computing the ideal class
size for online courses. Journal of Technology and Teacher Education, 14(3), 531-541.
Volery, T., & Lord, D. (2000). Critical success factors in online education. International
Journal of Educational Management, 14(5), 216-223.
Weaver, D., Robbie, D., & Borland, R. (2008). The practitioner’s model: Designing a
professional development program for online teaching. International Journal on ELearning,
7(4), 759-774.
          Foundations of Learning and Instructional Design Technology
                           Florence Martin
                             Beth Oyarzun
                                       37
David Wiley
Editor’s Note
The following was submitted by David Wiley as a good introduction to his thoughts
on open educational resources and is a preprint of an essay set to appear in Bonk,
Lee, Reeves, and Reynolds’s book, MOOCs and Open Education around the World.
It may have undergone additional editing before publication. This essay remixes
some material that was previously published on Wiley’s website, opencontent.org
[http://opencontent.org/] and is available at https://edtechbooks.org/-dB.
Wiley, D. (2014, September 18). The MOOC misstep and the open education
infrastructure [Web log post]. Retrieved from https://edtechbooks.org/-Dm
TEDx Talk
    For additional learning from Wiley about open educational resources and their
    relevance to education, see his TEDx Talk.
In this piece I briefly explore the damage done to the idea of “open” by MOOCs, advocate
for a return to a strengthened idea of “open,” and describe an open education infrastructure
on which the future of educational innovation depends.
Historical Context
The openness of the Open University of the UK, first established in 1969 and admitting its
first student in 1971, was an incredible innovation in its time. In this context, the adjective
“open” described an enlightened policy of allowing essentially anyone to enroll in courses at
the university—regardless of their prior academic achievement. For universities, which are
typically characterized in metaphor as composed of towers, silos, and walled gardens,
this opening of the gates to anyone and everyone represented an unprecedented leap
forward in the history of higher education. For decades, “open” in the context of education
primarily meant “open entry.”
While there are dozens of universities around the world that have adopted an open entry
policy, in the decade from 2001–2010 open education was dominated by individuals,
organizations, and schools pursuing the idea of open in terms of open licensing. Hundreds of
universities around the globe maintain opencourseware programs. The open access
movement, which found its voice in the 2002 Budapest Open Access initiative, works to
apply open licenses to scholarly articles and other research outputs. Core learning
technology infrastructure, including Learning Management Systems, Financial Management
Systems, and Student Information Systems are created and published under open licenses
(e.g., Canvas, Moodle, Sakai, Kuali). Individuals have begun contributing significantly to the
growing collection of openly licensed educational materials, like Sal Khan who founded the
Khan Academy. Organizations like the William and Flora Hewlett Foundation are pouring
hundreds of millions of dollars into supporting an idea of open education grounded in the
idea of open licensing. In fact, the Hewlett Foundation’s definition of “open educational
resources” is the most widely cited:
      OER are teaching, learning, and research resources that reside in the public
      domain or have been released under an intellectual property license that
      permits their free use and re-purposing by others. Open educational resources
      include full courses, course materials, modules, textbooks, streaming videos,
      tests, software, and any other tools, materials, or techniques used to support
      access to knowledge (Hewlett, 2014).
According to Creative Commons (2014), there were over 400 million openly licensed
creative works published online as of 2010, and many of these can be used in support of
learning.
Imagine you’re planning to experiment with a new educational model. Now imagine two
ways this experiment could be conducted. In the first model, you pay exorbitant fees to
temporarily license (never own) digital content from Pearson, and you pay equivalent fees to
temporarily license (never own) Blackboard to host and deliver the content. In a second
model, you utilize freely available open educational resources delivered from inside a free,
open source learning management system. The first experiment cannot occur without
raising venture capital or other significant funding. The second experiment can be run with
almost no funding whatsoever. If we wish to democratize innovation, as von Hippel (2005)
has described it, we would do well to support and protect our ability to engage in the second
model of experimentation. Open licenses provide and protect exactly that sort of
experimental space.
Which brings us back to MOOCs. The horrific corruption perpetrated by Udacity, Coursera,
and other copycat MOOCs is to pretend that the last forty years never happened.
Their modus operandi has been to copy and paste the 1969 idea of open entry into online
courses in 2014. The primary fallout of the brief, blindingly brilliant popularity of MOOCs
was to persuade many people that, in the educational context, “open” means open entry to
courses which are not only completely and fully copyrighted, but whose Terms of Use are
more restrictive than those of the BBC or the New York Times. For example, consider this
selection from the Coursera Terms of Use:
     You may not take any Online Course offered by Coursera or use any Statement
     of Accomplishment as part of any tuition-based or for-credit certification or
     program for any college, university, or other academic institution without the
     express written permission from Coursera. Such use of an Online Course or
     Statement of Accomplishment is a violation of these Terms of Use.
The idea that someone, somewhere believes that open education means “open entry to fully
copyrighted courses with draconian terms of use” is beyond tragic. Consequently, now that
MOOCs have reversed a decade of progress, we advocates of open education once again find
ourselves fighting uphill to establish and advance the idea of “open.” The open we envision
provides just as much access to educational opportunity as the 1960s vision championed by
MOOCs, while simultaneously enabling a culture of democratized, permissionless innovation
in education. That stronger idea of open is grounded in five permissions, the 5Rs:
   1. Retain – the right to make, own, and control copies of the work (e.g., download,
      duplicate, store, and manage)
   2. Reuse – the right to use the work in a wide range of ways (e.g., in a class, in a study
      group, on a website, in a video)
   3. Revise – the right to adapt, adjust, modify, or alter the work itself (e.g., translate it
      into another language)
   4. Remix – the right to combine the original or revised work with other open works to
      create something new (e.g., incorporate the work into a mashup)
   5. Redistribute – the right to share copies of the original work, your revisions, or your
      remixes with others (e.g., give a copy of the work to a friend)
These 5R permissions, together with a clear statement that they are provided for free and in
perpetuity, are articulated in many of the Creative Commons licenses. When you download a
video from Khan Academy, some lecture notes from MIT OpenCourseWare, an article from
Wikipedia, or a textbook from OpenStax College—all of which use a Creative Commons
license—you have free and perpetual permission to engage in the 5R activities with those
materials. Because they are published under a Creative Commons license, you don’t need to
call to ask for permission and you don’t need to pay a license fee. You can simply get on with
the business of supporting your students’ learning. Or you can conduct some other kind of
teaching and learning experiment—and you can do it for free, without needing additional
permissions from a brace of copyright holders.
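As a side illustration (not part of Wiley's essay), the relationship between common Creative Commons licenses and the 5R permissions can be sketched as a small lookup table. The license facts below follow the standard CC clauses (the NoDerivatives clause withholds revising and remixing; the NonCommercial clause restricts commercial use but not which 5R activities are permitted), while the data structure and function names are hypothetical:

```python
# Sketch: which of the 5R permissions each common Creative Commons
# license grants. The ND (NoDerivatives) clause withholds "revise" and
# "remix"; all CC licenses permit retaining, reusing, and redistributing
# verbatim copies. NC restricts commercial use, not the 5R activities.
FIVE_RS = ("retain", "reuse", "revise", "remix", "redistribute")

CC_LICENSE_PERMISSIONS = {
    "CC BY":       {"retain", "reuse", "revise", "remix", "redistribute"},
    "CC BY-SA":    {"retain", "reuse", "revise", "remix", "redistribute"},
    "CC BY-NC":    {"retain", "reuse", "revise", "remix", "redistribute"},
    "CC BY-ND":    {"retain", "reuse", "redistribute"},
    "CC BY-NC-ND": {"retain", "reuse", "redistribute"},
}

def grants_all_five_rs(license_name: str) -> bool:
    """Return True if the named license grants every 5R permission."""
    return set(FIVE_RS) <= CC_LICENSE_PERMISSIONS.get(license_name, set())
```

For instance, `grants_all_five_rs("CC BY-ND")` returns `False`, because the ND clause withholds the revise and remix permissions.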
How would a change in the operational definition of “open” affect the large MOOC
providers? If MOOC providers changed from “open means open entry” to “open means open
licenses” what would the impact be? Specifically, if the videos, assessment, and other
content in a Coursera or Udacity MOOC were openly licensed would it reduce the “massive”
access that people around the world have to the courses? No. In fact, it would drastically
expand the access enjoyed by people around the world, as learners everywhere would be
free to download, translate, and redistribute the MOOC content. MOOCs could become part
of the innovation conversation.
Despite an incredible lift-off thrust composed of hype and investment, MOOCs have failed to
achieve escape velocity. Weighed down by a strange 1960s-meets-the-Internet philosophy,
MOOCs have started to fall back to earth under the pull of registration requirements, start
dates and end dates, fees charged for credentials, and draconian terms of use. It reminds
me of the old joke, “What do you call a MOOC where you have to register, wait for the start
date in order to begin, get locked out of the class after the end date, have no permission to
copy or reuse the course materials, and have to pay to get a credential?” “An online class.”
Despite all the hyperbole, it has become clear that MOOCs are nothing more than
traditional online courses enhanced by open entry, and not the innovation so many had
hoped for. Worse than that, because of their retrograde approach to “open,” MOOCs are
guaranteed to be left by the wayside as future educational innovation happens because it is
simply too expensive to run a meaningful number of experiments in the MOOC context.
Where will the experiments that define the future of teaching and learning be conducted,
then? Many of them will be conducted on top of what I call the open education
infrastructure.
Content as Infrastructure
The Wikipedia entry on infrastructure (Wikipedia, 2014) begins:
      The term typically refers to the technical structures that support a society, such
      as roads, bridges, water supply, sewers, electrical grids, telecommunications, and
      so forth, and can be defined as “the physical components of interrelated systems
      providing commodities and services essential to enable, sustain, or enhance
      societal living conditions.” Viewed functionally, infrastructure facilitates the
      production of goods and services.
I can’t imagine a way to conduct a program of education without all four of the following
components: competencies, educational resources, assessments, and credentials.
Not everyone has the time, resources, talent, or inclination to completely recreate
competency maps, textbooks, assessments, and credentialing models for every course they
teach. As in the discussion of permissionless, democratized innovation above, it simply
makes things faster, easier, cheaper, and better for everyone when there is high quality,
openly available infrastructure already deployed that we can remix and experiment upon.
Historically, we have only applied the principle of openness to one of the four components of
the education infrastructure I listed above: educational resources, and I have been arguing
that “content is infrastructure” (Wiley, 2005) for a decade now. More recently, Mozilla has
created and shared an open credentialing infrastructure through their open badges work
(Mozilla, 2014). But little has been done to promote the cause of openness in the areas of
competencies and assessments.
Open Competencies
I think one of the primary reasons competency-based education (CBE) programs have been
so slow to develop in the US – even after the Department of Education made its federal
financial aid policies friendlier to CBE programs – is the terrific amount of work necessary
to develop a solid set of competencies. Again, not everyone has the time or expertise to do
this work. Because it’s so hard, many institutions with CBE programs treat their
competencies like a secret family recipe, hoarding them away and keeping them fully
copyrighted (apparently without experiencing any cognitive dissonance while they promote
the use of OER among their students). This behavior has seriously stymied growth and
innovation in CBE in my view.
If an institution would openly license a complete set of competencies, that would give other
institutions a foundation on which to build new programs, models, and other experiments.
The open competencies could be revised and remixed according to the needs of local
programs, and they can be added to, or subtracted from, to meet those needs as well. This
act of sharing would also give the institution of origin an opportunity to benefit from
remixes, revisions, and new competencies added to their original set by others.
Furthermore, openly licensing more sophisticated sets of competencies provides a public,
transparent, and concrete foundation around which to marshal empirical evidence and build
supported arguments about the scoping and sequencing of what students should learn.
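To make this remixing concrete, one can imagine an openly licensed competency set published as plain structured data that a second institution copies and adapts. Everything below (the program name, field names, and competency statements) is a hypothetical illustration, not a real program:

```python
# Hypothetical sketch of an openly licensed competency set as plain data.
# A second institution "remixes" it by dropping one entry and adding another,
# without asking permission or paying a fee.
original_program = {
    "program": "Introductory Statistics",
    "license": "CC BY",
    "competencies": [
        "Summarize a data set with appropriate descriptive statistics",
        "Construct and interpret a confidence interval for a mean",
        "Conduct and interpret a two-sample hypothesis test",
    ],
}

def remix(program, drop=(), add=()):
    """Return a new competency set derived from an open original,
    leaving the original untouched."""
    kept = [c for c in program["competencies"] if c not in drop]
    return {**program, "competencies": kept + list(add)}

local_program = remix(
    original_program,
    drop={"Conduct and interpret a two-sample hypothesis test"},
    add=["Fit and interpret a simple linear regression model"],
)
```

Because the original is never mutated, the institution of origin can in turn harvest any additions that remixers publish back under the same open license.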
Open competencies are the core of the open education infrastructure because they provide
the context that imbues resources, assessments, and credentials with meaning—from the
perspective of the instructional designer, teacher, or program planner. (They are imbued
with meaning for students through these and additional means.) You don’t know if a given
resource is the “right” resource to use, or if an assessment is giving students an opportunity
to demonstrate the “right” kind of mastery, without the competency as a referent. (For
example, an extremely high quality, high fidelity, interactive chemistry lab simulation is the
“wrong” content if students are supposed to be learning world history.) Likewise, a
credential is essentially meaningless if a third party like an employer cannot refer to the
skill or set of skills its possession supposedly certifies.
Open Assessments
For years, creators of open educational resources have declined to share their assessments
in order to “keep them secure” so that students won’t cheat on exams, quizzes, and
homework. This security mindset has kept assessments out of the open ecosystem.
Because performance assessments are so difficult to cheat on, keeping them secure can be
less of a concern, making it possible for performance assessments to be openly licensed and
publicly shared. Once they are openly licensed, these assessments can be retained, revised,
remixed, reused, and redistributed.
Another way of alleviating concerns around the security of assessment items is to create
openly licensed assessment banks that contain hundreds or thousands of assessments – so
many assessments that cheating becomes more difficult and time consuming than simply
learning.
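The item-bank idea can be sketched in a few lines of Python: if each learner's quiz form is a random sample drawn from a large, openly licensed bank, publishing the whole bank does not compromise any individual exam. The bank contents and function name below are hypothetical:

```python
import random

# Hypothetical sketch: draw an individualized quiz form from a large,
# openly licensed assessment item bank. With thousands of items per
# competency, memorizing the bank is harder than learning the material.
def draw_form(item_bank, n_items, seed=None):
    """Sample n_items distinct items to build one learner's quiz form.

    A fixed seed makes a form reproducible (e.g., for regrading);
    omitting it gives each learner a fresh random draw.
    """
    rng = random.Random(seed)
    return rng.sample(item_bank, n_items)

item_bank = [f"item-{i}" for i in range(1000)]  # stand-in for real items
form_a = draw_form(item_bank, 10, seed=1)
form_b = draw_form(item_bank, 10, seed=2)
```

Because the bank is openly licensed, other programs can also retain, revise, and remix the items themselves, not just draw from them.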
Taken together, these four components form the open education infrastructure, a stack of
open layers (top to bottom):

      Open Credentials
      Open Assessments
      Open Educational Resources
      Open Competencies
This interconnected set of components provides a foundation that will greatly decrease the
time, cost, and complexity of the search for more effective models of education. (It will
provide related benefits for informal learning as well.) From the bottom up, open
competencies provide the overall blueprint and foundation, open educational resources
provide a pathway to mastering the competencies, open assessments provide the
opportunity to demonstrate mastery of the competencies, and open credentials, which point
to both the competency statements and the results of performance assessments, certify to
third parties that learners have in fact mastered the competency in question.
When open licenses are applied up and down the entire stack—creating truly open
credentials, open assessments, open educational resources, and open competencies,
resulting in an open education infrastructure—each part of the stack can be altered,
adapted, improved, customized, and otherwise made to fit local needs without the need to
ask for permission or pay licensing fees. Local actors with local expertise are empowered to
build on top of the infrastructure to solve local problems. Freely.
Creating an open education infrastructure unleashes the talent and passion of people who
want to solve education problems but don’t have time to reinvent the wheel and rediscover
fire in the process.
“Openness facilitates the unexpected.” We can’t possibly imagine all the incredible ways
people and institutions will use the open education infrastructure to make incremental
improvements or deploy novel innovations from out of left field. That’s exactly why we need
to build it, and that’s why we need to commit to a strong conceptualization of open,
grounded firmly in the 5R framework and open licenses.
Application Exercises
          After reading the chapter, share your thoughts on the claim that MOOCs
          have damaged the idea of “open” in education.
          Describe a contribution you could make to open educational resources.
References
Coursera. (2014). Terms of Use. https://edtechbooks.org/-cb
                              David Wiley
Dr. David Wiley is the chief academic officer of Lumen Learning, an organization
offering open educational resources designed to increase student access and
success. Dr. Wiley has founded or co-founded numerous entities, including
Lumen Learning, Mountain Heights Academy (an open high school), and
Degreed. He was named one of the 100 Most Creative People in Business by
Fast Company, currently serves as Education Fellow at Creative Commons, and
leads the Open Education Group in Brigham Young University’s instructional
psychology and technology graduate program. He has been a Shuttleworth
Fellow, served as a Fellow of Internet and Society at Stanford Law School, and
was a Fellow of Social Entrepreneurship at BYU’s Marriott School of
Management.
                                            38
Editor’s Note:
    Rieber, L. P., Smith, L., & Noah, D. (1998). The value of serious play. Educational
    Technology, 38(6), 29–37.
      Two eight-year-old children are building a shopping mall with Legos on a
     Saturday afternoon. One is working on the entrance way and the other is
     working on two of the mall stores. As the model gets more elaborate, they see
     that they will soon run out of blocks if they wish to build the mall according to
     their grand design. They decide to change their strategy and build instead just
     the entrance way, but with doorways to the stores. They decide they can later
     use some old shoe boxes for the stores. They tear apart the stores already built
     and begin building the mall’s entrance way collaboratively with renewed vigor.
     They even go and get some small house plants and put them in the middle as
     “trees.” They continue working for the rest of the afternoon and into the early
     evening. The mother of one of the children calls to say it’s time to come home
     for dinner. A bit aggravated by this interruption, the friend agrees to come back
     tomorrow to help finish the model.
     A multimedia design team is busy developing the company’s latest CD-ROM. The
     team’s two graphic artists, Jean and Pat, have been trying to learn a new 3-D
     graphics application for use on the project. While both have been learning the
     tool separately on their own, they decide to work together after lunch one day.
      Both soon discover that the other has learned some very different things. Both
      decide to work on a clown figure that Jean began earlier in the week. As they try
      to learn all of the tricks of the package, the clown figure starts to look ridiculous
      and both can’t help laughing at the “monster” they have created. However, they
      fail to figure out how to access the animation features of the software. Before
      they know it, it’s almost 7:00 p.m. and they decide to call it a day. Later that
      night at home, Pat makes a breakthrough on the package and e-mails Jean about
      it, describing some key ideas they should discuss the next day. Although it’s
      almost midnight, Pat’s phone rings. It’s Jean. The e-mail note had just arrived
      and it turns out that Jean had been working on the same problem at home as
      well. Both laugh and look forward to seeing what the other has discovered the
      next day.
What do these two situations have in common? At first glance, very little. The first deals
with children entertaining themselves with a favorite toy and the second with highly skilled
professionals working on an expensive project for work. However, one soon sees some
important similarities. Both stories show people engaged—engrossed—in an activity. All are
willing to commit great amounts of time and energy. Indeed, all are unaware of how much
time has transpired, yet none would rather be doing anything else. All go to extraordinary
lengths to get back to the activity. Despite the obvious intense efforts, false starts, and
frustrations, all seem to be greatly enjoying themselves, as evidenced by the fact that no one
is forcing them to spend free time on the activities. The children’s project isn’t intended to
help them on upcoming tests at school, but it would be a mistake to think they are not
learning anything. Likewise the graphic designers are not thinking about being “tested” on
the graphics package and while probably not willing to share the clown graphic with their
boss, they recognize that this “fun experience” is essential to learning the 3-D graphics
software they need to use on the project. Both groups talk about their projects as work, yet
not the kind filled with drudgery and tedium, but the kind of work leading to satisfaction
and a sense of accomplishment. Of course, there is another word that describes the two
groups’ efforts—play.
Yes, play. We have found no better word to describe that special kind of intense learning
experience in which both adults and children voluntarily devote enormous amounts of time,
energy, and commitment and at the same time derive great enjoyment from the experience. We
call this serious play to distinguish it from other interpretations which may have negative
connotations. For example, while most accept the word play to describe many children’s
activities, adults usually bristle at the thought of using it to describe what they do. It is true
that the majority of research on play conducted to date has involved children, and that the
word, if used or interpreted in the wrong way or the wrong context, can seem to cheapen or
degrade a learning experience. We, too, would probably run for the door if a trainer or
instructor started
gushing about playing and having fun. But we argue that the same characteristics of
children’s play also extend well to adults (see Colarusso, 1993; Kerr & Apter, 1991).
The purpose of this article is to propose serious play as a suitable goal or characteristic for
those learning situations demanding creative higher-order thinking and a strong sense of
personal commitment and engagement. Teachers, instructional designers, and trainers
should not shy away from encouraging or expecting play behavior in their students. We go
even further to suggest that those learning environments that conjure up serious play in
children or adults deserve special recognition. They are doing something right, and that
“something” involves a complex set of conditions.
We feel the time is ripe to seriously consider play given the current state of instructional
technology. The field has struggled philosophically over the past two decades, first with the
transition from a behavioral to a cognitive model of learning (Burton, Moore & Magliaro,
1996; Winn & Snyder, 1996), and more recently with reconciling the value and relevance of
constructivist orientations to learning in a field dominated by instructional systems design
(Duffy & Cunningham, 1996; Grabinger, 1996). At the same time, the field has witnessed
remarkable advances in computer technology. The time has come to apply what we know
about learning, motivation, and working cooperatively given the incredible processing
power and social connectivity of computers. We feel that play is an ideal construct for
linking human cognition and educational applications of technology given its rich
interdisciplinary history in fields such as education, psychology, epistemology, sociology,
and anthropology, and its obvious compatibility with interactive computer-based learning
environments, such as microworlds, simulations, and games.
Reflection
    Can you think of an experience you’ve had similar to Jean and Pat’s (described at
    the beginning of the article)? What was that experience? How does your experience
    support (or refute) the claims of the article?
The commonsense tendency to define play as the opposite of work makes it easy to be
skeptical that play is a valid characterization for adult behaviors. However, Blanchard
(1995) describes a simple model of human activity drawn from anthropology that shows a
more accurate relationship between play and work, as illustrated in Figure 1. This model
has two dimensions, pleasurability and purposefulness, with play and work being orthogonal
constructs. The purposeful dimension defines a continuum with work and leisure at opposite
ends. Work has a purposeful goal, whereas leisure does not. Interestingly, Blanchard
contends that the English language does not have a word describing the opposite of play, so
the word “not-play” is used to define opposites on the pleasurability dimension.
The four quadrants of the model encompass the full range of human activities. Quadrant A
(playful work) defines the “holy grail” of occupations—getting paid to do a job that is also
satisfying and rewarding. Quadrant C (not-play work), on the other hand, includes types of
work that are not enjoyable, but are done due to obligations or financial necessity. Quadrant
B (playing at leisure) includes those leisure activities that people devote deliberate effort to,
usually over extended periods of time, such as serious hobbies or avocations. These are
activities in which people grow intellectually, emotionally, or physically, such as gardening,
reading, cycling, or chess. Finally, Quadrant D (not-play leisure) includes those times or
activities, technically defined as “leisure,” when we find ourselves bored, unsatisfied, and
with nothing to do (e.g. sitting in front of the television looking for something interesting to
watch). The model applies readily to the adult world of work and leisure, but also
appropriately describes school settings (for both children and adults) when you consider
school to be a “job.” The goals for work (Quadrants A and C) are external to the individual
whereas the goals for leisure (Quadrants B and D) are internal.
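For readers without access to Figure 1, the four-quadrant model can be sketched in a few lines of Python. This is our own illustrative encoding, not part of Blanchard's original presentation; the function name and parenthetical examples are hypothetical.

```python
# A sketch of Blanchard's (1995) two-dimensional model of human activity.
# "Purposeful" separates work (external goal) from leisure (internal goal);
# "pleasurable" separates play from "not-play".

def classify_activity(purposeful: bool, pleasurable: bool) -> str:
    """Map an activity's two dimensions to its quadrant in Figure 1."""
    quadrants = {
        (True, True): "A: playful work (a job that is also satisfying)",
        (True, False): "C: not-play work (work done out of obligation)",
        (False, True): "B: playing at leisure (serious hobbies, avocations)",
        (False, False): "D: not-play leisure (bored channel surfing)",
    }
    return quadrants[(purposeful, pleasurable)]

# A serious hobby: no external goal, but pleasurable.
print(classify_activity(purposeful=False, pleasurable=True))
```

The two Boolean arguments correspond directly to the model's two dimensions, making clear that play/not-play and work/leisure vary independently rather than being opposites.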
A person who attains maximum pleasurability (in either Quadrant A or B) could also be
described as being in a state of “flow.” Flow theory, developed by Mihaly Csikszentmihalyi
(1979; 1990), derives its name from the way people describe a certain state of happiness
and satisfaction. They are so absorbed that they report being carried by the “flow” of the
activity in an automatic and spontaneous way. Experiencing flow is an everyday occurrence,
though Csikszentmihalyi is careful to point out that attaining flow demands considerable
and deliberate effort and attention. Flow has many qualities and characteristics, the most
notable of which are the following: optimal levels of challenge; feelings of complete control;
attention focused so strongly on the activity that feelings of self-consciousness and
awareness of time disappear. Think to yourself of times that you were so engrossed in an
activity that you were shocked to learn that several hours had passed without your
knowledge. The “work” involved in attaining flow comes from maintaining a balance
between anxiety and challenge. As your experience and skill increase, you look for ways to
increase the challenge, but if you try something beyond your capability you quickly become
anxious. Flow can only be achieved by successfully negotiating and balancing challenge and
anxiety.
Reflection
    Can you think of experiences that you’ve had that fall into each of the four
    quadrants depicted in Figure 1? How often do you engage in each of them? How
    does your participation and behavior differ?
Traditional views of motivation in education usually reduce to two things: the motivation to
participate in a task in the first place and the subsequent choice to persist in it
(Lepper, 1988). Motivation is also usually explained in terms of the extrinsic and intrinsic
reasons for choosing to participate (Facteau, Dobbins, Russell, Ladd & Kudisch, 1995, add a
third—compliance—for training environments). Extrinsic motivators are external to the
person, such as attaining rewards (e.g. pay increases, praise from teachers and parents), or
avoiding negative consequences (e.g. punishment, disapproval, losing one’s job). In contrast,
intrinsic motivators come from within the person, such as personal interest, curiosity, and
satisfaction. Malone’s (1981; Malone & Lepper, 1987) framework of intrinsic motivation is
based on the attributes of challenge, curiosity, fantasy, and control (other notable work in
this area includes, of course, that of John Keller; see Keller, 1983; Keller & Suzuki, 1988). Challenge refers
not only to the level of difficulty but also to performance feedback for the player, and
includes goals, predictability of outcome, and self-esteem. Malone also warns against
designing games where the curiosity factor is sensory and superficial as opposed to games
in which curiosity engages deeper cognitive processes (see research by Rieber & Noah,
1997 for an example).
However, the dichotomy between extrinsic and intrinsic motivation quickly blurs in everyday
situations. An employee who loves his or her job will still, from time to time, rely on the
social and professional obligations of getting up and going to work in the morning. Students
forced to study for an upcoming test may unexpectedly find themselves enjoying the
material. Some extrinsic motivators are perceived as pure rewards or threats (e.g. read 10
books to earn a prize or do your homework every night to avoid a lower grade), but others
may be consistent with one’s goals or values (e.g. a teenager attending mandatory driver
education classes or an adult choosing to enroll in graduate school). Self-determination is
the degree to which one reconciles extrinsic motivators with personal choice (Deci & Ryan,
1985). A high degree of self-determination has been shown to affect the quality of one’s
learning (e.g. Ryan, Connell & Plant, 1990; see review by Rigby, Deci, Patrick & Ryan,
1992). In other words, the intrinsic worth of an activity is often a matter of personal choice
and learning can be enhanced when one looks for and finds personal motives to not only
participate but also to take responsibility for the outcome. [3]
Prescribing motivation in formal educational settings has long been a puzzle for teachers
and instructional designers. Part of the problem is that too many educators consider
motivation in terms of “that which gets someone else to do what we want them to.”
Instructional design models typically treat motivation as an “add-on” feature or concern.
Frequently, designers fall prey to first designing instruction from the point of view of the
subject matter and then ask “How can I make this motivating to the learner?” Instead,
motivation and learning should be considered together from the start. Likewise, serious play
is characterized by intense motivation coupled with goal-directed behavior.
For instructional designers, the task is to somehow blend or “wed” motivation to the
learning process. Fortunately, there is research and theory that describes this “marriage”
between motivation and learning, that of self-regulation (Butler & Winne, 1995; Schunk &
Zimmerman, 1994).
Reconciling play with instructional design requires a very different perspective on the
relationship between curriculum, instruction, a teacher, and the individual learner. The
traditional view that one group of people (instructional designers, trainers, teachers) have
total authority and responsibility to create instructional activities for another group
(students) must be reconsidered. A modified view grants individual learners greater
authority over what they learn and how they learn it, while setting reasonable expectations
consistent with an institutional framework (e.g. school, workplace) (Papert, 1993, referred
to this as granting a student the “right to intellectual self-determination,” p. 5). This does
not negate the need for instruction, but rather puts structured learning experiences in the
context of supporting individual needs and learning goals, while at the same time
recognizing that many learning goals will necessarily be external to the individual, such as
skills needed in the workplace. This is in keeping with democratic ideals of education, such
as those proposed by Dewey (Glickman, 1996).
Experienced teachers are often able to invoke play and channel it toward achieving goals
and objectives within the curriculum. For example, Richard McAfee is a high school social
studies teacher at Central Gwinnett High School in Lawrenceville, Georgia. He uses a
variety of simulation and gaming activities in his teaching. For example, he has fully
integrated the simulation software package SimCity into a unit in his economics course.
Here is Richard’s description of the unit:
      I take the first two days to teach the SimCity software to the students because I
      learned early on that students have a difficult time mastering the controls and
      tools well enough to complete their projects in the short amount of time we have
      set aside for the unit. Although the students have a lot of freedom in deciding
      how their cities will be constructed, everyone has the goal to create a city that is
      physically sound and provides its citizens with necessary resources. In addition,
      students are required to turn in three written reports – a transportation plan, a
      city services plan, and a physical plan. It’s remarkable how seriously students
      get into the process of building a city. Good ideas and strategies are both shared
      and guarded by students. By the end of the unit, my students literally run into
      the classroom to get back to their models. Of course, there are problems and not
      all students are equally successful in building a city that runs smoothly, but I
      find I can use all the problems and successes as a means for all students to
      understand the complex economic principles at work.
The intensity, seriousness, engagement, and enjoyment that Richard reports students
experience as they complete their SimCity models is an apt description of the play process.
Richard has found a way to let his students play with SimCity within a structure that is
consistent with the curriculum objectives that he (and the school district) values. Richard’s
attempt at integrating SimCity into his teaching and evoking play behavior in his students
while they are learning economics is in stark contrast to teachers who give students
software like SimCity to play as a reward for doing their “real work.” It is important to note
that this has not been easy for Richard. It has required a deliberate restructuring of his
teaching and many hours of preparation. Of course, he could have spent that time
preparing “to teach” in the traditional way. The result would have been “traditional” as
well—the majority of students suffering through the material in order to pass the unit test. A
few would do very well, a few would fail, and the rest would be glad just to get through it. In
contrast, Richard’s approach gives students a chance to assume “ownership” of the learning
process through the act of building the model cities. The learning is richer and deeper even
though his “teaching” would be difficult to evaluate using traditional models of teacher
appraisal. Richard’s approach broadens the definition of instruction. While there is
forethought of outcomes, there is much more flexibility and opportunity to learn things that
are not predetermined. The students are responsible for learning certain things, but by
creating a playful atmosphere built on collaboration, the students come to value the learning
outcomes that Richard has set.
This paradoxical and almost contradictory situation of play being at once too complex to
fully understand and predict, yet an everyday phenomenon just waiting to emerge, is why we
have taken such an interest in microworlds, simulations, and games, especially those which
have taken such an interest in microworlds, simulations, and games, especially those which
are computer-based (see Rieber, 1992; 1993; 1996 for discussions and examples). The
characteristics of these open-ended explorable learning environments, coupled with the
processing and networking capabilities of computers, offer many opportunities for serious
play. In particular, we have come to recognize the utility of games, not just for their
motivational characteristics, but also for the way they provide structure and organization to
complex domains. There is wonderful irony in rediscovering the technology of games—they
have historical and cultural significance, but because we experience games and game-like
situations continually throughout life, we tend to take them for granted.
Games are also a way of telling stories, and stories are fundamental to both understanding
and learning. Part of the power of games lies in the fact that through them we have a
chance to take part in cultural narratives. Playing Monopoly, for instance, is an opportunity
to participate in the drama of capitalism; playing chess gives us a chance to engage in a
story of conflict and resolution. Expert teachers often use stories to teach—some would
argue that all learning comes through stories, because all understanding is best conceived
as narrative (Schank, 1990).
The digital revolution has opened new possibilities for both gaming and education. The
software market sometimes seems driven by games, usually those marketed to the power
fantasies of adolescent boys; but new kinds of gaming environments are being made
possible by the spread of personal computers. Consequently, new kinds of educational
games have also been made possible, ones in which the motivational energy of sophisticated
multimedia productions has been joined to the responsiveness of interactive learner
engagement to create a gaming space that is motivating, complex, and individualized. The
field of computer gaming is barely two decades old and our ability to use this medium well is
just beginning to mature (some have suggested that the game Myst may be the first example
of a computer game justly considered as “literature”; see Carroll, 1997).
There are two distinct applications of games in education: game playing and game
designing. Game playing is the traditional approach, in which one provides ready-made games
to students. This approach has a long history and, consequently, a well-established
literature. Game designing assumes that the act of building a game is itself a path to
learning, regardless of whether or not the game turns out to be interesting to other people.
The idea of “learning by designing” is similar to the old adage that teaching is the best way
to learn something. This approach has gained increased prominence due to the proliferation
of computer-based design and authoring tools.
Research has suggested that many instructional benefits may be derived from the use of
educational games (Dempsey, Lucassen, Gilley & Rasmussen, 1993-1994; Randel, Morris,
Wetzel & Whitehill, 1992). These benefits have been found to include improvement in
practical reasoning skills, motivational levels, and retention. Reports of the effectiveness of
educational games, measured as student involvement with the instructional task, have not
been as consistently favorable, though a breakdown of the available studies by subject
matter reveals that some knowledge domains are particularly suited to gaming, such as
mathematics and language arts (Randel et al., 1992). Learning from designing games has
received far less attention. This approach turns powerful authoring tools and design
methodologies over to the students themselves. Consider the many projects produced in
graduate-level instructional design and multimedia classes. Even if no one in the “intended
audience” learns anything from the project, the designers themselves always know a great
deal more about the project’s content from the act of building it. Learning by designing is a
central idea in constructivism (Harel & Papert, 1990, 1992; Perkins, 1986) and game design
is beginning to attract attention in the constructivist literature (Kafai, 1992, 1994a, 1994b).
Likewise, our experiences with children support game design as an authentic, meaningful
approach for students to situate school learning (Rieber, Luke & Smith, 1998).
Instructional designers also need to give serious attention to the differential exposure of
boys and girls to gaming environments (Lever, 1976). Although choices of play activities
change for both boys and girls as they grow older, gender play preference differences are
found at all age levels (Almqvist, 1989; Beato, 1997; Clarke, 1995; Krantz, 1997; Paley,
1984; Provenzo, 1981). Until recently, however, few video games were designed with female
play preferences in mind. A survey by U.S. News and World Report (1996) indicated more
than 6 million U.S. households included females between 8 and 18 with access to
multimedia computers, yet there were relatively few computer games that were even
marketed to girls. As a result, girls were not playing these games in great numbers. Thus,
with greater hands-on experience, many boys regarded aspects of computers with greater
confidence and familiarity than girls (Wajcman, 1991). However, after years of disregard, it
now appears the industry is beginning to experience a change of heart. Some experts expect
200 new games, based on research that emphasizes girls’ play preferences, to reach store
shelves by the fall of 1997 (Beato, 1997). This is a tenfold increase from 1996. For example,
the company Purple Moon is specifically targeting the market of adolescent girls with help
from video game pioneer Brenda Laurel. Companies are finally recognizing that girls have
different interests and agendas. The stereotype that girls want “easy games” is also finally
disappearing. As Krantz (1997, p. 49) notes, “Girls don’t think boys’ games are hard; they
think they’re too stupid.” If girls are to have the same technological opportunities as boys,
then teachers and parents need to seek the inclusion of computer “play” materials in the
curriculum that motivate females as well as males.
Closing
Play is an essential part of the learning process throughout life and should not be neglected.
We feel that instructional design will benefit from recognizing this fact. Play that is serious
and focused within a learning environment can help learners construct a more personalized
and reflective understanding. As educators, our challenge is to integrate motivation into
learning through play, and to recognize that play has an important cognitive role in
learning. As instructional technologists, we have the opportunity to use the expanding
power of computers to provide new venues for play in learning—as simulations,
microworlds, and especially games.
Computer games offer a new possibility for wedding motivation and self-regulated learning
within a constructivist framework, one which strives to combine both training and
education, practice and reflection, into a seamless learning experience. Computers are
making possible a new chapter to be written in the long history of games in education. The
issue of gender and learning is of particular importance to instructional technologists, since
technology is often seen as a male prerogative. Instructors and educational game designers
are beginning to have a better understanding of how gender differences affect learning, and
how to implement that understanding in better instructional design.
Research on computer programming by Sherry Turkle and Seymour Papert illustrates our
perspective on the value of play in instructional technology. Turkle and Papert’s research
(1991) contrasts two different programming styles that they describe as “hard” and “soft”
mastery. Hard mastery is compared to the clarity and control of the engineer or scientist,
while soft mastery is more like the give and take of a negotiator or artist. They equate soft
mastery to that of a bricoleur, or tinkerer. Elements are continually and playfully rearranged
to arrive at new combinations, often with unexpected results. Just as Turkle and
Papert advocate that the computer culture look beyond a single method of programming,
we advocate a variety of approaches to instructional design and learning. The value of play
should not be overlooked.
Application Exercise
          When was the last time you had serious play? What was it like? What allowed
          it to become serious play?
          Describe a time you found yourself in “flow.” What were you doing and how
          did you achieve flow?
          Randomly select one element from each of these lists: [Agriculture,
          Chemistry, Computer programming, Design skills, Math] [Toddlers, Sixth
          graders, Families, Young adults, the elderly] Using the principles from this
          chapter, design a game to teach _______ to _______.
References
Almqvist, B. (1989). Age and gender differences in children’s Christmas requests. Play and
Culture, 2, 2–19.
Burton, J. K., Moore, D. M., & Magliaro, S. G. (1996). Behaviorism and instructional
technology. In D. Jonassen (Ed.), Handbook of research for educational communications and
technology, (pp. 46–73). Washington, DC: Association for Educational Communications and
Technology.
Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical
synthesis. Review of Educational Research, 65, 245–281.
Cameron, J., & Pierce, W. D. (1994). Reinforcement, reward, and intrinsic motivation: A
meta-analysis. Review of Educational Research, 64, 363–423.
Cameron, J., & Pierce, W. D. (1996). The debate about rewards and intrinsic motivation:
Protests and accusations do not alter the results. Review of Educational Research, 66(1),
39–51.
Clarke, E. (1995, April). Popular culture images of gender as reflected through young
children’s stories. Paper presented at the American Popular Culture Association Conference,
Chicago (ERIC Document ED 338 490).
Colarusso, C. A. (1993). Play in adulthood. Psychoanalytic Study of the Child, 48, 225–245.
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper
& Row.
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human
behavior. New York: Plenum Press.
Dempsey, J., Lucassen, B., Gilley, W., & Rasmussen, K. (1993-1994). Since Malone’s theory
of intrinsically motivating instruction: What’s the score in the gaming literature? Journal of
Educational Technology Systems, 22(2), 173–183.
Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: Implications for the design and
delivery of instruction. In D. Jonassen (Ed.), Handbook of research for educational
communications and technology, (pp. 170–198). Washington, DC: Association for
Educational Communications and Technology.
Facteau, J. D., Dobbins, G. H., Russell, J. E., Ladd, R. T., & Kudisch, J. D. (1995). The
influence of general perceptions of the training environment on pretraining motivation and
perceived training and transfer. Journal of Management, 21(1), 1–25.
Glickman, C. (1996, April). Education as democracy: The pedagogy of school renewal. Paper
presented at the annual meeting of the American Educational Research Association, New
York.
Greene, D., & Lepper, M. R. (1974). Intrinsic motivation: How to turn play into work.
Psychology Today, September, 136–140.
Harel, I., & Papert, S. (1990). Software design as a learning environment. Interactive
Learning Environments, 1, 1–32.
Kafai, Y. B. (1992, April). Learning through design and play: Computer game design as a
context for children’s learning. Paper presented at the annual meeting of the American
Educational Research Association, San Francisco.
Keller, J. M., & Suzuki, K. (1988). Use of the ARCS motivation model in courseware design.
In D. Jonassen (Ed.), Instructional designs for microcomputer courseware, (pp. 401–434).
Hillsdale, NJ: Erlbaum.
Kerr, J. H., & Apter, M. J. (Eds.). (1991). Adult play: A reversal theory approach. Rockland,
MA: Swets & Zeitlinger.
Lepper, M., Greene, D., & Nisbett, R. (1973). Undermining children’s intrinsic interest with
extrinsic rewards: A test of the overjustification hypothesis. Journal of Personality and Social
Psychology, 28, 129–137.
Lepper, M. R., & Chabay, R. W. (1985). Intrinsic motivation and instruction: Conflicting
views on the role of motivational processes in computer-based education. Educational
Psychologist, 20(4), 217–230.
Lepper, M. R., Keavney, M., & Drake, M. (1996). Intrinsic motivation and extrinsic rewards:
A commentary on Cameron and Pierce’s Meta-analysis. Review of Educational Research,
66(1), 5–32.
Lever, J. (1976). Sex differences in the games children play. Social Problems, 23, 478–487.
Malone, T. W., & Lepper, M. R. (1987). Making learning fun: A taxonomy of intrinsic
motivations for learning. In R. E. Snow & M. J. Farr (Eds.), Aptitude, learning, and
instruction, III: Conative and affective process analysis, (pp. 223–253). Hillsdale, NJ:
Lawrence Erlbaum Associates.
Paley, V. G. (1984). Boys and girls: Superheroes in the doll corner. Chicago: University of
Chicago Press.
Papert, S. (1993). The children’s machine: Rethinking school in the age of the computer.
New York: BasicBooks.
Pellegrini, A. D. (Ed.). (1995). The future of play theory: A multidisciplinary inquiry into the
contributions of Brian Sutton-Smith. Albany, NY: State University of New York Press.
Randel, J. M., Morris, B. A., Wetzel, C. D., & Whitehill, B. V. (1992). The effectiveness of
games for educational purposes: A review of recent research. Simulation & Gaming, 23,
261–276.
Rieber, L. P., Luke, N., & Smith, J. (1998). Project KID DESIGNER: Constructivism at work
through play. Meridian: Middle School Computer Technology Journal [On-line], 1(1).
Available: http://www.ncsu.edu/meridian/index.html
Rieber, L. P., & Noah, D. (1997, March). Effect of gaming and graphical metaphors on
reflective cognition within computer-based simulations. Paper presented at the annual
meeting of the American Educational Research Association, Chicago.
Rigby, C. S., Deci, E. L., Patrick, B. C., & Ryan, R. M. (1992). Beyond the intrinsic-extrinsic
dichotomy: Self-determination in motivation and learning. Motivation and Emotion, 16(3),
165–185.
Ryan, R. M., Connell, J. P., & Plant, R. W. (1990). Emotions in non-directed text learning.
Learning and Individual Differences, 2, 1–17.
Schank, R. C. (1990). Tell me a story: A new look at real and artificial memory. New York:
Scribner.
Spolin, V. (1986). Theater games for the classroom: A teacher’s handbook. Chicago:
Northwestern University Press.
Turkle, S. (1984). The second self: Computers and the human spirit. New York: Simon &
Schuster.
Turkle, S., & Papert, S. (1991). Epistemological pluralism and the revaluation of the
concrete. In I. Harel & S. Papert (Eds.), Constructionism, (pp. 161–191). Norwood, NJ:
Ablex.
U.S. News & World Report. (1996). Press release [On-line]. Available:
http://www.sdsc.edu/users/woodka/games.html
Wajcman, J. (1991). Feminism confronts technology. University Park, PA: The Pennsylvania
State University Press.
Winn, W., & Snyder, D. (1996). Cognitive perspectives in psychology. In D. Jonassen (Ed.),
Handbook of research for educational communications and technology, (pp. 112–142).
Washington, DC: Association for Educational Communications and Technology.
  1. An interesting example of the value of play in the creative process at the corporate
     level can be found at Avelino Associates, a San Francisco-based organizational
     development and systems integration firm. Their intent is to create a collaboration
     between technological and artistic professionals. Multi-talented performing artists are
     hired by Avelino for creative and organizational skills that are highly transferable
     between the technological and artistic modes (DeDanan, 1997). ↵ [#return-
     footnote-229-1]
  2. In 1963, Viola Spolin in conjunction with Paul Sills founded the Second City
     Improvisational Theater and as such laid the foundation for all improvisational
companies since.
  3. There is considerable debate in the motivational literature over whether the intrinsic
     value of an activity can be undermined by the promise of external rewards, a
phenomenon often referred to as "turning play into work"—an unfortunate wording, in
     our opinion, because it promotes the misconception that play is the opposite of work.
     (See Cameron & Pierce, 1994; Cameron & Pierce, 1996; Greene & Lepper, 1974;
     Lepper, Greene & Nisbett, 1973; Lepper & Chabay, 1985; Lepper, Keavney & Drake,
1996 for examples of the research and arguments surrounding this debate.)
Suggested Citation
Rieber, L., Smith, L., & Noah, D. (2018). The Value of Serious Play. In R. E. West,
    Foundations of Learning and Instructional Design Technology: The Past, Present,
    and Future of Learning and Instructional Design Technology. EdTech Books.
    Retrieved from https://edtechbooks.org/lidtfoundations/the_value_of_serious_play
                             Lloyd Rieber
Lloyd Rieber is originally from Pittsburgh, Pennsylvania, where he was born and
raised. He is now a Professor of Learning, Design, and Technology at the
University of Georgia. He was once a public school teacher in New Mexico and
in 1987 earned his Ph.D. at Penn State. Before going to the University of
Georgia, Lloyd spent six years on the faculty at Texas A&M. Lloyd’s research
interests include simulations, games, accessibility, microworlds and
visualization.
Lola Smith
David Noah
Editor’s Note
 The following paper is provided for free on the Internet by the authors and WCER.
 For more information from Kurt Squire, see the open-access article “Changing the
 Game: What Happens When Video Games Enter the Classroom?
 [https://edtechbooks.org/-oWI],” describing one of his case studies.
 Readers may make verbatim copies of this document for noncommercial purposes
 by any means, provided that the above copyright notice appears on all copies.
 WCER working papers are available on the Internet at
http://www.wcer.wisc.edu/publications/workingPapers/index.php
 [https://edtechbooks.org/-bK].
 Shaffer, D. W., Squire, K. R., Halverson, R., & Gee, J. P. (2005). Video Games and
 the Future of Learning.
Computers are changing our world: how we work . . . how we shop . . . how we entertain
ourselves . . . how we communicate . . . how we engage in politics . . . how we care for our
health. . . . The list goes on and on. But will computers change the way we learn?
We answer: Yes. Computers are already changing the way we learn—and if you want to
understand how, look at video games. Look at video games, not because the games that are
currently available are going to replace schools as we know them any time soon, but
because they give a glimpse of how we might create new and more powerful ways to learn
in schools, communities, and workplaces—new ways to learn for a new information age.
Look at video games because, although they are wildly popular with adolescents and young
adults, they are more than just toys. Look at video games because they create new social
and cultural worlds: worlds that help people learn by integrating thinking, social interaction,
and technology, all in service of doing things they care about.
We want to be clear from the start that video games are no panacea. Like books and movies,
they can be used in antisocial ways. Games are inherently simplifications of reality, and
current games often incorporate—or are based on—violent and sometimes misogynistic
themes. Critics suggest that the lessons people learn from playing video games as they
currently exist are not always desirable. But even the harshest critics agree that we learn
something from playing video games. The question is: How can we use the power of video
games as a constructive force in schools, homes, and workplaces?
These rich virtual worlds are what make games such powerful contexts for learning. In
game worlds, learning no longer means confronting words and symbols separated from the
things those words and symbols are about in the first place. The inverse square law of
gravity is no longer something understood solely through an equation; students can gain
virtual experience walking in worlds with smaller mass than the Earth, or plan manned
space flights that require understanding the changing effects of gravitational forces in
different parts of the solar system. In virtual worlds, learners experience the concrete
realities that words and symbols describe. Through such experiences, across multiple
contexts, learners can understand complex concepts without losing the connection between
abstract ideas and the real problems they can be used to solve. In other words, the virtual
worlds of games are powerful because they make it possible to develop situated
understanding.
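The inverse-square law the authors use as their example can be made concrete with a short calculation. The sketch below (not from the chapter; purely illustrative) computes the surface gravity a player would "feel" when walking on worlds of different mass and radius, which is exactly the kind of relationship a virtual world can let learners experience rather than merely memorize:

```python
# Illustrative sketch: the inverse-square law of gravity that a learner
# might experience by "walking" on different virtual worlds.
# Surface gravity g = G * M / r^2, so a body with smaller mass and radius
# pulls on a walker with a noticeably different strength.

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

BODIES = {
    # name: (mass in kg, radius in m) -- standard published values
    "Earth": (5.972e24, 6.371e6),
    "Moon":  (7.342e22, 1.737e6),
    "Mars":  (6.417e23, 3.390e6),
}

def surface_gravity(mass: float, radius: float) -> float:
    """Acceleration due to gravity at the surface, in m/s^2."""
    return G * mass / radius**2

for name, (mass, radius) in BODIES.items():
    g = surface_gravity(mass, radius)
    print(f"{name}: g = {g:.2f} m/s^2")  # Earth ~9.8, Moon ~1.6, Mars ~3.7
```

A game engine applying this formula per-world is what lets a student feel that a sixfold drop in surface gravity follows from the mass and radius terms, not from an arbitrary designer choice.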
Although the stereotype of the gamer is a lone teenager seated in front of a computer, game
play is also a thoroughly social phenomenon. The clearest examples are massively
multiplayer online games: games where thousands of players are simultaneously online at
any given time, participating in virtual worlds with their own economies, political systems,
and cultures. But careful study shows that most games—from console action games to PC
strategy games—have robust game-playing communities. Whereas schools largely sequester
students from one another and from the outside world, games bring players together,
competitively and cooperatively, into the virtual world of the game and the social community
of game players. In schools, students largely work alone with school-sanctioned materials;
avid gamers seek out news sites, read and write FAQs, participate in discussion forums, and
most important, become critical consumers of information (Squire, in press). Classroom
work rarely has an impact outside the classroom; its only real audience is the teacher. Game
players, in contrast, develop reputations in online communities, cultivate audiences by
contributing to discussion forums, and occasionally even take up careers as professional
gamers, traders of online commodities,[1] or game modders and designers.
The virtual worlds of games are powerful, in other words, because playing games means
developing a set of effective social practices.
By participating in these social practices, game players have an opportunity to explore new
identities. In one well-publicized case, a heated political contest erupted for the presidency
of Alphaville, one of the towns in The Sims Online. Arthur Baynes, the 21-year-old
incumbent, was running against Laura McKnight, a 14-year-old. The muckraking,
accusations of voter fraud, and political jockeying taught young Laura about the realities of
politics; the election also gained national attention on National Public Radio as pundits
debated the significance of games where teens could not only argue and debate politics, but
also run a political system in which the virtual lives of thousands of real players were at
stake. The complexity of Laura’s campaign, political alliances, and platform—a platform that
called for a stronger police force and a significant restructuring of the judicial
system—shows how deep the disconnect has become between the kinds of experiences
made available in schools and those available in online worlds. The virtual worlds of games
are rich contexts for learning because they make it possible for players to experiment with
new and powerful identities (Steinkuehler, 2004b).
The communities that game players form similarly organize meaningful learning
experiences outside of school contexts. In the various Web sites devoted to the game
Civilization, for example, players organize themselves around the shared goal of developing
expertise in the game and the skills, habits, and understandings that requires. At
Apolyton.net, one such site, players post news feeds, participate in discussion forums, and
trade screenshots of the game. But they also run a radio station, exchange saved game files
in order to collaborate and compete, create custom modifications, and, perhaps most
uniquely, run their own university to teach other players to play the game more deeply.
Apolyton University shows us how part of expert gaming is developing a set of
values—values that highlight enlightened risk taking, entrepreneurship, and expertise
rather than the formal accreditation emphasized by institutional education (Squire &
Giovanetto, in press). If we look at the development of game communities, we see that part
of the power of games for learning is the way they develop shared values.
In other words, by creating virtual worlds, games integrate knowing and doing. But not just
knowing and doing. Games bring together ways of knowing, ways of doing, ways of being,
and ways of caring: the situated understandings, effective social practices, powerful
identities, and shared values that make someone an expert. The expertise might be that of a
modern soldier in Full Spectrum Warrior, a zoo operator in Zoo Tycoon, a world leader in
Civilization III. Or it might be expertise in the sophisticated practices of gaming
communities, such as those built around Age of Mythology or Civilization III.
There is a lot being learned in these games. But for some educators, it is hard to see the
educational potential in games because these virtual worlds aren’t about memorizing words,
or definitions, or facts.
But to know is a verb before it is a noun, knowledge. We learn by doing—not just by doing
any old thing, but by doing something as part of a larger community of people who share
common goals and ways of achieving those goals. We learn by becoming part of a
community of practice (Lave & Wenger, 1991) and thus developing that community’s ways
of knowing, acting, being, and caring—the community’s situated understandings, effective
social practices, powerful identities, and shared values.
Of course, different communities of practice have different ways of thinking and acting.
Take, for example, lawyers. Lawyers act like lawyers. They identify themselves as lawyers.
They are interested in legal issues. And they know about the law. These skills, habits, and
understandings are made possible by looking at the world in a particular way—by thinking
like a lawyer. The same is true for doctors but through a different way of thinking. And for
architects, plumbers, steelworkers, and waiters as much as for physicists, historians, and
mathematicians.
Let’s look at an example of how this might play out in the virtual world of a video game. Full
Spectrum Warrior (Pandemic Studios, for PC and Xbox) is a video game based on a U.S.
Army training simulation.[2] But Full Spectrum Warrior is not a mere first-
person shooter in which the player blows up everything on the screen. To survive and win
the game, the player has to learn to think and act like a modern professional soldier.
In Full Spectrum Warrior, the player uses the buttons on the controller to give orders to two
squads of soldiers, as well as to consult a GPS device, radio for support, and communicate
with rear area commanders. The instruction manual that comes with the game makes it
clear from the outset that players must take on the values, identities, and ways of thinking
of a professional soldier to play the game successfully: “Everything about your squad,” the
manual explains, “is the result of careful planning and years of experience on the battlefield.
Respect that experience, soldier, since it’s what will keep your soldiers alive” (p. 2).
In the game, that experience—the skills and knowledge of professional military expertise—is
distributed between the virtual soldiers and the real-world player. The soldiers in the
player’s squads have been trained in movement formations; the role of the player is to select
the best position for them on the field. The virtual characters (the soldiers) know part of the
task (various movement formations), and the player knows another part (when and where to
engage in such formations). This kind of distribution holds for every aspect of military
knowledge in the game. However, the knowledge that is distributed between virtual soldiers
and real-world player is not a set of inert facts; what is distributed are the values, skills,
practices, and (yes) facts that constitute authentic military professional practice. This
simulation of the social context of knowing allows players to act as if in concert with
(artificially intelligent) others, even within the single-player context of the game.
In so doing, Full Spectrum Warrior shows how games take advantage of situated learning
environments. In games as in real life, people must be able to build meanings on the spot as
they navigate their contexts. In Full Spectrum Warrior, players learn about suppression fire
through the concrete experiences they have while playing. These experiences give a
working definition of suppression fire, to be sure. But they also let a player come to
understand how the idea applies in different contexts, what it has to do with solving
particular kinds of problems, and how it relates to other practices in the domain, such as the
injunction against shooting while moving.
Video games thus make it possible to “learn by doing” on a grand scale—but not just by
wandering around in a rich computer environment to learn without any guidance. Asking
learners to act without explicit guidance—a form of learning often associated with a loose
interpretation of progressive pedagogy—reflects a bad theory of learning. Learners are
novices. Leaving them to float in rich experiences with no support triggers the very real
human penchant for finding creative but spurious patterns and generalizations. The fruitful
patterns or generalizations in any domain are the ones that are best recognized by those
who already know how to look at the domain and know how complex variables in the domain
interrelate. And this is precisely what the learner does not yet know. In Full Spectrum
Warrior, in contrast, the player is immersed in activity, values, and ways of seeing. But the
player is guided and supported by the knowledge built into the virtual soldiers and the
weapons, equipment, and environments in the game. Players are not left free to invent
everything for themselves. To succeed in the game, they must live by—and ultimately
master—the epistemic frame of military doctrine.
Full Spectrum Warrior immerses the player in the activities, values, and ways of seeing—the
epistemic frame—of a modern soldier. In this sense, it is an example of what we suggest is
the promise of video games and the future of learning: the development of epistemic games
(Shaffer, in press).
Initiation
Developing games such as Full Spectrum Warrior that simultaneously build situated
understandings, effective social practices, powerful identities, shared values, and ways of
thinking is clearly no small task. But the good news is that in many cases existing
communities of practice have already done a lot of that work. Doctors know how to create
more doctors; lawyers know how to create more lawyers; the same is true for a host of other
socially valued communities of practice. Thus, we can imagine epistemic games in which
players learn biology by working as a surgeon, history by writing as a journalist,
mathematics by designing buildings as an architect or engineer, geography by fighting as a
soldier, or French by opening a restaurant—or more precisely, by inhabiting virtual worlds
based on the way surgeons, journalists, architects, soldiers, and restaurateurs develop their
epistemic frames.
To build such games requires understanding how practitioners develop their ways of
thinking and acting. Such understanding is uncovered through epistemographies of
practice: detailed ethnographic studies of how the epistemic frame of a community of
practice is developed by new members. That is more work than is currently invested in most
“educational” video games. But the payoff is that such work can become the basis for an
alternative educational model. Video games based on the training of socially valued
practitioners let us begin to build an educational system in which students learn to work
(and thus to think) as doctors, lawyers, architects, engineers, journalists, and other
important members of the community. The purpose of building such educational systems is
not to train students for these pursuits in the traditional sense of vocational education.
Rather, we develop those epistemic frames because they can provide students with an
opportunity to see the world in a variety of ways that are fundamentally grounded in
meaningful activity and well aligned with the core skills, habits, and understandings of a
postindustrial society (Shaffer, 2004b).
One early example of such a game is Madison 2200, an epistemic game based on the
practices of urban planning (Beckett & Shaffer, in press; Shaffer, in press). In Madison
2200, players learn about urban ecology by working as urban planners to redesign a
downtown pedestrian mall popular with local teenagers. Players get a project directive from
the mayor, addressed to them as city planners, including a city budget plan and letters from
concerned citizens about crime, revenue, jobs, waste, traffic, and affordable housing. A
video features interviews with local residents, business people, and community leaders
about these issues. Players conduct a site assessment of the street and work in teams to
develop a land use plan, which they present at the end of the game to a representative from
the city planning office.
Not surprisingly, along the way players learn something about urban planning and its
practices. But something very interesting happens in an epistemic game like Madison 2200.
When knowledge is first and foremost a form of activity and experience—of doing something
in the world within a community of practice—the facts and information eventually come for
free. A large body of facts that resists out-of-context memorization and rote learning comes
easily if learners are immersed in activities and experiences that use these facts for plans,
goals, and purposes within a coherent knowledge domain. Data show that in Madison 2200,
players form—or start to form—an epistemic frame of urban planning. But they also develop
their understanding of ecology and are able to apply it to urban issues. As one player
commented: “I really noticed how [urban planners] have to . . . think about building things .
. . like, urban planners also have to think about how the crime rate might go up, or the
pollution or waste, depending on choices.” Another said about walking on the same streets
she had traversed before the workshop: “You notice things, like, that’s why they build a
house there, or that’s why they build a park there.”
The players in Madison 2200 do enjoy their work. But more important is that the experience
lets them inhabit an imaginary world in which they are urban planners. The world of
Madison 2200 recruits these players to new ways of thinking and acting as part of a new
way of seeing the world. Urban planners have a particular way of addressing urban issues.
By participating in an epistemic game based on urban planning, players begin to take on
that way of seeing the world. As a result, it is fun, too.
Transformation
Games like Full Spectrum Warrior and Madison 2200 expose novices to the ways
professionals make sense of typical problems. Other games are designed to transform the
ways of thinking of a professional community, focusing instead on atypical problems: places
where ways of knowing break down in the face of a new or challenging situation.
Just as games that initiate players into an epistemic frame depend on epistemographic study
of the training practices of a community, games designed to transform an epistemic frame
depend on detailed examination of how the mature epistemic frame of a practice is
organized and maintained—and on when and how the frame becomes problematic. These
critical moments of expectation failure (Schank, 1997) are the points of entry for
reorganizing experienced practitioners’ ways of thinking. Building the common assumptions
of an existing epistemic frame into a game allows experienced professionals to cut right to
the key learning moments.
For example, work on military leadership simulations has used goal-based scenarios
(Schank, 1992; Schank, Fano, Bell, & Jona, 1994) to build training simulations based on the
choices military leaders face when setting up a base of operations (Gordon, 2004). In the
business world, systems like RootMap (Root Learning, http://www.rootlearning.com) create
graphical representations of professional knowledge, offering suggestions for new practice
by surfacing breakdowns in conventional understanding (Squire, 2005). Studies of school
leaders similarly suggest that the way professionals frame problems has a strong impact on
the possible solutions they are willing and able to explore (Halverson, 2003, 2004). This
ability to successfully frame problems in complex systems is difficult to cultivate, but
Halverson and Rah (2004) have shown that a multimedia representation of successful
Although there are not yet any complete epistemic games in wide circulation, there already
exist many games that provide similar opportunities for deeply situated learning. Rise of
Nations and Civilization III offer rich, interactive environments in which to explore
counterfactual historical claims and help players understand the operation of complex
historical modeling. Railroad Tycoon lets players engage in design activities that draw on
the same economic and geographic issues faced by railroad engineers in the 1800s. Madison
2200 shows the pedagogical potential of bringing students the experience of being city
planners, and we are in the process of developing projects that similarly let players work as
biomechanical engineers (Svarovsky & Shaffer, in press), journalists (Shaffer, 2004b),
professional mediators (Shaffer, 2004c), and graphic designers (Shaffer, 1997). Other
epistemic games might involve players experiencing the world as an evolutionary biologist
or as a tailor in colonial Williamsburg (Squire & Jenkins, 2004).
But even if we had the world’s best educational games produced and ready for parents,
teachers, and students to buy and play, it’s not clear that most educators or schools would
know what to do with them. Although the majority of students play video games, the
majority of teachers do not. Games, with their antiauthoritarian aesthetics and inherently
anti-Puritanical values, can be seen as challenging institutional education. Even if we strip
aside the blood and guts that characterize some video games, the reality is that as a form,
games encourage exploration, personalized meaning-making, individual expression, and
playful experimentation with social boundaries—all of which cut against the grain of the
social mores valued in school. In other words, even if we sanitize games, the theories of
learning embedded in them run counter to the current social organization of schooling. The
next challenge for game and school designers alike is to understand how to shape learning
and learning environments based on the power and potential of games—and how to
integrate games and game-based learning environments into the predominant arena for
learning: schools.
How might school leaders and teachers bring more extended experiments with epistemic
games into the culture of the school? The first step will be for superintendents and public
spokespersons to move beyond the rhetoric of games as violent-serial-killer-inspiring-time-
wasters and address the range of learning opportunities that games present. Understanding
how games can provide powerful learning environments might go a long way toward shifting
the current anti-gaming rhetoric. Although epistemic games of the kind we describe here
are not yet on the radar of most educators, they are already being used by corporations, the
government, the military, and even by political groups to express ideas and teach facts,
principles, and world views. Schools and school systems must soon follow suit or risk being
swept aside.
Thus, we argue that to understand the future of learning, we have to look beyond schools to
the emerging arena of video games. We suggest that video games matter because they
present players with simulated worlds: worlds that, if well constructed, are not just about
facts or isolated skills, but embody particular social practices. And we argue that video
games thus make it possible for players to participate in valued communities of practice and
as a result develop the ways of thinking that organize those practices.
Our students will learn from video games. The questions are: Who will create these games,
and will they be based on sound theories of learning and socially conscious educational
practices? The U.S. Army, a longtime leader in simulations, is building games like Full
Spectrum Warrior and America’s Army—games that introduce civilians to military ideology.
Several homeland security games are under development, as are a range of games for
health education, from games to help kids with cancer take better care of themselves, to
simulations to help doctors perform surgery more effectively. Companies are developing
games for learning history (Making History), engineering (Time Engineers), and the
mathematics of design (Homes of Our Own) (Squire & Jenkins, 2004).
This interest in games is encouraging, but most educational games to date have been
produced in the absence of any coherent theory of learning or underlying body of research.
We need to ask and answer important questions about this relatively new medium. We need
to understand how the conventions of good commercial games create compelling virtual
worlds. We need to understand how inhabiting a virtual world develops situated
knowledge—how playing a game like Civilization III, for example, mediates players’
conceptions of world history. We need to understand how spending thousands of hours
participating in the social, political, and economic systems of a virtual world develops
powerful identities and shared values (Squire, 2004). We need to understand how game
players develop effective social practices and skills in navigating complex systems, and how
those skills can support learning in other complex domains. And most of all, we need to
leverage these understandings to build games that develop for players the epistemic frames
of scientists, engineers, lawyers, political activists, and other valued communities of
practice—as well as games that can help transform those practices for experienced
professionals.
Video games have the potential to change the landscape of education as we know it. The
answers to fundamental questions such as these will make it possible to use video games to
move our system of education beyond the traditional academic disciplines—derived from
medieval scholarship and constituted within schools developed in the industrial
revolution—and towards a new model of learning through meaningful activity in virtual
worlds as preparation for meaningful activity in our postindustrial, technology-rich, real
world.
Application Exercises
          Create a rough outline of your idea for an educational video game. What
          would students learn? Would there be opportunities for social connection?
          How would you hope to see information transfer?
          Play a game on two different electronic platforms – iPhone, iPad, computer,
          gaming console, etc. – and share your thoughts on the educational value of
          each medium you used. Also share what limitations each medium
          presents.
References
Beckett, K. L., & Shaffer, D. W. (in press). Augmented by reality: The pedagogical praxis of
Gee, J. P. (in press). What will a state of the art video game look like? Innovate.
Halverson, R. (2003). Systems of practice: How leaders use artifacts to create professional
community in schools. Education Policy Analysis Archives, 11(37).
Halverson, R., & Rah, Y. (2004). Representing leadership for social justice: The case of
Franklin School. Under review by Journal of Cases in Educational Leadership.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation.
Cambridge, UK: Cambridge University Press.
Schank, R. C. (1992). Goal-based scenarios (Technical Report No. 36). Evanston, IL:
Northwestern University, The Institute for the Learning Sciences.
Schank, R. C., Fano, A., Bell, B., & Jona, M. (1994). The design of goal-based scenarios.
Journal of the Learning Sciences, 3, 305–345.
Shaffer, D. W. (2004a). Epistemic frames and islands of expertise: Learning from infusion
experiences. In Y. Kafai, W. A. Sandoval, N. Enyedy, A. S. Nixon, & F. Herrera (Eds.),
Proceedings of the Sixth International Conference of the Learning Sciences (pp. 473–480).
Mahwah, NJ: Erlbaum.
Sizer, T. R. (1984). Horace’s compromise: The dilemma of the American high school. Boston:
Houghton Mifflin.
Squire, K. (2004). Sid Meier’s Civilization III. Simulations and Gaming, 35(1).
Squire, K. (2005). Game-based learning: Present and future state of the field. Retrieved
May 31, 2005, from https://edtechbooks.org/-Fn
Squire, K., & Giovanetto, L. (in press). The higher education of gaming. eLearning.
Squire, K., & Jenkins, H. (2004). Harnessing the power of games in education. Insight, 3(1),
5–33.
Steinkuehler, C. A. (2004a, October). Emergent play. Paper presented at the State of Play
Conference, New York University Law School, NY.
What is Gamification?
 “What is Gamification? This video provides a few ideas about gamification. The
 video [https://youtu.be/BqyvUvxOx0M] defines the term gamification, talks about
 the two types of gamification (structural and content) and gives an example of each
 type. The focus is mostly on educational uses of gamification.” — Karl Kapp
[https://edtechbooks.org/-PPD]
1. As Julian Dibbell, a journalist for Wired and Rolling Stone, has shown, it is possible to
   make a better living by trading online currencies than by working as a freelance
   journalist!
2. The commercial game retains about 15% of what was in the Army’s original
   simulation. For more on this game as a learning environment, see Gee (in press).
Suggested Citation
Shaffer, D. W., Halverson, R., Squire, K., & Gee, J. P. (2018). Video Games and the
Future of Learning. In R. E. West, Foundations of Learning and Instructional
Design Technology: The Past, Present, and Future of Learning and Instructional
Design Technology. EdTech Books. Retrieved from
https://edtechbooks.org/lidtfoundations/video_games_and_the_future_of_learning
                   David Williamson Shaffer
Dr. David Shaffer was born in New York City, New York. He is currently a
professor of educational technology at the University of Wisconsin–Madison. He
received his M.S. and Ph.D from the Massachusetts Institute of Technology’s
Media Laboratory. Prior to joining the University of Wisconsin–Madison, Dr.
Shaffer taught grades 4-12 both in the United States and abroad. Dr. Shaffer is
best known for his research involving computer games and learning. In 2008, he
founded an educational game development and consulting firm known as
EFGames, LLC.
                         Richard Halverson
                              Kurt Squire
                             James P. Gee
40. Educational Data Mining and Learning Analytics
Editor’s Note
    The following was reprinted from Emergence and Innovation in Digital Learning
    [https://edtechbooks.org/-uG], an open textbook edited by George Veletsianos.
    Baker, S., & Inventado, P. S. (2016). Educational data mining and learning
    analytics: Potentials and possibilities for online education. In G. Veletsianos (Ed.),
    Emergence and Innovation in Digital Learning (pp. 83–98).
    doi:10.15215/aupress/9781771991490.01
Over the last decades, online and distance education has become an increasingly prominent
part of the higher educational landscape (Allen & Seaman, 2008; O’Neill et al., 2004; Patel
& Patel, 2005). Many learners turn to distance education because it works better for their
schedule, and makes them feel more comfortable than traditional face-to-face courses
(O’Malley & McCraw, 1999). However, working with distance education presents challenges
for both learners and instructors that are not present in contexts where teachers can work
directly with their students. As learning is mediated through technology, learners have
fewer opportunities to communicate to instructors about areas in which they are struggling.
Though discussion forums provide an opportunity that many students use, and in fact some
students are more comfortable seeking help online than in person (Kitsantas & Chow, 2007),
discussion forums depend upon learners themselves realizing that they are facing a
challenge, and recognizing the need to seek help. Further, many students do not participate
in forums unless given explicit prompts or requirements (Dennen, 2005). Unfortunately, the
challenges of help-seeking are general: many learners, regardless of setting, do not
successfully recognize the need to seek help, and fail to seek help in situations where it
could be extremely useful (Aleven et al., 2003). Without the opportunity to interact with
learners in a face-to-face setting, it is also harder for instructors to recognize
negative affect or disengagement among students.
In this chapter, we discuss educational data mining and learning analytics (Baker &
Siemens, 2014) as a set of emerging practices that may assist distance education instructors
in gaining a rich understanding of their students. The educational data mining (EDM) and
learning analytics (LA) communities are concerned with exploring the increasing amounts of
data now becoming available on learners, toward providing better information to instructors
and better support to learners. Through the use of automated discovery methods, leavened
with a workable understanding of educational theory, EDM/LA practitioners are able to
generate models that identify at-risk students so as to help instructors to offer better
learner support. In the interest of provoking thought and discussion, we focus on a few key
examples of the potentials of analytics, rather than exhaustively reviewing the increasing
literature on analytics and data mining for distance education.
Massive Open Online Courses (MOOCs), another emerging distance education practice, also
generate large quantities of data that can be utilized for these purposes. There have been
dozens of papers exploiting MOOC data to answer research questions in education in the
brief time since large-scale MOOCs became internationally popular (see, for instance,
Champaign et al., 2014; Kim et al., 2014; Kizilcec et al., 2013). The second-largest MOOC
platform, edX, now makes large amounts of MOOC data available to any researcher in the
world. In addition, formats have emerged for MOOC data that are designed to facilitate
research (Veeramachaneni, Dernoncourt, Taylor, Pardos, & O’Reilly, 2013).
Increasingly, traditional universities are collecting the same types of data. For example,
Purdue University collects and integrates educational data from various systems including
content management systems (CMS), student information systems (SIS), audience response
systems, library systems, and streaming media service systems (Arnold, 2010). The
university uses these data in its Course Signals project, discussed below.
One of the key steps to making data useful for analysis is to pre-process it (Romero, Romero,
& Ventura, 2013). Pre-processing can include data cleaning (such as removing data
stemming from logging errors, or mapping meaningless identifiers to meaningful labels),
integrating data sources (typically taking the form of mapping identifiers—which could be at
the student level, the class level, the assignment level or other levels—between data sets or
tables), and feature engineering (distilling appropriate data to make a prediction). Typically,
the process of engineering and distilling appropriate features that can be used to represent
key aspects of the data is one of the most time-consuming and difficult steps in learning
analytics. The process of going from the initial features logged by an online learning system
(such as correctness and time, or the textual content of a post) to more semantic features
(history of correctness on a specific skill; how fast an action is compared to typical time
taken by other students on the same problem step; emotion expressed and context in a
discussion of a specific discussion forum post) involves considerable theoretical
understanding of the educational domain. This understanding is sometimes encoded in
schemes for formatting and storing data, such as the MOOC data format proposed by
Veeramachaneni et al. (2013) or the Pittsburgh Science of Learning Center DataShop
format (Koedinger, Baker, Cunningham, Skogsholm, Leber, & Stamper, 2010).
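As a concrete illustration, the cleaning and feature-engineering steps described above can be sketched in a few lines of Python. The log format, field names, and the two distilled features (correctness rate, and speed relative to the typical time on a step) are hypothetical stand-ins for illustration, not the DataShop or MOOC formats themselves.

```python
from collections import defaultdict
from statistics import median

def preprocess(raw_events):
    """Clean raw log events, then distill per-student features."""
    # Data cleaning: drop events missing a student id or a 0/1 outcome
    # (e.g., rows produced by logging errors).
    events = [e for e in raw_events
              if e.get("student") and e.get("correct") in (0, 1)]

    # Typical (median) time per problem step across all students.
    step_times = defaultdict(list)
    for e in events:
        step_times[e["step"]].append(e["seconds"])
    typical = {step: median(ts) for step, ts in step_times.items()}

    # Feature engineering: per-student correctness rate, and speed
    # relative to the typical time other students take on the same steps.
    stats = defaultdict(lambda: {"n": 0, "correct": 0, "speed": []})
    for e in events:
        s = stats[e["student"]]
        s["n"] += 1
        s["correct"] += e["correct"]
        s["speed"].append(e["seconds"] / typical[e["step"]])
    return {sid: {"correct_rate": s["correct"] / s["n"],
                  "relative_speed": sum(s["speed"]) / len(s["speed"])}
            for sid, s in stats.items()}

# Hypothetical raw events, including one malformed row to be cleaned out.
raw_events = [
    {"student": "s1", "step": "A", "correct": 1, "seconds": 10},
    {"student": "s2", "step": "A", "correct": 0, "seconds": 20},
    {"student": "s1", "step": "B", "correct": 1, "seconds": 5},
    {"student": None, "step": "A", "correct": 1, "seconds": 3},
]
student_features = preprocess(raw_events)
```

Even in this toy version, most of the code is devoted to going from logged events to semantic features, which reflects where the real effort in pre-processing tends to lie.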
The two communities differ somewhat in emphasis, for instance in whether
automated interventions or empowering instructors is the goal. However, for the purposes
of this article, educational data mining and learning analytics can be treated as
interchangeable, as the methods relevant to distance education are seen in both
communities. Some of the differences emerge in the section on uses to benefit learners, with
the approaches around providing instructors with feedback being more closely linked to the
learning analytics community, whereas approaches to providing feedback and interventions
directly to students are more closely linked to practice in educational data mining.
In this section, we review the framework proposed by Baker and Siemens (2014); other
frameworks for understanding the types of EDM/LA method also exist (e.g., Baker & Yacef,
2009; Scheuer & McLaren, 2012; Romero & Ventura, 2007; Ferguson, 2012). The
differences between these frameworks are a matter of emphasis and categorization. For
example, parameter tuning is categorized as a method in Scheuer and McLaren (2012); it is
typically seen as a step in the prediction modeling or knowledge engineering process in
other frameworks. Still, mostly the same methods are present in all frameworks. Baker and
Siemens (2014) divide the world of EDM/LA methods into prediction modeling, structure
discovery, relationship mining, distillation of data for human judgment, and discovery with
models. In this chapter, we will provide definitions and examples for prediction, structure
discovery, and relationship mining, focusing on methods of particular usefulness for
distance education.
Prediction
Prediction modeling occurs when a researcher or practitioner develops a model that can
infer (or predict) a single aspect of the data from some combination of other variables
within the data. This is typically done either to infer a construct that is latent (such as
emotion), or to predict future outcomes. In these cases, good data on the predicted variable
is collected for a smaller data set, and then a model is created with the goal of predicting
that variable in a larger data set, or a future data set. The goal is to predict the construct in
future situations when data on it is unavailable. For example, a prediction model may be
developed to predict whether a student is likely to drop or fail a course (e.g., Arnold, 2010;
Ming & Ming, 2012). The prediction model may be developed from 2013 data, and then
utilized to make predictions early in the semester in 2014, 2015, and beyond. Similarly, the
model may be developed using data from four introductory courses, and then rolled out to
make predictions within a university’s full suite of introductory courses.
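The year-over-year workflow just described (fit on an earlier cohort, apply to later ones) can be sketched minimally. The features (week-one logins and assignments submitted), labels, and data below are invented for illustration; a real model would be trained on far richer distilled features.

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient logistic regression (weights + bias)."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1 / (1 + math.exp(-z))
            g = p - yi                          # gradient of the log-loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict_risk(model, x):
    """Probability that a student with features x drops or fails."""
    w, b = model
    return 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))

# Hypothetical 2013 cohort: [logins in week 1, assignments submitted],
# label 1 = dropped or failed the course.
X_2013 = [[1, 0], [2, 1], [0, 0], [9, 4], [8, 5], [10, 5]]
y_2013 = [1, 1, 1, 0, 0, 0]
model = fit_logistic(X_2013, y_2013)

# Score incoming 2014 students early in the semester.
high_risk = predict_risk(model, [1, 0])   # low early activity
low_risk = predict_risk(model, [9, 5])    # high early activity
```

The key property shown is that the model is built once, on data where the outcome is known, and then reused on students whose outcome has not yet occurred.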
Prediction modeling has been utilized for an ever-increasing set of problems within the
domain of education, from inferring students’ knowledge of a certain topic (Corbett &
Anderson, 1995), to inferring a student’s emotional state (D’Mello, Craig, Witherspoon,
McDaniel, & Graesser, 2008). It is also used to make longer-term predictions, for instance
predicting whether a student will attend college from their learning and emotion in middle
school (San Pedro, Baker, Bowers, & Heffernan, 2013).
One key consideration when using prediction models is distilling the appropriate data to
make a prediction (sometimes referred to as feature engineering). Sao Pedro et al. (2012)
have argued that integrating theoretical understanding into the data mining process leads
to better models than a purely bottom-up data-driven approach. Paquette, de Carvalho,
Baker, and Ocumpaugh (2014) correspondingly find that integrating theory into data mining
performs better than either approach alone. While choosing an appropriate algorithm is also
an important challenge (see discussion in Baker, 2014), switching algorithms often involves
a minimal change within a data mining tool, whereas distilling the correct features can be a
substantial challenge.
Another key consideration is making sure that data is validated appropriately for its
eventual use. Validating models on a range of content (Baker, Corbett, Roll, & Koedinger,
2008) and on a representative sample of eventual students (Ocumpaugh, Baker, Gowda,
Heffernan & Heffernan, 2014) is important to ensuring that models will be valid in the
contexts where they are applied. In the context of distance education, these issues can
merge: the population of students taking one course through a distance institution may be
quite different than the population taking a different course, even at the same institution.
Some prediction models have been validated to function accurately across higher education
institutions, which is a powerful demonstration of generality (Jayaprakash, Moody, Lauría,
Regan, & Baron, 2014).
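The population-validity concern can be made concrete with a toy check: a detector that looks accurate on the population it was built for may fail on a course whose activity norms differ. The detector rule, login counts, and labels below are invented for illustration.

```python
def flags_at_risk(student):
    """A toy at-risk rule tuned on course A: flag low-activity students."""
    return student["logins"] < 3

def accuracy(detector, labeled_students):
    """Fraction of students whose risk label the detector gets right."""
    return sum(detector(s) == at_risk
               for s, at_risk in labeled_students) / len(labeled_students)

# Hypothetical labeled data from two populations with different norms:
# in course B even struggling students log in more often.
course_a = [({"logins": 1}, True), ({"logins": 8}, False), ({"logins": 2}, True)]
course_b = [({"logins": 4}, True), ({"logins": 9}, False)]

acc_a = accuracy(flags_at_risk, course_a)   # looks perfect in-population
acc_b = accuracy(flags_at_risk, course_b)   # chance-level on a new population
```

Evaluating separately on each population of interest, as in the second call, is the minimal safeguard before rolling a model out broadly.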
As with other areas of education, prediction modeling increasingly plays an important role
in distance education. Arguably, it is the most prominent type of analytics within higher
education in general, and distance education specifically. For example, Ming and Ming
(2012) studied whether students’ final grades could be predicted from their interactions on
the University of Phoenix class discussion forums. They found that discussion of more
specialized topics was predictive of higher course grades. Another example is seen in
Kovacic’s (2010) work studying student dropout in the Open Polytechnic of New Zealand.
This work predicted student dropout from demographic factors, finding that students of
specific demographic groups were at much higher risk of failure than other students.
Related work can also be seen within the Purdue Signals Project (Arnold, 2010), which
mined content management system, student information system, and gradebook data to
predict which students were likely to drop out of a course and provide instructors with near
real-time updates regarding student performance and effort (Arnold & Pistilli, 2012;
Campbell, DeBlois, & Oblinger, 2007). These predictions were used to suggest interventions
to instructors. Instructors who used these interventions (reminding students of the steps
needed for success and recommending face-to-face meetings) found that their students
engaged in more help-seeking and had better course outcomes and significantly improved
retention rates (Arnold, 2010).
Structure Discovery
In structure discovery, the goal is to find structure in the data without an a priori idea of what
should be found: a very different goal than in prediction. In prediction, there is a specific variable that
the researcher or practitioner attempts to infer or predict; by contrast, there are no specific
variables of interest in structure discovery. Instead, the researcher attempts to determine
what structure emerges naturally from the data. Common approaches to structure discovery
in LA/EDM include clustering, factor analysis, network analysis, and domain structure
discovery.
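Clustering, the most familiar of these methods, can be sketched with a minimal k-means over per-student behavior features. The features (hours online, forum posts) and values are hypothetical; note that no outcome variable appears anywhere, which is what distinguishes this from prediction.

```python
def dist2(a, b):
    """Squared Euclidean distance between two feature tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    """Componentwise mean of a list of feature tuples."""
    return tuple(sum(xs) / len(pts) for xs in zip(*pts))

def kmeans(points, k, iters=20):
    centers = list(points[:k])              # simple deterministic seeding
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                    # assign to nearest center
            nearest = min(range(k), key=lambda j: dist2(p, centers[j]))
            clusters[nearest].append(p)
        centers = [mean(c) if c else centers[j]   # recompute centers
                   for j, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical (hours online, forum posts) per student: two behavioral
# profiles emerge from the data itself, with no labels supplied.
students = [(1, 0), (2, 1), (1, 1), (9, 8), (10, 9), (8, 10)]
centers, clusters = kmeans(students, k=2)
```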
Social network analysis (SNA) has been used for a number of applications in education. For example, Kay,
Maisonneuve, Yacef, and Reimann (2006) used SNA to understand the differences between
effective and ineffective project groups, through visual analysis of the strength of group
connections. Although this project took place in the context of a face-to-face university class,
the data analyzed was from online collaboration tools that could have been used at a
distance. SNA has also been used to study how students’ communication behaviors in
discussion forums change over time (Haythornthwaite, 2001), and to study how students’
positions in a social network relate to their perception of being part of a learning community
(Dawson, 2008), a key concern for distance education. Patterns of interaction and
connectivity in learning communities are correlated with academic success as well as learner
sense of engagement in a course (Macfadyen & Dawson, 2010; Suthers & Rosen, 2011).
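A minimal sketch of this kind of analysis, assuming a hypothetical log of who replied to whom in a discussion forum: build the interaction graph, then compute degree centrality, a simple measure of how connected each student is to the rest of the community.

```python
from collections import defaultdict

def build_graph(replies):
    """Undirected interaction graph from (replier, original poster) pairs."""
    adj = defaultdict(set)
    for a, b in replies:
        adj[a].add(b)
        adj[b].add(a)
    return adj

def degree_centrality(adj):
    """Fraction of the other participants each student interacts with."""
    n = len(adj)
    return {student: len(neighbors) / (n - 1)
            for student, neighbors in adj.items()}

# Hypothetical forum reply log for one discussion thread.
replies = [("ana", "ben"), ("cam", "ben"), ("dia", "ben"), ("ana", "cam")]
centrality = degree_centrality(build_graph(replies))
# "ben" anchors this discussion; peripheral students such as "dia" may
# feel less connected to the learning community (cf. Dawson, 2008).
```

Richer SNA measures (betweenness, clique detection, visualization of tie strength as in Kay et al., 2006) build on exactly this graph representation.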
Relationship Mining
Association rule mining finds if-then rules indicating that when one variable takes a certain value,
another variable is likely to take a characteristic value. Association rule mining has found a
wide range of applications in educational data mining, as well as in data mining and e-
commerce more broadly. For example, Ben-Naim, Bain, and Marcus (2009) used association
rule mining to find what patterns of performance were characteristic of successful students,
and used their findings as the basis of an engine that made recommendations to students.
García, Romero, Ventura, and De Castro (2009) used association rule mining on data from
exercises, course forum participation, and grades in an online course, in order to gather
information for making recommendations to courseware authors.
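The two statistics at the heart of association rule mining can be computed directly: support (how often the rule's full pattern occurs) and confidence (how often the consequent holds when the antecedent does). The per-student attributes below are hypothetical.

```python
def rule_stats(records, antecedent, consequent):
    """Support and confidence of the rule: antecedent -> consequent."""
    n_ante = sum(antecedent <= r for r in records)        # antecedent holds
    n_both = sum((antecedent | consequent) <= r for r in records)
    support = n_both / len(records)
    confidence = n_both / n_ante if n_ante else 0.0
    return support, confidence

# Hypothetical per-student attribute sets distilled from course logs.
records = [
    {"posts_often", "submits_early", "passed"},
    {"posts_often", "passed"},
    {"posts_often", "submits_early", "passed"},
    {"rarely_logs_in"},
]
support, confidence = rule_stats(records, {"posts_often"}, {"passed"})
# support = 0.75, confidence = 1.0 for "posts_often -> passed" here
```

Full algorithms such as Apriori add an efficient search over candidate rules, but each rule they report is scored with exactly these two quantities.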
Finally, correlation mining is the area of data mining that attempts to find simple linear
relationships between pairs of variables in a data set. Typically, in correlation mining,
approaches such as post-hoc statistical corrections are used to set a threshold on which
patterns are accepted; dimensionality reduction methods are also sometimes used to first
group variables before trying to correlate them to other variables. Correlation mining
methods may be useful in situations where there are a range of variables describing
distance education and a range of student outcomes, and the goal is to figure out an overall
pattern of which variables correspond to many successful outcomes rather than just a single
one.
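The post-hoc correction step mentioned above can be sketched with the Benjamini–Hochberg procedure, a common way to control the false discovery rate when testing many correlations at once. The p-values are invented for illustration.

```python
def benjamini_hochberg(pvals, fdr=0.05):
    """Indices of hypotheses accepted while controlling the FDR."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * fdr:
            k = rank                  # largest rank meeting the BH bound
    return sorted(order[:k])

# Hypothetical p-values from correlating many behavior variables with a
# course-success measure; only the strongest relationships survive.
pvals = [0.001, 0.30, 0.02, 0.04, 0.80]
accepted = benjamini_hochberg(pvals)   # indices of surviving correlations
```

Without such a correction, testing dozens of candidate variables against an outcome would be expected to yield several spuriously "significant" correlations by chance alone.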
Automated feedback to students about their learning and performance has a rich history
within online education. Many distance education courses today offer immediate
correctness feedback on pop-up quizzes or other problem-solving exercises (see Janicki &
Liegle, 2001; Jiang et al., 2014), as well as indicators of course progress. Research suggests
that providing distance education students with visualizations of their progress toward
completing competencies can lead to better outcomes (Grann & Bushway, 2014). Work in
recent decades in intelligent tutoring systems and other artificially intelligent technologies
shows that there is the potential to provide even more comprehensive feedback to learners.
In early work in this area, Cognitive Tutors for mathematics showed students “skill bars,”
giving indicators to students of their progress based on models of student knowledge
(Koedinger, Anderson, Hadley, & Mark, 1997). Skill bars have since been extended to
communicate hypotheses of what misconceptions the students may have (Bull, Quigley, &
Mabbott, 2006). Other systems give students indicators of their performance across a
semester’s worth of subjects, helping them to identify what materials need further study
prior to a final exam (Kay & Lum, 2005). Some systems provide learners with feedback on
engagement as well as learning, reducing the frequency of disengaged behaviors (Walonoski
& Heffernan, 2006). These intelligent forms of feedback are still relatively uncommon within
distance education, but have the potential to increase in usage over time.
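The student models behind such skill bars are often variants of Bayesian Knowledge Tracing (Corbett & Anderson, 1995). A compact sketch of the per-response update is below; the slip, guess, and learn parameter values are illustrative defaults, not fitted estimates.

```python
def bkt_update(p_known, correct, slip=0.1, guess=0.2, learn=0.15):
    """One Bayesian Knowledge Tracing step: update P(skill is known)."""
    if correct:
        evidence = p_known * (1 - slip)                  # knew it, no slip
        p_observed = evidence + (1 - p_known) * guess    # or lucky guess
    else:
        evidence = p_known * slip                        # knew it, slipped
        p_observed = evidence + (1 - p_known) * (1 - guess)
    posterior = evidence / p_observed
    # The student may also have just learned the skill on this step.
    return posterior + (1 - posterior) * learn

p_mastery = 0.3                  # prior estimate before any observations
for outcome in [1, 1, 0, 1]:     # observed responses on this skill's steps
    p_mastery = bkt_update(p_mastery, outcome)
# p_mastery is what a skill bar for this skill would display
```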
Similarly, feedback to instructors and other university personnel has a rich history in
learning analytics. The Purdue Signals Project (discussed above) is a successful example of
how instructors can be empowered with information concerning which students are at risk
of unsuccessful outcomes, and why each student is at risk. Systems such as ASSISTments
provide more fine-grained reports that communicate to instructors which skills are generally
difficult for students (Feng & Heffernan, 2007), influencing ongoing instructional strategies.
In the context of distance education, Mazza and Dimitrova (2004) have created
visualizations for instructors that represent student knowledge of a range of skills and
participation in discussion forums. Another example is TrAVis, which visualizes for
instructors the different online behaviors each student has engaged in (May, George, &
Prévôt, 2011). These systems can be integrated with tools to support instructors, such as
systems that propose types of emails to send to learners (see Arnold, 2010).
One important consideration is that models should be validated using data relevant to their eventual use, involving similar systems and
populations. The invalid generalization of models creates the risk of inaccurate predictions
or responses.
Another important consideration is privacy. It is essential to balance the need for high-
quality longitudinal data (that enables analysis of the long-term impacts of a student
behavior or an intervention) with the necessity to protect student privacy and follow
relevant legislation. There is not currently a simple solution to the need to protect student
privacy; simply discarding all identifying information protects privacy, but at the cost of
potentially ignoring long-term negative effects from an intervention, or ignoring potential
long-term benefits.
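One pragmatic middle ground is pseudonymization: replacing identifiers with salted hashes so that a student's records can still be linked longitudinally without the raw identifier ever entering the research data set. The sketch below is illustrative only; a real deployment also needs secret management, access controls, and legal review.

```python
import hashlib

def pseudonymize(student_id, salt):
    """Deterministic salted pseudonym: the same student gets the same
    code every semester, but the raw id is never stored."""
    return hashlib.sha256((salt + student_id).encode()).hexdigest()[:12]

# The salt must be kept secret and rotated per study; this literal is
# purely illustrative.
salt = "rotate-and-protect-this-value"
fall_id = pseudonymize("jdoe42", salt)
spring_id = pseudonymize("jdoe42", salt)
# fall_id == spring_id, so long-term effects can still be analyzed even
# though "jdoe42" itself never appears in the research data set.
```

This does not solve re-identification from rich behavioral data, which is why pseudonymization is a mitigation rather than a complete answer to the privacy tension described above.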
Conclusion
Data mining and analytics have potential in distance education. In general, as with many
areas of education, distance education will be enhanced by the increasing amounts of data
now becoming available. There is potential to enhance the quality of course materials,
identify at-risk students, and provide better support both to learners and instructors. By
doing so, it may be possible to create learning experiences that offer a level of individual
personalization better than what is seen in traditional in-person courses, instead approaching
the level of personalization characteristic of one-on-one tutoring.
Application Exercises
          Name five ways educational data mining and learning analytics could help you
          design an online learning course.
          As taught in this chapter, “Research suggests that providing distance
          education students with visualizations of their progress toward completing
          competencies can lead to better outcomes.” Why do you think this is the
          case?
References
Aleven, V., Stahl, E., Schworm, S., Fischer, F., & Wallace, R. M. (2003). Help seeking and
help design in interactive learning environments. Review of Educational Research, 73(3),
277–320.
Allen, I. E., & Seaman, J. (2008). Staying the course: Online education in the United States,
2008. Needham, MA: Sloan Consortium.
Anderson, J. R., Matessa, M., & Lebiere, C. (1997). ACT-R: A theory of higher-level cognition
and its relation to visual attention. Human-Computer Interaction 12(4), 439–62.
Arroyo, I., Ferguson, K., Johns, J., Dragon, T., Meheranian, H., Fisher, D., Barto, A.,
Mahadevan, S., & Woolf, B. P. (2007, June). Repairing disengagement with non-invasive
interventions. In Proceedings of the 2007 Conference on Artificial Intelligence in Education:
Building Technology Rich Learning Contexts That Work (pp. 195–202). IOS Press.
Baker, R. S. (2014). Big data and education. New York: Teachers College, Columbia
University.
Baker, R., & Siemens, G. (2014). Educational data mining and learning analytics. In K.
Sawyer (Ed.), Cambridge handbook of the learning sciences: 2nd edition (pp. 253–274).
New York, NY: Cambridge University Press.
Baker, R. S., & Yacef, K. (2009). The state of educational data mining in 2009: A review and
future visions. Journal of Educational Data Mining 1(1), 3–17.
Baker, R. S. J. D., Corbett, A. T., Roll, I., & Koedinger, K. R. (2008). Developing a
generalizable detector of when students game the system. User Modeling and User-Adapted
Interaction 18(3), 287–314.
Ben-Naim, D., Bain, M., & Marcus, N. (2009). A user-driven and data-driven approach for
supporting teachers in reflection and adaptation of adaptive tutorials. In the Proceedings of
Educational Data Mining 2009 (pp. 21–30).
Biswas, G., Leelawong, K., Belynne, K., Viswanath, K., Schwartz, D., & Davis, J. (2004).
Developing learning by teaching environments that support self-regulated learning. In
Intelligent tutoring systems, 3220: Lecture notes in computer science, 730–40. Maceió,
Brazil: Springer.
Buckingham Shum, S., & Ferguson, R., (2012). Social learning analytics. Educational
Technology and Society 15(3), 3–26.
Bull, S., Quigley, S., & Mabbott, A. (2006). Computer-based formative assessment to
promote reflection and learner autonomy. Engineering Education: Journal of the Higher
Education Academy Engineering Subject Centre, 1(1), 8–18.
Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic analytics: A new tool for a
new era. Educause Review 42(4), 40.
Champaign, J., Colvin, K. F., Liu, A., Fredericks, C., Seaton, D., & Pritchard, D. E. (2014).
Correlating skill and improvement in 2 MOOCs with a student’s time on tasks. In
Proceedings of the First ACM Conference on Learning @ Scale Conference (pp. 11–20).
ACM.
Clow, D. (2014). Data wranglers: Human interpreters to help close the feedback loop. In
Proceedings of the Fourth International Conference on Learning Analytics and Knowledge,
LAK 2014 (pp. 49–53). New York: ACM.
Coffrin, C., Corrin, L., de Barba, P., & Kennedy, G. (2014). Visualizing patterns of student
engagement and performance in MOOCs. In Proceedings of the Fourth International
Conference on Learning Analytics and Knowledge (pp. 83–92). New York: ACM.
Corbett, A. T., & Anderson, J. R. (1995). Knowledge tracing: Modeling the acquisition of
procedural knowledge. User Modeling and User-Adapted Interaction, 4, 253–78.
Corbett, A., Kauffman, L., Maclaren, B., Wagner, A., & Jones, E. (2010). A cognitive tutor
for genetics problem solving: Learning gains and student modeling. Journal of Educational
Computing Research, 42(2), 219–39.
d’Aquin, M., & Jay, N. (2013). Interpreting data mining results with linked data for learning
analytics: Motivation, case study and directions. In Proceedings of the Third International
Conference on Learning Analytics and Knowledge, LAK 2013 (pp. 155–64). New York: ACM.
Dawson, S. (2008). A study of the relationship between student social networks and sense of
community. Educational Technology and Society 11(3), 224–38.
Dennen, V. P. (2005). From message posting to learning dialogues: Factors affecting learner
participation in asynchronous discussion. Distance Education 26(1), 127–48.
D’Mello, S. K., Craig, S. D., Witherspoon, A., McDaniel, B., & Graesser, A. (2008). Automatic
detection of learner’s affect from conversational cues. User Modeling and User Adapted
Interaction 18, 45–80.
Dyke, G., Howley, I., Adamson, D., Kumar, R., & Rosé, C. P. (2013). Towards academically
productive talk supported by conversational agents. In Productive multimodality in the
analysis of group interactions (pp. 459–76). New York: Springer US.
Feng, M., & Heffernan, N. T. (2007). Towards live informing and automatic analyzing of
student learning: Reporting in ASSISTment system. Journal of Interactive Learning
Research, 18(2), 207–30.
García, E., Romero, C., Ventura, S., & De Castro, C. (2009). An architecture for making
recommendations to courseware authors using association rule mining and collaborative
filtering. User Modeling and User-Adapted Interaction 19(1–2), 99–132.
Goldstein, P. J., & Katz, R. N. (2005). Academic analytics: The uses of management
information and technology in higher education. Educause. Retrieved from
https://net.educause.edu/ir/library/pdf/ers0508/rs/ers0508w.pdf
Grann, J., & Bushway, D. (2014). Competency map: Visualizing student learning to promote
student success. In Proceedings of the Fourth International Conference on Learning
Analytics And Knowledge (pp. 168–72). ACM.
Janicki, T., & Liegle, J. O. (2001). Development and evaluation of a framework for creating
web-based learning modules: a pedagogical and systems perspective. Journal of
Asynchronous Learning Networks 5(1), 58–84.
Jayaprakash, S. M., Moody, E. W., Lauría, E. J., Regan, J. R., & Baron, J. D. (2014). Early
alert of academically at-risk students: An open source analytics initiative. Journal of
Learning Analytics 1(1), 6–47.
Jiang, S., Warschauer, M., Williams, A. E., O’Dowd, D., & Schenke, K. (2014). Predicting
MOOC Performance with Week 1 Behavior. In Proceedings of the 7th International
Conference on Educational Data Mining (pp. 273–75).
Kay, J., & Lum, A. (2005). Exploiting readily available web data for scrutable student
models. In Proceedings of the 12th International Conference on Artificial Intelligence in
Education (pp. 338–45), Amsterdam, Netherlands: IOS Press.
Kay, J., Maisonneuve, N., Yacef, K., & Reimann, P. (2006). The big five and visualisations of
team work activity. In Proceedings of the International Conference on Intelligent Tutoring
Systems (pp. 197–206).
Kim, J., Guo, P. J., Seaton, D. T., Mitros, P., Gajos, K. Z., & Miller, R. C. (2014, March).
Understanding in-video dropouts and interaction peaks in online lecture videos. In
Proceedings of the First ACM Conference on Learning @ Scale Conference (pp. 31–40). ACM.
Kitsantas, A., & Chow, A. (2007). College students’ perceived threat and preference for
seeking help in traditional, distributed, and distance learning environments. Computers and
Education 48(3), 383–95.
Kizilcec, R. F., Piech, C., & Schneider, E. (2013). Deconstructing disengagement: Analyzing
learner subpopulations in massive open online courses. In Proceedings of the Third
International Conference on Learning Analytics and Knowledge (pp. 170–79). ACM.
Knoke, D., & Yang, S. (eds.). (2008). Social network analysis (vol. 154), 2nd Ed. Thousand
Oaks, CA: Sage.
Koedinger, K. R., Anderson, J. R., Hadley, W. H., & Mark, M. A. (1997). Intelligent tutoring
goes to school in the big city. International Journal of Artificial Intelligence in Education 8,
30–43.
Koedinger, K. R., Baker, R. S. J. D., Cunningham, K., Skogsholm, A., Leber, B., & Stamper, J.
(2010). A data repository for the EDM community: The PSLC DataShop. In C. Romero, S.
Ventura, M. Pechenizkiy, & R. S. J. D. Baker (Eds.), Handbook of educational data
mining (pp. 43–56). Boca Raton, FL: CRC Press.
Kovacic, Z. (2010). Early prediction of student success: Mining students’ enrollment data. In
Proceedings of Informing Science and IT Education Conference (InSITE) 2010 (pp. 647–65).
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning
system” for educators: A proof of concept. Computers and Education, 54(2), 588–99.
May, M., George, S., & Prévôt, P. (2011). TrAVis to enhance online tutoring and learning
activities: Real-time visualization of students tracking data. Interactive Technology and
Smart Education 8(1), 52–69.
Mazza, R., & Dimitrova, V. (2004, May). Visualising student tracking data to support
instructors in web-based distance education. In Proceedings of the 13th International World
Wide Web Conference on Alternate Track Papers and Posters (pp. 154–61). ACM.
Ming, N. C., & Ming, V. L. (2012). Predicting student outcomes from unstructured data. In
Proceedings of the 2nd International Workshop on Personalization Approaches in Learning
Environments (pp. 11–16).
Mitrovic, A., & Ohlsson, S. (1999). Evaluation of a constraint-based tutor for a database.
International Journal of Artificial Intelligence in Education, 10, 238–56.
Ocumpaugh, J., Baker, R., Gowda, S., Heffernan, N., & Heffernan, C. (2014) Population
validity for educational data mining models: A case study in affect detection. British Journal
of Educational Technology, 45(3), 487–501.
O’Malley, J., & McCraw, H. (1999). Students’ perceptions of distance learning, online
learning and the traditional classroom. Online Journal of Distance Learning Administration,
2(4).
O’Neill, K., Singh, G., & O’Donoghue, J. (2004). Implementing elearning programmes for
higher education: A review of the literature. Journal of Information Technology Education
Research, 3(1), 313–23.
Palazuelos, C., García-Saiz, D., & Zorrilla, M. (2013). Social network analysis and data
mining: An application to the e-learning context. In J.-S. Pan, S.-M. Chen, & N.-T. Nguyen
(Eds.), Computational collective intelligence: Technologies and applications (pp. 651–60).
Berlin and Heidelberg: Springer.
Paquette, L., de Carvalho, A. M. J. A., Baker, R. S., & Ocumpaugh, J. (2014). Reengineering
the feature distillation process: A case study in the detection of gaming the system. In
Proceedings of the 7th International Conference on Educational Data Mining (pp. 284–87).
Patel, C., & Patel, T. (2005). Exploring a joint model of conventional and online learning
systems. E-Service Journal, 4(2), 27–46.
Pavlik, P. I., & Anderson, J. R. (2008). Using a model to compute the optimal schedule of
practice. Journal of Experimental Psychology Applied, 14(2), 101.
Perera, D., Kay, J., Koprinska, I., Yacef, K., & Zaiane, O. R. (2009). Clustering and sequential
pattern mining of online collaborative learning data. IEEE Transactions on Knowledge and
Data Engineering, 21(6), 759–72.
Rabbany, R., Takaffoli, M., & Zaïane, O. R. (2011). Analyzing participation of students in
online courses using social network analysis techniques. In Proceedings of Educational Data
Mining (pp. 21–30).
Romero, C., & Ventura, S. (2007). Educational data mining: A survey from 1995 to 2005.
Expert Systems with Applications, 33(1), 135–46.
Romero, C., Romero, J. R., & Ventura, S. (2013). A survey on pre-processing educational
data. In Educational data mining (pp. 29–64). Berlin: Springer International Publishing.
San Pedro, M. O. Z., Baker, R. S. J. D., Bowers, A. J., & Heffernan, N.T. (2013). Predicting
college enrollment from student interaction with an intelligent tutoring system in middle
school. In Proceedings of the 6th International Conference on Educational Data Mining (pp.
177–84).
Sao Pedro, M., Baker, R. S. J. D., & Gobert, J. (2012). Improving construct validity yields
better models of systematic inquiry, even with less information. In Proceedings of the 20th
International Conference on User Modeling, Adaptation and Personalization (UMAP 2012),
(pp. 249–60).
Scheuer, O., & McLaren, B. M. (2012). Educational data mining. In Encyclopedia of the
Sciences of Learning (pp. 1075–79). New York: Springer US.
Suthers, D., & Rosen, D. (2011). A unified framework for multilevel analysis of distributed
learning. In Proceedings of the 1st International Conference on Learning Analytics and
Knowledge (pp. 64–74).
Veeramachaneni, K., Dernoncourt, F., Taylor, C., Pardos, Z., & O’Reilly, U. M. (2013).
Developing data standards for MOOC data science. In AIED 2013 Workshops Proceedings
(p. 17). Berlin: Springer.
Ryan S. Baker
Paul Salvador Inventado
                                            41
Editor’s Note
    Farmer, T., & West, R. E. (2016). Opportunities and challenges with digital open
    badges. Educational Technology, 56(5), 45–48. Retrieved from
    http://www.academia.edu/29863552/Opportunities_and_Challenges_with_Digital_O
    pen_Badges
In 2011, Arne Duncan, Secretary of the U.S. Department of Education, gave a speech at the
MacArthur Foundation Digital Media and Lifelong Learning Competition and detailed the
need to establish certifications of achievement recognizing informal learning experiences.
He said, “Today’s technology-enabled, information-rich, deeply interconnected world means
learning not only can—but should—happen anywhere, anytime. We need to recognize these
experiences” (Duncan, 2011, para. 14). Informal learning settings such as web-based and
blended learning environments, after-school and extracurricular activities, and vocational
and work-based training programs are becoming increasingly prevalent. However,
participants in these environments have difficulty being recognized for the competencies
they develop.
This inability to recognize learning in informal contexts is one of many concerns with
traditional assessing approaches. A second concern is that traditional credentials are not
always effective communicators of a student’s skill or knowledge. When a student is given
an “A” at the conclusion of a course, what does that grade symbolize? How easy is it for a
student, parent, or teacher to look inside that grade to discern the specific competencies
acquired by a particular student? On a larger scale, how easy is it for a potential employer
to analyze the degree and GPA of a prospective employee and understand the full range of
that prospect’s skills and competencies? Such indicators fail to provide a transparent
picture of an individual’s experience and qualifications.
These two challenges, how to recognize and reward informal learning and how to increase
transparency in traditional grading practices, are credentialing problems begging for a
solution. In the last several years, advances in the field of microcredentialing, specifically
digital badging, have shown promise in addressing these assessment challenges.
Open badges are a unique type of digital badge with additional affordances built into the
technology that allow for the credential to be integrated into any compatible learning or
portfolio system. While some digital badges are useful indicators of learning within a closed
system (e.g. Khan Academy, Duolingo), open badges can be exported into open backpacks
that collect and display these microcredentials from many different formal and informal
learning systems.
Because of their digital and open affordances, open badges can also serve a variety of
functions, including as a map of learning pathways or trajectories (Bowen & Thomas, 2014;
Newby, Wright, Besser, & Beese, 2015; Gamrat & Zimmerman, 2015), “descriptions of
merit” (Rughinis & Matei, 2013), signposts of past and future learning (Rughinis & Matei,
2013), a reward or status symbol (Newby et al., 2015), promoters of motivation and self-
regulation (Newby et al., 2015; Randall, Harrison, & West, 2013), “tokens of
accomplishment” (O’Byrne, Schenke, Willis, & Hickey, 2015), a learning portfolio or
repository (Gamrat et al., 2014), and a goal-setting support (Gamrat & Zimmerman, 2015).
Additionally, as badges increase learner autonomy and choice, they can also improve how
we guide and scaffold students to new, engaging, and personalized learning experiences
that are relevant to their preferences, abilities, and aptitudes. Indeed, Green, Facer, Rudd,
Dillon, and Humphreys (2005) argued that there were four key aspects of personalized
learning through digital technologies, including giving learners choices, recognizing
different forms of skills and knowledge, and learner-focused assessment. Open badges
address these key attributes of personalized learning by increasing learning options,
assessing discrete skills at a micro level, and credentialing learning both within and without
traditional formal institutions. These badges can then be organized into learning paths that
provide guidance to learners in particular domains. One example is Code School
(https://edtechbooks.org/-jc), which uses paths to direct students through micro-learning
activities within certain areas. In this way, badges help scaffold students in taking
ownership of their learning process.
Digital badges not only illuminate the learning pathways for future learning, but can also
recognize learning experiences that previously have not been easily acknowledged through
a credential. By design, badges are microcredentials that display discrete learning
competencies along with relevant data. Mehta, Hull, Young, and Stoller (2013) suggested that
this could potentially offer a solution for the medical training profession by helping medical
students gain important competencies while staying current in their learning. They suggested
that medical students could earn a badge for a specific procedure, test, or even medical
explanation. That badge would be displayed on the learner’s profile and would reflect their
learning across a variety of settings. Additionally, each badge could include an expiration
date that would ensure that medical professionals were current in their training, a feature
that has also been suggested for other domains such as teacher education (Randall et al.,
2013).
The attention received by digital badges is increasing due to examples of successful badging
programs in secondary and higher education environments. Teacher Learning Journeys
(TLJ), developed through a partnership among Penn State University, NASA, and the
National Science Teachers Association (NSTA), provides an example of a successful badging
program for in-service teachers. The partners worked together to create 63 professional
development activities as part of the TLJ for each teacher. Teachers
were asked to browse the various activities and plan which activities they wanted to
participate in to develop their teaching abilities. Additionally, teachers were offered two
levels of competencies for each activity: badges and stamps (a lower achievement). Through
a careful case study of program participants in TLJ, researchers discovered that the badging
structure provided learning pathways that allowed teachers to self-regulate their
professional development and learning. Teachers were given options of various content
badges, and could choose the level of performance they wanted to develop within the
desired content. This program included the principle of self-regulation, an important
characteristic in establishing higher levels of motivation (Pink, 2011).
Purdue University’s badging system, known as Passport, allows faculty members to create,
design, and issue their own badges in support of all learning (Bowen & Thomas, 2014).
Passport has been a successful tool in establishing badges for intercultural learning courses,
educational technology courses, and even for LinkedIn proficiencies through the university’s
career center. By enabling faculty members to become badge creators, Purdue is
encouraging the development of an assessment culture based on transparency, competency,
and recognition.
Institutions of higher learning are not the only organizations experimenting with open
badges. Primary and secondary schools are also beginning to implement badging systems to
motivate, direct, and recognize student learning. The MOUSE Squad, an organization aimed
at helping disadvantaged students, utilizes badges to motivate, assess, and recognize
student learning both in school and in after-school programs. A case study of the program
outlined the successful experience of a young girl named Zainab who immigrated to the
United States from Nigeria at age 12. Through engaging in the MOUSE program, Zainab
gained technological skills in a social collaborative experience to create a device for the
visually impaired that would alert them when food was placed on their plate. The skills and
competencies developed by Zainab were represented as badges on her college application
and helped her earn a full scholarship to the University of Virginia (O’Byrne et al., 2015).
Badges can recognize learning beyond the physical walls of an organization as well as
beyond the typical organizational schedule. One leader in the area of digital badges is
Khan Academy, although its badges are not open and compliant with the Open Badge
Infrastructure. Alongside its course content, Khan Academy uses a digital badge structure
that acts as a pathway for future learning as well as recognition of skills and
competencies previously developed. Beyond concrete content skills, Khan Academy is
notable for its collection of badges issued for “soft skills” such as listening, persistence, and
habit formation (“Badges,” 2015)—an idea that may begin to spread to open badge systems
as well.
With so many institutions experimenting with badging systems, it is possible that the flood
of badges is undermining the efforts to use badges as an effective assessment tool. In their
assessment of badges, West and Randall (2016) hypothesized that unless the badging
community can show how badges can be a rigorous and meaningful assessment tool, the
idea of badges will fade away without making any difference in the educational
environment. This flood of badges, particularly “lightweight” badges, can clutter the
badging landscape and hinder the ability of end users (e.g., employers and academic
institutions) to determine the value and quality of badges. Therefore, the responsibility
of the badging community is to create and issue badges that are rigorous and meaningful.
Another challenge to open badges is the struggle to be recognized outside of their native
badging ecosystem. In badging, an ecosystem is made up of badge developers, earners,
issuers, and end users that interact with each other to learn, display, and recognize
competencies. Ecosystems can be local in nature, where badges are intended to be used
within an individual’s learning space, or global where badges are designed to be displayed
and recognized beyond the institution’s community. While both badging ecosystems can
serve an important purpose, creating a global badging ecosystem requires organizations
outside the institution to recognize and accept the badge performance and assessment. This
recognition is difficult to achieve among institutions whose standards, requirements, and
objectives often do not align. However, because of the portability of the open badge
technology, it is possible for like-minded institutions of learning to form consortiums where
badges could hold value with peer institutions within the consortium. Professional
organizations with a vested interest in those skills might consider endorsing these badges to
give them increased weight and importance (Ma, 2015).
Much like any start-up organization trying to enter into a new market, new ideas, such as
open badges, require brand awareness by consumers to begin gaining cultural acceptance.
Generally speaking, consumers must be made aware through positive interactions with a
product or idea before they are willing to embrace it. Although open badges are becoming
more common in work and educational settings, a lack of awareness about badges persists.
Decision makers in government, business, and education appear to be generally unaware of
the potential of badges to motivate, direct, and recognize learning.
The inability of badges to be diffused and implemented into a wider educational context may
be due to a larger struggle between traditional and competency-based grading. A
competency approach demands mastery of content and allows the variables of time,
resources, and location of learning to vary (Reigeluth & Garfinkle, 1994). Traditional
approaches to assessment allow students’ learning to vary while keeping other variables
constant. Open badges can be
used in a competency approach to assessment that encourages students to redo and rework
problems until they have mastered the skill and fulfilled the requirements for the badge.
Conclusion
The inability to effectively recognize informal and formal learning competencies in
traditional business and educational contexts begs for new ways of assessment and new
forms of credentials. Well-designed digital open badging systems offer potential solutions.
While badges are becoming increasingly common, proponents of widespread adoption of
badges face difficult challenges in creating common norms around the scope for badges and
the learning they represent, how to successfully build badge awareness and credibility that
extends beyond institutional boundaries, and how to effectively navigate the shift to more
competency-based styles of assessment. What is needed for an innovation like open badges
to be successful, at this stage, are additional examples of effective badging practices, along
with rigorous research into the principles of quality badging. Scholars could study how
teachers, learners, and organizations have implemented open badging successfully, and
what challenges they have faced. Other research could investigate how to increase
awareness and acceptance of badge credentials, the most effective scope and granularity for
effective badges, how badges may or may not contribute to effective e-portfolios and
overcome the challenges these portfolios have traditionally faced, how to effectively scale
and manage badging systems, and how badges may contribute to enhanced motivation and
self-regulation. By exploring these and other issues, we can better determine whether open
badges are another technological fad, or a potentially disruptive innovation.
Application Exercises
          What are two informal learning experiences you have participated in that
          could be assessed with an open badge?
          Think of a skill you would like to learn. Then, look for different resources that
          offer badges in that skill. Compare the resources, and pick one that you
          would prefer to use. Explain your choice.
          The authors list several challenges to spreading the use of badges more fully.
          Choose one of those barriers and share some strategies you think would help
          address that concern.
References
Badges. (n.d.). Retrieved December 14, 2015, from https://www.khanacademy.org/badges
Bowen, K., & Thomas, A. (2014). Badges: A common currency for learning. Change: The
Magazine of Higher Learning, 46(March 2015), 21–25.
http://doi.org/10.1080/00091383.2014.867206
Duncan, A. (2011, September 15). Digital badges for learning [Speech]. MacArthur
Foundation Digital Media and Lifelong Learning Competition, Hirshhorn Museum,
Washington, D.C. Retrieved from http://www.ed.gov/news/speeches/digital-badges-learning
Gamrat, C., & Zimmerman, H. T. (2015). An online badging system supporting educators’
STEM learning. In 2nd International Workshop for Open Badges in Education.
Gamrat, C., Zimmerman, H. T., Dudek, J., & Peck, K. (2014). Personalized workplace
learning: An exploratory study on digital badging within a teacher professional development
program. British Journal of Educational Technology, 45(6), 1136–1148.
http://doi.org/10.1111/bjet.12200
Gibson, D., Ostashewski, N., Flintoff, K., Grant, S., & Knight, E. (2015). Digital badges in
education. Education and Information Technologies, 20(2), 403–410.
http://doi.org/10.1007/s10639-013-9291-7
Green, H., Facer, K., Rudd, T., Dillon, P. & Humphreys, P. (2005). Personalisation and digital
technologies. Bristol: Futurelab. Retrieved February 17, 2016, from
http://www.nfer.ac.uk/publications/FUTL59/FUTL59_home.cfm
Jovanovic, J., & Devedzic, V. (2014). Open badges: Challenges and opportunities. Advances
Ma, X. (2015, April). Evaluating the implication of open badges in an open learning
environment to higher education. In 2015 International Conference on Education Reform
and Modern Management. Atlantis Press. Retrieved February 17, 2016, from
http://www.atlantis-press.com/php/download_paper.php?id=20851
Mehta, N. B., Hull, A. L., Young, J. B., & Stoller, J. K. (2013). Just imagine. Academic
Medicine, 88(10), 1418–1423. http://doi.org/10.1097/ACM.0b013e3182a36a07
Newby, T. J., Wright, C., Besser, E., & Beese, E. (2015). Passport to creating and issuing
digital instructional badges. In D. Ifenthaler, N. Bellin-Mularski, & D. Mah (Eds.),
Foundations of digital badges and micro-credentials: Demonstrating and recognizing
knowledge and competencies. New York, NY: Springer.
O’Byrne, W. I., Schenke, K., Willis III, J. E., & Hickey, D. T. (2015). Digital badges:
Recognizing, assessing, and motivating learners in and out of school contexts. Journal of
Adolescent & Adult Literacy, 58(6), 451–454. http://doi.org/10.1002/jaal.381
Pink, D. H. (2011). Drive: The surprising truth about what motivates us. New York:
Riverhead Books.
Randall, B. D., Harrison, J. B., & West, R. E. (2013). Giving credit where credit is due:
Designing open badges for a technology integration course. TechTrends, 57(6), 88–96.
Rughiniş, R., & Matei, S. (2013). Digital badges: Signposts and claims of achievement.
Communications in Computer and Information Science, 374, 84–88.
http://doi.org/10.1007/978-3-642-39476-8_18
University of California, Davis (2014). Sustainable agriculture & food systems (SA&FS):
learner-driven badges (Working Paper). Retrieved from Reconnect Learning website:
http://www.reconnectlearning.org/wp-content/uploads/2014/01/UC-Davis_case_study_final.pdf
West, R. E., & Randall, D. L. (2016). The case for rigor in open badges. In L. Muilenburg & Z.
Berge (Eds.), Digital badges in education: Trends, issues, and cases (pp. 21–29). New York,
NY: Routledge.
Suggested Citation
West, R. E. & Farmer, T. (2018). Opportunities and Challenges with Digital Open
Badges. In R. E. West, Foundations of Learning and Instructional Design
Technology: The Past, Present, and Future of Learning and Instructional Design
Technology. EdTech Books. Retrieved from
https://edtechbooks.org/lidtfoundations/opportunities_and_challenges_with_digital_
open_badges
                           Richard E. West
He tweets @richardewest, and his research can be found on Google Scholar and
his website: http://richardewest.com.
                              Tadd Farmer
         V. Becoming an LIDT Professional
Becoming an LIDT professional is more than knowing some theory and having some design
or technical skills. You must learn how to network with other professionals, engage with
professional organizations, and perhaps (hopefully!) contribute your growing insights back
to the research and design literature through publication. These chapters will seek to guide
you on this journey, as well as in establishing a moral foundation for your work as an LIDT
professional.
                                             42
Editor’s Note
    Osguthorpe, R. T., Osguthorpe, R. D., Jacob, W. J., & Davies, R. (2003). The Moral
    Dimensions of Instructional Design. Educational Technology, 43(2), 19–23.
While visiting a graduate student who was completing her internship in a training division
of a large corporation, I (the senior author) was escorted by her supervisor into their
conference room. Pointing to immense charts covering each of the four walls, she explained,
“This is our instructional design model. We’re pretty proud of it.” As I examined the charts, I
was astonished at the level of detail. Each of the major components in the ADDIE model
included layer after layer of sub-steps. Trying not to judge the model too quickly, I asked,
“So, what do you see as the major benefit of this model over a more simplified one?”
“Oh,” she responded, “following this model helps us accomplish our overall goal of zero
defects.”
Somewhat puzzled, I asked, “Zero defects? You mean the model helps you find problems in
your company’s products?”
A little stunned, I asked, “And how do you decide that a person is defective?”
Without hesitating, she explained, “Any time a trainee answers a test item incorrectly, that’s
considered a defect.”
Anyone who has worked extensively with instructional design models should not be
surprised at such an extreme application of their principles. The models are grounded on
what Jackson (1986) has called the “mimetic” tradition, which “gives a central place to the
transmission of factual and procedural knowledge from one person to another, through an
essentially imitative process” (p. 117). Mimetic instruction usually includes five steps that
are hauntingly similar to the ADDIE model: (1) test, (2) present, (3) perform/evaluate, (4)
reward correct performance/remediate incorrect performance, and (5) advance to the next
unit.
But there is another way to frame education: Jackson (1986) calls it the “transformative”
tradition. Rather than adding knowledge to a student’s brain—the goal of mimetic
instruction—transformative teaching attempts to change the student in a more fundamental
way (see Cranton, 1994). In this tradition, students change the way they see themselves in
relation to others and to the world around them. In transformative education a teacher
cares for students in their wholeness. The teacher is concerned not only with improvements
in test performance, but with improvements in character.
Teaching and teacher education have long considered their disciplines to be moral in
nature. Fenstermacher (1990) asserts that moral dimensions are always present when one
person is trying to teach another:
      What makes teaching a moral endeavor is that it is, quite centrally, human
      action undertaken in regard to other human beings. Thus, matters of what is
      fair, right, just, and virtuous are always present. When a teacher asks a student
      to share something with another student, decides between combatants in a
      schoolyard dispute; sets procedures for who will go first, second, third, and so
      on; or discusses the welfare of a student with another teacher, moral
      considerations are present. (p. 133)
So what if we replaced the word teacher with instructional designer? Because instructional
designers are usually not present when students are learning, should they be satisfied with
performance as the sole criterion for success? Can they ignore the broader, more
fundamental needs of their students—the transformative needs? To address these questions,
we will first present a case for viewing instructional design as a moral endeavor. Next we
will offer a framework for discussing the moral dimensions of the profession. Finally, we will
discuss ways the framework can be used to improve the practice of instructional design.
In each section we cite data from studies that we are currently conducting. In one study, 86
college students and 27 sixth graders reflected on and reported on their most frustrating
and most fulfilling learning experiences. In another study, in-depth interviews were
conducted with nine instructional designers, asking them to reflect on their experience in
When asked for the most common criticism of online courses, a director of evaluation at a
large center for instructional design, said, “That’s easy, students who don’t like online
courses usually say that the courses are too cook-booky.” The following student comment on
an experience with an online course reinforces this conclusion:
      In order to make the class testable, the professor focused on banalities that
      would easily fit a multiple choice format. The end result was that I
      binged/purged a lot of nonsense that I will never use in my life, rather than
      coming away with significant insights.
The course this student was evaluating clung firmly to the mimetic tradition of transferring
facts from computer screen to student, and at least for this student the course failed to
accomplish even this. So how can instructional designers avoid creating courses that do not
result in important student learning? We propose that just as the fields of teaching and
teacher education are beginning to embrace moral dimensions of their practice, so should
instructional design. Why would we make such a recommendation? For two reasons: (1)
Instructional design is as much a human endeavor as face-to-face teaching, and all human
endeavors are moral by nature, and (2) the more instructional designers focus on the higher
or deeper dimensions of learning and teaching that are ensconced in moral principles, the
more likely transformative learning will occur—both for the student and for the instructional
designer.
Before presenting our framework for the moral dimensions of instructional design, we will
explain what we mean by the word moral, or more accurately, what we do not mean. First,
we are not considering professional ethics as included in the book Instructional Design
Competencies: The Standards (Richey, Fields, & Foxon, 2001). Every worthy profession has
ethical codes of conduct. For instructional designers, these standards ensure that client and
societal needs and rights are not violated: e.g., instructional designers will not plagiarize
others’ work. Although these standards have clear moral implications, they have little to do
with the moral dimensions we refer to. Second, we are not suggesting the direct teaching of
virtues (e.g., slipping a little lesson on honesty into the online accounting course).
Our use of the word moral emphasizes neither ethical codes of conduct nor direct teaching
of virtues; rather we wish to focus on the ways in which instructional designers conduct and
view their work in relation to those who will use their instructional products. Thus the
practice of designing instructional interactions becomes a moral endeavor (see Hansen,
2001).
Conscience of craft. Green (1999) identifies five different types of personal conscience.
We will briefly describe how each conscience relates to the instructional design profession.
The conscience of craft refers to one’s desire to adhere to often unstated but overarching
standards of one’s profession. While working on a piece of sculpture, the artist strives to
meet the standards of good art. Sometimes these standards have been made explicit; other
times they are more elusive, but nonetheless powerful in directing the sculptor’s work.
Comparing her work to the most respected works of art, the sculptor constantly strives to
meet the highest standard—not because the piece will generate more profit, but because the
sculptor desires to be a good artist.
During an interview, an instructional designer who had helped develop a college course,
lamented how deadlines got in the way of quality work:
      We had a manuscript and we just started building things and we were literally
      finishing lessons the week before they were supposed to be going to the
      students. We recognized that it was just not a successful mode. In fact, I think
      only one-third of the students who took the course indicated that they would
      take another online course.
In contrast, another student in an online course completed not only all of the requirements
but contributed to the online discussion three times more often than the average for the
class. In a class of 53, students on average accessed the discussion board 152 times, while
this student accessed it 421 times. And the quality of her contributions was clearly better
than most.
The conscience of sacrifice applies equally to the designer. Is the designer totally honest
with the rest of the design team, with students who pilot the course? To what extent does
the designer act out of concern for those who will experience the instruction, as well as for
those who are working on the team?
      Okay, stop right there; you’re not being realistic. Instructional designers might
      enjoy acting on moral instincts of caring, of sacrificing, or of promise keeping, but
      they are under constant pressure to produce—to deliver a product, and you
      can’t ask them to listen to these voices of conscience when they hardly have
      time to meet with the subject matter expert.
Our response to such criticism is that we recognize the constraints that pressure designers,
as they do all educators, to ignore the deeper, more far-reaching aspects of their work. But that is actually
the point. The more one ignores these fundamentally moral aims of one’s work, the less
effective will be the resulting product. Are the voices of conscience that Green proposes too
lofty? We think not. We argue that the field needs to reach deeper and higher at the same
moment if the discipline is to continue to develop in appropriate ways.
Conscience of memory. Green speaks of the conscience of memory as a way of drawing upon one’s
past, of building on the traditions that are unique to an individual. He calls this type of
memory “rootedness.” Humans, he argues, have a powerful need to be rooted: to know
where they came from and what those before them were like. This seems to be an especially
neglected type of conscience in instructional design. Although the history of the field is
short, the past is often seen by new students as irrelevant. Some question the need to study
what instructional designers were doing before the internet was created. And yet there is
much in the history of the field to inform the present, much to propel the discipline in new
directions.
The conscience of memory also suggests that instructional designers need to draw more on
who they are as individuals. Such a stance argues for assigning instructional design projects
carefully, making sure that designers can draw upon their own unique strengths, talents,
and interests, as they design a new piece of instruction. And students need to have ways of
sharing who they are and how their own desires, goals, and experiences relate to the topic
being learned.
               Foundations of Learning and Instructional Design Technology
Similarly, the students who experience the instruction produced by a good design must be
stretched to think in new ways. As Maxine Greene (2000) has explained so eloquently,
releasing the imagination of learners is the primary aim of good education. We assert that
for instructional designers to release the imagination of others, they must be working in
ways that improve their own imagination.
To illustrate how this type of reflection might help instructional designers, we offer the
following account given by an urban district superintendent, as she described with emotion
her own experience learning to read:
      By the fifth grade, I was [still] struggling with reading, and we had the
      wonderful—though in my memory not so wonderful—SRA kits. In our class, the
      teacher put the kits in front of the room. The levels of complexity of reading
      were [designated] by color. And, of course, we knew our colors. If you were
      able, you used the brown readers, if you were not able, you used the purple
      readers. It was to my great shame . . . because I was so shy, to go to the front of
      that class and pick up the purple. It was difficult—[there was] humiliation
      associated with it. (P. Harrington, personal communication, October 2001)
Even though this superintendent was recounting her story more than 30 years later, her
recollection still carried significant emotion. Perhaps the label “purple” was like the
label “defective” used by the director of training we cited earlier. We do not believe that the
designers who created the color codes for SRA kits intentionally tried to humiliate children
any more than the director of training intentionally tried to humiliate employees. But that is
precisely the point: Instruction leads to unintended results, and without careful reflection,
those results can harm learners.
Although the SRA designers and the director of training may have reviewed performance
data, they were likely not reflecting on the more subtle moral effects of their design
decisions. And these moral effects, we assert, are more far-reaching than performance data
alone. These are the transforming effects, the effects of instruction that endure. And if
designers want to create instruction that will have positive rather than negative enduring
effects, we believe that they will need to focus on the moral dimensions. They will need to
engage more often in reflexive judgment, a kind of reflection that leads to personal
transformation for both the one who teaches and the one who learns.
References
Cranton, P. (1994). Understanding and promoting transformative learning. San Francisco:
Jossey-Bass.
Gordon, J., & Zemke, R. (2000). The attack on ISD. Training, 37(4), 42–53.
Green, T. F. (1999). Voices: The educational formation of conscience. Notre Dame, IN: Notre
Dame Press.
Greene, M. (2000). Releasing the imagination: Essays on education, the arts, and social
change. San Francisco: Jossey-Bass.
Hansen, D. T. (2001). Exploring the moral heart of teaching: Toward a teacher’s creed. New
York: Teachers College Press.
Jackson, P. W. (1986). The practice of teaching. New York: Teachers College Press.
Osguthorpe, R. J., & Osguthorpe, R. T. (2001). Teaching and learning in virtuous ways: A
framework for guided reflection in moral development. Paper presented at the annual
meeting of the American Educational Research Association, Seattle, WA.
Richey, R. C., Fields, D. C., & Foxon, M. (2001). Instructional design competencies: The
standards. Syracuse, NY: ERIC Clearinghouse on Information and Technology.
Suggested Citation
Osguthorpe, R. T., Osguthorpe, R., Jacob, W. J., & Davies, R. S. (2018). The Moral
Dimensions of Instructional Design. In R. E. West (Ed.), Foundations of Learning and
Instructional Design Technology: The Past, Present, and Future of Learning and
Instructional Design Technology. EdTech Books. Retrieved from
https://edtechbooks.org/lidtfoundations/instructional_design_moral_dimensions
                     Russell T. Osguthorpe
Dr. Russell T. Osguthorpe is a past director of the Center for Teaching and
Learning at Brigham Young University (BYU). He received his PhD from BYU in
instructional psychology. He served on the faculty of the National Technical
Institute for the Deaf in Rochester, New York, before joining BYU. He has
authored five books and more than 50 journal articles on instructional design,
teacher education, and special education.
                      Richard Osguthorpe
Dr. Richard Osguthorpe is the dean of the College of Education at Boise State
University. Prior to serving as dean, he was with Boise State for 10 years,
serving as the chair of the Department of Curriculum, Instruction, and
Foundational Studies for two of those years. He co-edited the book The Moral
Work of Teaching and Teacher Education: Preparing and Supporting
Practitioners, which was included in the U.K. Times Higher Education suggested
reading list for 2013.
                           W. James Jacob
Dr. W. James Jacob is the director of the Institute for International Studies in
Education at the University of Pittsburgh, and he serves as an associate
professor in their School of Education. His primary focus is to further
comparative, international, and development education (CIDE) initiatives. He is
the author of several books related to CIDE, and he is the co-editor of two book
series concerning CIDE.
                         Randall S. Davies
Editor’s Note
    The following article was originally published in TechTrends and is used here by
    permission.
    Lowenthal, P. R., Dunlap, J. C., & Stitson, P. (2016). Creating an intentional web
    presence: Strategies for every educational technology professional. TechTrends,
    60, 320–329. doi:10.1007/s11528-016-0056-1
Abstract
Educators are pushing for students, specifically graduates, to be digitally literate in order to
successfully read, write, contribute, and ultimately compete in the global marketplace.
Educational technology professionals, as a unique type of learning professional, need not
only to be digitally literate (leading and assisting teachers and students toward this goal) but
also to model the digital fluency expected of an educational technology leader. Part of this
digital fluency involves effectively managing one’s web presence. In this article, we argue
that educational technology professionals need to practice what they preach by attending to
their own web presence. We share strategies for crafting the components of a vibrant and
dynamic professional web presence, such as creating a personal website, engaging in social
networking, contributing and sharing resources/artifacts, and attending to search engine
optimization (SEO).
All of these scenarios are likely familiar. For us, the search committee vignette really
resonated. Over the years we have been on various search committees, and two things
seemed to happen with every search: dozens of applicants met the minimum qualifications,
but very few excited the search committee. When deciding whom to interview,
members of the search committee often turned to Google. Our experience, though, is not
unique. Research shows that employers regularly use the Internet to screen applicants
(Davison et al. 2012; Reicher 2013; Stoughton et al. 2013). But unlike in the past, when
employers might screen applicants only to check for a reason not to hire someone, a
growing number of employers screen applicants to find a reason why they should hire
someone. For instance, a growing number of employers are simply looking for validation
that an applicant is the professional that he or she claims to be (which Joyce 2014a refers to
as “social proof”); that is, these employers are looking to validate information found in an
applicant’s cover letter and resume (see Driscoll 2013; Huhman 2014; Joyce 2014a, b). In
fact, a growing number of employers report that they have found reasons to hire applicants
as the result of an Internet search (see Careerbuilder.com 2006, 2009, 2012, 2014). Thus,
we believe that one of the worst things that can happen for an applicant fighting for an
interview is for a search committee to find nothing of substance about an applicant from an
Internet search. Some people even believe that an empty Internet search suggests that an
applicant is out-of-date and/or lazy, has nothing to share, or worse, has something to hide
(Joyce 2014a, b; Mathews 2014); this is especially true for applicants in technology-focused
disciplines (e.g., instructional design and technology, information technology, computer
science, digital and graphic design) whose web presence also serves as reflections of their
technology skills and dispositions.
For these reasons, intentionally creating a well-crafted web presence, and corresponding
digital footprint, is important not only for recent graduates but for any professional in a
community of practice that values technology use and innovation. In this article, we share
our thoughts as to why educational technology professionals need to attend to their web
presence and suggest a variety of ways in which they can begin crafting their online
presence and intentionally shaping their digital footprints.
Background
Despite an ongoing tension over the years about the role of technology (see Lowenthal and
Wilson 2010), the field of Educational Technology today is focused in large part on
technology (e.g., digital learning, online learning, mobile learning, social networking and
media). Further, reflecting how integrated and indispensable the Internet and social
networks/media are in our lives as tools and spaces for information curation and
communication (Fallows 2005; Yamamoto and Tanaka 2011), members of the field increasingly use
technology to connect, collaborate, and grow in social networks. Therefore, professionals in
our field can no longer resist technology. Educational technology professionals must have a
web presence in order to actively participate in the social discourse; compete with
colleagues for positions and work; establish working and collaborative relationships with
colleagues, clients, and stakeholders; and stay current in an ever-changing discipline.
Educational technology professionals do not need to possess highly technical skills and
abilities, but they must be digitally literate leaders who openly model their digital fluency
and use it as a platform for creative practice and innovation. Being digitally literate and a
member of a professional community of practice involves effectively managing one’s web
presence (see Sheninger 2014).
Digital Literacy
Literacy is more than simply being able to read and write (Colombi and Schleppegrell 2002;
Street 1995). Literacy today, as Koltay (2011) explained, involves “visual, electronic, and
digital forms of expression and communication” (p. 214); this digital literacy includes a
robust knowledge of the affordances and limitations of digital tools and strategies to address
goals and needs in a variety of settings and contexts, plus the skill-set and disposition
necessary for critical thinking, social engagement, and innovation (Fraser 2012). Digital
literacy is much more than simply knowing how to use a computer or send a text message; a
digitally literate professional is able to “adapt to new and emerging technologies quickly
and pick up easily new semiotic language for communication as they arise” by embracing
“technical, cognitive and social-emotional perspectives of learning with digital technologies,
both online and offline” (Ng 2012, p. 1066). Graduates are now expected to be digitally
literate as they enter the workforce (Jones and Flannigan 2006; Weiner 2011). As such,
educators now have an added responsibility to help develop students’ digital literacy
throughout their formal education (Van Ouytsel et al. 2014; see related literature on digital
citizenship such as ISTE 2014; Hollandsworth et al. 2011; Ohler 2011). Educational
technology professionals, as a distinct type of educational professional, must not only be
digitally literate but also model their digital fluency, which in turn requires an advanced
understanding of how people interact online, as well as varying digital-literacy skills.
An important, foundational aspect of being digitally literate involves being aware of and
managing one’s digital footprint. A digital footprint, according to Hewson (2013), “outlines a
person’s online activities, including their use of social networking platforms” (p. 14). A
digital footprint is therefore created whenever we use networked technology. However,
when left unattended, a digital footprint may fail to reflect what we want it to reflect about
ourselves professionally, emphasizing only our personal interactions and activities. For
example, during a recent faculty development workshop facilitated by the second author, a
group of faculty were surprised that their personal Facebook pages, Pinterest boards,
and/or Flickr photo collections came up on an Internet search before any professional
content. If professional content did come up on the first page of their search, the associated
pages were ones over which they had little direct control (e.g., their university faculty pages
and their “Rate My Professor” entry). Each of these faculty members had what is sometimes described
as a digital shadow (Goodier and Czerniewicz 2015) or a passive digital footprint: a digital
footprint “that grows with no deliberate intervention from an individual” (Madden et al.
2007, p. 3).
Building one’s web presence (sometimes also referred to as “brand” or online “reputation”)
and actively monitoring and intentionally shaping one’s digital footprint is a popular topic
these days (see Lowenthal and Dunlap 2012; Croxall 2014; Eyre et al. 2014; Goodier and
Czerniewicz 2015; Microsoft n.d.). While very little formal research has been conducted to
date on the benefits of a well-crafted web presence, people from various
fields—such as career planning (Tucker 2014), librarianship (Von Drasek 2011), and the
medical profession (Carroll and Ramachandran 2014; Greysen et al. 2010), to name a
few—are talking about the importance of professionals taking control over their digital
footprints by actively managing their web presence and therefore influencing the story that
the Internet has to tell about them.
Below we outline some common strategies to create an intentional web presence in order to
participate in, contribute to, and benefit from the larger professional community of practice.
The strategies we cover include creating a personal website, engaging in social networking,
contributing and sharing resources/artifacts, and attending to search engine optimization
(SEO). These strategies are based on our previous work and experience working with
faculty and students to establish an intentional web presence (Dunlap and Lowenthal 2009a,
2009b, 2011; Lowenthal and Dunlap 2012), but are also supported by the work of others
(e.g., Bozarth 2013, 2014; Goodier and Czerniewicz 2015; Posner et al. 2011; Sheninger
2014; Weller 2011).
The first step in creating a web presence is establishing a base camp—a place that serves as
a centralized hub of operation for all digital and online activity (see Marshall 2015;
Sheninger 2014). While many professionals might have a personal webpage or even a
multipage website on their employers’ servers, we recommend that educational technology
professionals set up personally controlled websites that are separate from employer-
sponsored sites. A personally controlled website is one that is under the full purview of the
individual whose work the site is showcasing; it is also a website that will persist over time
regardless of changes in employment, as well as help with search engine optimization,
which will be discussed later in this article (see Corbyn 2010 for an in-depth discussion
of the value of having a personally controlled website). A growing number of easy-to-use
tools are available for creating professional-looking websites (e.g., Wix, Weebly, Google
Sites, WordPress) for people without web development expertise.
A personally controlled website is also different from an ePortfolio created during, and as a
culminating comprehensive assessment in, a postsecondary program. The ePortfolios created
in university programs often include formative assessments of students’ progress during
Using a personally controlled website as a base camp for professional activity conducted
online addresses several important web-presence goals:
     When participating in professional learning and sharing using social media and
     networking tools, it is helpful to have one central place to host and promote all
     professional activity.
     Having a base camp gives professionals a web presence that is under their control to
     ensure consistency and reliability over time; the professionals determine how they are
     presented professionally online, and when work and ideas are publicly shared.
     As professionals participate in social media and networking sites, they need a place to
     direct people to find out more about them and their work, and to stay connected. A
     base camp can help professionals accomplish this linkage.
     Having a base camp that allows professionals to post work and ideas (via blogging, for
     example) increases their ability to create and share content with others.
Your base camp represents where you are today. It states who you are, where you come
from, and what your strongest skill sets are. If you are a contractor, a personal website
establishes your relevance to the niche you work in. If you have a secure position, a
personally controlled website can be an asset to establishing your status as a thought leader
and valuable team member within your organization. In either situation, this online
transparency inherently states that you have confidence in your own skill set, which in turn
carries weight in many situations. We have found, though, that viewing your base camp as a
static website is unrealistic. You should plan to update the website once every 6 months at
minimum, whether you are in a secure job or not. Based on our experience working with
others to create web presence and reviewing personally controlled websites of other
professionals, a personally controlled website may include elements such as:
Your personally controlled website is your business card, your résumé, and so much more.
In this sense, we believe that the look and feel of your personal website matters. You want
to communicate to others that design and details matter—competencies that appear in
position descriptions and employment announcements for educational technology
professionals (Martin and Winzler 2008; Ritzhaupt et al. 2010). Therefore, you should
purchase a personal domain name for your site and strive to avoid common templates
that are widely used online. In our experience, common templates fail to
highlight one’s personality or creativity; they can also feel dated over time and can
undermine credibility as a digitally literate professional because they fail to illustrate design
expertise. Also, when selecting a template, it is important to consider mobile friendliness as
many professionals use their mobile devices to access online content. Here are a few
examples of personally-controlled websites of educational technology professionals that we
believe are aesthetically pleasing while still meeting web-presence goals:
           Facebook: With over 800 million active users, Facebook in many ways is the
           social network. While Facebook remains a primarily “personal” social network
           where people connect with friends and family, educational technology
           professionals might interact with dozens of Facebook groups. See Table 1 for
           examples of Educational Technology Facebook Groups. You can search
           Facebook for other groups that might better align with your professional
           interests.
     LinkedIn: LinkedIn is often seen as a primarily professional space. Your LinkedIn site
     allows you to share the details of your professional status without muddying up your
      base camp with such details. While it is arguably the driest and most professional location
      of your online presence, many people feel more comfortable participating on this site. In
     addition, countless groups where educational technology professionals interact are
     available. Table 1 includes three examples of LinkedIn Groups. There are dozens of
     other groups to choose from. You can begin searching here:
     https://edtechbooks.org/-qj
     Twitter: Although Twitter may be seen as restrictive, given its 140-character-per-post
     limitation, Twitter offers educational technology professionals something Facebook
     and LinkedIn do not: an ability to follow someone without that person following you
     back. Further, Twitter enables professionals to carefully craft a diverse social network
     that might include professionals in related fields but who would not show up as
      members of specific Facebook or LinkedIn groups. In addition, Twitter’s hashtagging
     functionality is often used to support backchannel conversations between participants
     during conferences and other larger scale events, making it a valuable communication
      and collaboration tool. Table 1 includes a resource that lists over 100 tweet chats.
     You can discover additional ones over time on Twitter.
    Not sure where to begin on Twitter? Start by creating an account and following the
    tweets of professionals with similar interests. You can begin searching at
    https://edtechbooks.org/-jT. The following are some educational technology
    professionals who are very active on Twitter:
    Another strategy is to follow the tweets of your colleagues, notable scholars and
    authors who have influenced your thinking and work, professional organizations of
    which you are a member, and organizations who produce tools and technologies
     you use on a regular basis. In this way, you will more quickly experience the value
    of Twitter in support of your professional learning and work.
It is important to point out, though, that using social networking for professional purposes
does not come naturally for everyone. For instance, some teachers face tensions using social
networking for professional purposes (Kimmons and Veletsianos 2014, 2015) and others
report the need for additional training and support (Joosten et al. 2013). With this in mind,
some educational technology professionals strive to keep a clean separation between their
personal and professional identities. If you wish to establish separate personal and
professional social-networking accounts, you can use pseudonyms or generic handles (e.g.,
EdTech-Bob) to keep them separate. The disadvantage, though, of this approach is that it
could diminish, if only a little, the web-presence goal of establishing yourself—under your
name—as an active educational technology professional and thought leader.
Developing a web presence is not simply about having a website and only connecting with
others online. Your web presence should both be strengthened by, and extend and elaborate on,
your overall engagement with the larger professional community of practice in face-to-face
settings. Whenever possible, educational technology professionals should network face-to-
face with other professionals in the field at conferences and workshops and through local
chapters of national/international professional organizations. In other words, we have found
that networking is not simply an online or face-to-face activity but rather an activity that
should take advantage of and leverage the affordances of both types of networking because
both enhance your professional presence.
labor—the fine work they have produced that others may benefit from as well (see Tapscott
2012, for more on the value of sharing as a principle of openness in an open world). Shared
resources may include white papers, application recommendations, program evaluations,
reports of pilot studies, teaching and training materials, and creative works. These
resources can be shared online via social media sharing sites such as YouTube,
TeacherTube, SlideShare, Flickr, and even Amazon (e.g., through self-publishing as well as
book and product reviews). Social media sharing sites offer an opportunity to share your
expertise with a wider audience. Alternatively, there are many non-social-media sites that
allow you to distribute your materials openly—such as Google Drive, Scribd, Box,
Dropbox, and OneDrive—if conventional social media sites do not support the format and/or
size of certain materials. If you are employed in a situation in which the work you produce is
proprietary, then an appropriate solution may be to create an executive summary describing
the work and its value, with screen shots or an excerpt if allowable.
Sharing resources and artifacts is good practice for a few reasons. First, selected artifacts
can serve as a showcase portfolio that demonstrates your skills and abilities and areas of
expertise. Second, sharing work online helps build collaborations with others. Third, sharing
work online helps you further establish your digital footprint and present a clearer, more
complete story about the work you do. Finally, via this type of sharing, you help to establish
your credibility as an educational technology professional—the multiple resources and
artifacts available allow the audience to triangulate cognitive authority, information quality,
and overall relevance and value of your contributions to the professional community of
practice (Hilligoss and Rieh 2008)—and may also enhance your employer’s credibility by
association (Metzger 2007). The following social media sharing sites are popular,
established, and full-featured, making them ideal for professional resource and artifact
sharing:
Content curation of others’ work is a key facet of professional web presence and can help
you find your professional community of practice. Through curating the work of others, not
only do you develop relationships with others, but you become a player in solving larger
problems; you show that you are continuously learning and ever improving your skills. This
transparency will help others realize your worth. And, of course, having access to others’
fine resources and artifacts can be helpful in your own work! Here are a few tools that you
can use to start publicly curating content:
Search Engine Optimization (SEO) is the process of improving the ability to locate and
access work online from a specific set of search terms. SEO is one final but necessary
component to crafting your web presence and intentionally shaping your digital footprint
(Lowenthal and Dunlap 2012). Educational technology professionals need to improve the
accessibility and reach of the work they share online by thinking about how people will find
said work via an Internet search, and then making modifications to how work is presented
online to increase the likelihood of others finding it online. This is an important aspect of
web presence because—let’s face it—for individuals who rely on the Internet for
professional learning and networking, if search engines like Google cannot find your work
then it is inaccessible and does not fully contribute to the professional community of
practice or enhance your web presence.
The most important rule of SEO is to create and share quality content: content that others
find valuable and want to read and use. Creating and sharing similar and consistent content
also helps boost your SEO. Thus,
as an educational technology professional, you need to think about what you want to share
and what you want to be known for. Then, carefully consider where you share your content
as well as how you name and tag it. Some websites get more traffic than others, and generally
the more visitors a site receives, the better for SEO and web presence. For instance, commercial
websites like YouTube (100+ million monthly visits) and SlideShare (1.75 million monthly
visits) get much more traffic than OER sites like MIT’s OpenCourseWare (200,000 monthly
visits) and Merlot (17,000 monthly visits) (see Weller 2011). Therefore, sharing your work
on highly trafficked sites like these can help increase the SEO and overall visibility of your
work, as can sharing the same work on multiple websites (e.g., sharing the same slide deck
on SlideShare, Academia.edu, and your personally controlled website). Most social media
and networking websites also give you some control over how you name, tag, and describe
your work. A quick Internet search for similar work from other professionals can help you
get a better idea of how best to name, describe, and tag your work.
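The naming-and-tagging advice above can even be checked mechanically. The following Python sketch (our own illustration; the class name and sample page are hypothetical, not drawn from the article) extracts the two pieces of metadata that search engines rely on most when presenting a page—its title and its description meta tag—using only the standard library.

```python
# A minimal sketch of inspecting the SEO-relevant metadata on a page:
# the <title> text and the <meta name="description"> content.
from html.parser import HTMLParser


class SeoMetadataChecker(HTMLParser):
    """Collects the <title> text and the description meta tag, if present."""

    def __init__(self):
        super().__init__()
        self.title = None
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        # Accumulate any text that appears inside the <title> element.
        if self._in_title:
            self.title = (self.title or "") + data


# A hypothetical personal-site page used only for illustration.
page = """
<html><head>
<title>Jane Doe | Educational Technology Portfolio</title>
<meta name="description" content="Instructional design projects, publications, and teaching materials.">
</head><body>Portfolio contents here.</body></html>
"""

checker = SeoMetadataChecker()
checker.feed(page)
print(checker.title)        # the title search results display
print(checker.description)  # the snippet text many engines show
```

Running a script like this against your own pages is a quick way to confirm that each page presents the name and description you intend search results to show.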
Finally, we recommend that you spend some time tracking and analyzing the analytics on
your personally controlled website (e.g., with Google Analytics) as well as on the various
social networking and social media websites you might regularly use (e.g., SlideShare and
Academia.edu) so that you are better informed about which of your works are most valued by your professional
community of practice. You should also spend time tracking topics online (e.g., with Google
Alerts or Twitter #searches) that are important to your work so that you may continue to
connect and collaborate with like-minded individuals.
Conclusion
To be a successful, lifelong educational technology professional, you need to be digitally
literate and model digital fluency in your day-to-day professional activities, including
effectively managing your web presence. The strategies shared above will help you craft the
components of a vibrant and dynamic professional web presence. However, we want to
stress that there is no one right way for educational technology professionals to establish
and maintain a web presence. As illustrated in Fig. 1 and previously discussed, you can tend
to your web presence in multiple ways. Each professional needs to craft a web presence that
is appropriate given the professional audience(s) she/he is trying to attract and connect
with, and that feels comfortable, authentic, and sustainable over time. Intentionally building a
web presence takes time and effort; the key is doing it in a way that leads to positive results
by taking control of the story the web tells about you.
References
Atkins, D. E., Brown, J. S., & Hammond, A. L. (2007). A review of the Open Educational
Resources (OER) movement: Achievements, challenges, and new opportunities. Retrieved
from https://edtechbooks.org/-au.
Bozarth, J. (2013, May). Show your work. Training and Development. Retrieved from
https://edtechbooks.org/-MZ.
Brown, J. S., & Adler, R. P. (2008). Minds on fire: open education, the long tail, and learning
2.0. EDUCAUSE Review, 43(1), 16–20.
Careerbuilder.com. (2006). One-in-four hiring managers have used Internet search engines
to screen job candidates; one-in-ten have used social networking sites, careerbuilder.com
survey finds. Retrieved from https://edtechbooks.org/-De.
Carroll, C. L., & Ramachandran, P. (2014). The intelligent use of digital tools and social
media in practice management. Chest Journal, 145(4), 896–902.
Clark, D. (2011). E-portfolios: 7 reasons why I don’t want my life in a shoebox. Donald Clark
Plan B. Retrieved from https://edtechbooks.org/-PE.
Colombi, M. C., & Schleppegrell, M. J. (2002). Theory and practice in the development of
advanced literacy. In M. J. Schleppegrell & M. C. Colombi (Eds.), Developing advanced
literacy in first and second languages: Meaning with power (pp. 1–19). Mahwah, NJ:
Lawrence Erlbaum Associates.
Corbyn, Z. (2010). All about me, dot com. Times Higher Education. Retrieved from
https://edtechbooks.org/-Yd.
Couros, A. (2010). Developing personal learning networks for open and social learning. In G.
Veletsianos (Ed.), Emerging technologies in distance education (pp. 109–128). Athabasca:
Athabasca University Press.
Croxall, B. (2014). How to overcome what scares us about our online identities. The
Chronicle of Higher Education. Retrieved from https://edtechbooks.org/-gf.
Davison, H. K., Maraist, C. C., Hamilton, R. H., & Bing, M. N. (2012). To screen or not to
screen? Using the Internet for selection decisions. Employee Responsibilities and Rights
Journal, 24(1), 1–21.
Dunlap, J. C., & Lowenthal, P. R. (2009a). Horton hears a tweet. EDUCAUSE Quarterly,
32(4).
Dunlap, J. C., & Lowenthal, P. R. (2009b). Tweeting the night away: Using Twitter to
enhance social presence. Journal of Information Systems Education, 20(2), 129–136.
Dunlap, J. C., & Lowenthal, P. R. (2011). Learning, unlearning, and relearning: Using Web
Driscoll, E. (2013). What your social media reputation says to employers. Fox Business.
Retrieved from https://edtechbooks.org/-HS.
Eyre, S., Lindsay, K., Noble, H., Edwards, A., & Paddock, A. (2014). Online presence:
Developing your presence. Retrieved from https://edtechbooks.org/-SX.
Fallows, D. (2005). Search engine users: Internet searchers are confident, satisfied and
trusting—But they are also unaware and naïve. Report for the Pew Internet and American
Life Project. Retrieved from https://edtechbooks.org/-bc.
Fraser, J. (2012). 20 ways of thinking about digital literacy in higher education. The
Guardian. Retrieved from https://edtechbooks.org/-RH.
Godin, S. (2008). Tribes: We need you to lead us. New York: Portfolio/Penguin Group.
Goodier, S., & Czerniewicz, L. (2015). Academics’ online presence guidelines: A four step
guide to taking control of your visibility (3rd ed.). Retrieved from
https://edtechbooks.org/-aR.
Greysen, S. R., Kind, T., & Chretien, K. C. (2010). Online professionalism and the mirror of
social media. Journal of General Internal Medicine, 25(11), 1227–1229.
Hargittai, E., & King, B. (2013). You need a website. Inside HigherEd. Retrieved from
https://edtechbooks.org/-cM.
Henry, A. (2012). Should I keep my personal and professional identities completely separate
online? Lifehacker. Retrieved from https://edtechbooks.org/-RJ.
Hewson, K. (2013). What size is your digital footprint? A powerful professional learning
network can give a boost to a new teaching career. Phi Delta Kappan, 94(7), 14.
Hodgkinson-Williams, C., & Gray, E. (2009). Degrees of openness: the emergence of open
education resources at the University of Cape Town. International Journal of Education and
Development using Information and Communication Technology (IJEDICT), 5(5), 101–116.
Hollandsworth, R., Dowdy, L., & Donovan, J. (2011). Digital citizenship in K-12: it takes a
village. TechTrends, 55(4), 37–47.
Huhman, H. R. (2014). 4 things employers look for when they Google you. Business Insider.
Retrieved from https://edtechbooks.org/-Jv.
Jones, B., & Flannigan, S. L. (2006). Connecting the digital dots: literacy of the 21st century.
EDUCAUSE Quarterly, 29(2), 8–10.
Joosten, T. (2012). Social media for educators: Strategies and best practices. San Francisco:
Jossey-Bass.
Joosten, T., Pasquini, L., & Harness, L. (2013). Guiding social media at our institutions.
Planning for Higher Education, 41(2), 125–135.
Joyce, S. P. (2014a). To land a job, know how employers use technology to hire. The
Huffington Post. Retrieved from https://edtechbooks.org/-wo.
Joyce, S. P. (2014b). What 80% of employers do before inviting you for an interview. The
Huffington Post. Retrieved from https://edtechbooks.org/-Fm.
Kimmons, R., & Veletsianos, G. (2014). The fragmented educator 2.0: Social networking
sites, acceptable identity fragments, and the identity constellation. Computers & Education,
72, 292–301.
Kimmons, R., & Veletsianos, G. (2015). Teacher professionalization in the age of social
networking sites. Learning, Media and Technology, 40(4), 480–501.
Koltay, T. (2011). The media and the literacies: media literacy, information literacy, digital
literacy. Media, Culture & Society, 33(2), 211–221.
Kop, R., & Hill, A. (2008). Connectivism: Learning theory of the future or vestige of the
past? International Review of Research in Open and Distance Learning, 9(3). Retrieved from
https://edtechbooks.org/-eE.
Kowalski, R. M., Giumetti, G. W., Schroeder, A. N., & Lattanner, M. R. (2014). Bullying in
the digital age: a critical review and meta-analysis of cyberbullying research among youth.
Lowenthal, P., & Wilson, B. G. (2010). Labels do matter! A critique of AECT’s redefinition of
the field. TechTrends, 54(1), 38–46.
Lowenthal, P. R., & Dunlap, J. C. (2012). Intentional web presence: Ten SEO strategies
every academic should know. EDUCAUSE Review Online. Retrieved from
http://www.educause.edu/ero/article/intentional-web-presence-10-seo-strategies-every-academic-needs-know.
Lowenthal, P., White, J. W., & Cooley, K. (2011). Remake/remodel: Using eportfolios and a
system of gates to improve student assessment and program evaluation. International
Journal of ePortfolio, 1(1), 61–70.
Madden, M., Fox, S., Smith, A., & Vitak, J. (2007). Online identity management and search
in the age of transparency. Pew Internet & American Life Project. Retrieved from
https://edtechbooks.org/-qm.
Martin, F., & Winzler, B. (2008). Multimedia competencies for instructional technologists.
Paper presented at the UNC Teaching and Learning with Technology Conference, Raleigh,
NC.
Mathews, J. (2014). Screen yourself in: 5 tips to make your online presence interview-
worthy. TalentEgg. Retrieved from https://edtechbooks.org/-os.
Metzger, M. (2007). Making sense of credibility on the web: models for evaluating online
information and recommendations for future research. Journal of the American Society for
Information Science and Technology, 58(13), 2078–2091.
Microsoft. (n.d.). Take charge of your online reputation. Microsoft. Retrieved from
http://www.microsoft.com/security/online-privacy/reputation.aspx.
Ng, W. (2012). Can we teach digital natives digital literacy? Computers & Education, 59(3),
1065–1078.
Ohler, J. (2011). Digital citizenship means character education for the digital age. Kappa
Delta Pi Record, 47, 25–27.
Pettiward, J., & O’Reilly, C. (n.d.). Get clued up! Find yourself online. Retrieved from
https://edtechbooks.org/-qb.
Poppick, S. (2014). 10 social media blunders that cost a millennial a job – or worse. Money.
Retrieved from https://edtechbooks.org/-ig.
Posner, M., Varner, S., & Croxall, B. (2011). Creating your web presence: A primer for
academics. The Chronicle of Higher Education. Retrieved from https://edtechbooks.org/-Ji.
Reicher, A. (2013). Background of our being: internet background checks in the hiring
process. The Berkeley Technology Law Journal, 28, 116–154.
Ritzhaupt, A., Martin, F., & Daniels, K. (2010). Multimedia competencies for an educational
technologist: a survey of professionals and job announcement analysis. Journal of
Educational Multimedia and Hypermedia, 19(4), 421–449.
Sheninger, E. (2014). Digital leadership: changing paradigms for changing times. Thousand
Oaks: Corwin.
Siemens, G. (2008). Learning and Knowing in Networks: Changing roles for Educators and
Designers. Retrieved from https://edtechbooks.org/-YM.
Stoughton, J. W., Thompson, L. F., & Meade, A. W. (2013). Examining applicant reactions to
the use of social networking websites in pre-employment screening. Journal of Business and
Psychology, 30(1), 73–88.
Tapscott, D. (2012). Four principles for the open world [transcript of TED Talk video].
Retrieved from https://edtechbooks.org/-WP.
Tucker, K. (2014). Personal branding in career communications. Career Planning and Adult
Development Journal, 30(2), 47–52.
Van Ouytsel, J., Walrave, M., & Ponnet, K. (2014). How schools can help their students to
strengthen their online reputations. The Clearing House: A Journal of Educational
Strategies, Issues and Ideas, 87(4), 180–185.
Veletsianos, G., & Kimmons, R. (2013). Scholars and faculty members’ lived experiences in
online social networks. The Internet and Higher Education, 16, 43–50.
Veletsianos, G., Kimmons, R., & French, K. D. (2013). Instructor experiences with a social
networking site in a higher education setting: expectations, frustrations, appropriation, and
compartmentalization. Educational Technology Research and Development, 61(2), 255–278.
Von Drasek, L. (2011). Hang in there: how to get a library job against all odds. School
Library Journal, 57(2), 24–29.
Weiner, S. (2011). Information literacy and the workforce: a review. Education Libraries,
34(2), 7–14.
Weller, M. (2011). The digital scholar: How technology is transforming scholarly practice.
New York: Bloomsbury.
Yamamoto, Y., & Tanaka, K. (2011). Enhancing credibility judgment of web search results.
CHI’11 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
(pp. 1235–1244). New York: Association for Computing Machinery.
Suggested Citation
    Lowenthal, P. R., Dunlap, J. C., & Stitson, P. (2018). Creating an Intentional Web
    Presence: Strategies for Every Educational Technology Professional. In R. E. West,
    Foundations of Learning and Instructional Design Technology: The Past, Present,
    and Future of Learning and Instructional Design Technology. EdTech Books.
    Retrieved from
    https://edtechbooks.org/lidtfoundations/creating_an_intentional_web_presence
Patrick R. Lowenthal
Joanna C. Dunlap
Patricia Stitson
                                            44
Editor’s Note
The following article was published in Educational Technology with this citation:
For information on open access journals in the field of educational technology, see Ross
Perkins and Patrick Lowenthal’s analysis of the top OA journals in the field
[https://edtechbooks.org/-ebS].
The purpose of this study was to examine (1) the academic prestige and visibility of peer-
reviewed journals within the field of educational technology, and (2) the factors influencing
an individual’s choice to publish within a specific journal. Seventy-nine educational
technology professionals responded to an online survey designed to address the
aforementioned concerns. The authors’ results suggest that educational technology
professionals generally agree that some publication venues stand out among others. In
particular, Educational Technology Research and Development, British Journal of
Educational Technology, and Computers & Education had the highest visibility and prestige
ratings of all peer-reviewed journals within the study. Additionally, the results suggest that
when one chooses to publish within a particular journal, the fit of the manuscript within the
journal, the aims and intent of the journal, and the target audience are among the most
important factors.
Introduction
Where should educational technologists publish their research articles? This is a question
that is quite common among academic circles in the field of educational technology.
Although this seems to be a trivial question at first glance, when one considers the number
of publication outlets available (59 within this study), the pressure on faculty members to
publish, and the impact of publishing on tenure and promotion, the question is no longer
trivial from a faculty member’s perspective (Hardgrave & Walstrom, 1997). Given that
publishing research articles plays an extremely important function for faculty members, and
that tenure and promotion decisions are greatly influenced by the perceived value of
publications, determining which journals to use for publication is important, especially in
light of the limited knowledge of multidisciplinary tenure and promotion committees (Bray,
2003; Carr-Chellman, 2006; Elbeck & Mandernach, 2009; Hannafin, 1991; Holcomb, Bray,
& Dorr, 2003).
Relevant Literature
Though publishing in the field of educational technology is an important topic, very little
literature has been published on the subject. In an analysis of scholarly productivity in
educational technology, Hannafin (1991) had 23 faculty members within the field identify,
classify, and rank leading educational technology journals. The study identified the five
leading basic research journals as Educational Communication and Technology Journal (now
Educational Technology Research and Development), Journal of Educational Psychology,
American Educational Research Journal, Instructional Science, and the Journal of Computer-
Based Instruction. In contrast, the leading applied journals in the field were the Journal of
Instructional Development, Educational Technology (magazine), Journal of Performance and
Instruction, Phi Delta Kappan, and TechTrends. However, this classification of basic and
applied may not be a fully accurate way to categorize these publication venues.
Holcomb, Bray, and Dorr (2003) examined 30 journals within the field of educational
technology on academic prestige, general reading, and classroom use. The research study
invited members of the Association for Educational Communications and Technology (AECT)
to respond to a survey evaluating the respective publication venues within the field. The
findings of the study showed the five overall top publication venues included Educational
Technology Research and Development, Cognition and Instruction, Educational Technology
(magazine), Journal of Research on Computing in Education (now Journal of Research on
Technology in Education), and Journal of Educational Computing Research.
Carr-Chellman (2006) examined the question of where successful emerging scholars are
most likely to publish their research. This study considered the publication records of 17
emerging scholars (pre-tenure) from 16 universities. The emerging scholars published a
total of 252 discrete papers in journals or magazines, or approximately 15 articles per
scholar in the pre-tenure period. The sample of scholars most frequently published in
Educational Technology Research and Development, TechTrends, Journal of Educational
Computing Research, Computers in Human Behavior, and the Journal of Research on
Technology in Education. The average scholar profile that emerges from these data includes
15 publications total with four or five publications in journals recognized by leaders in the
field.
The editorial section of the Australasian Journal of Educational Technology analyzed its
peer group of journals based on the Australian Research Council’s Tiers for the Australian
Ranking of Journals. Atkinson and McLoughlin (2008) divided the journals into four tiers
(A*, top 5%; A, next 15%; B, next 30%; and C, bottom 50%). The leading journals according
to their rankings include Computers & Education and the British Journal of Educational
Technology. Those classified as A journals include the Australasian Journal of Educational
Technology; Research in Learning Technology; Journal of Computer-Assisted Learning;
Australian Educational Computing; Educational Technology and Society; Journal of
Technology and Teacher Education; Technology, Pedagogy & Education; and Educational
Technology Research and Development.
Elbeck and Mandernach (2009) examined a subset of 46 journals in the field of educational
technology relating specifically to online education. In their study, they used several
measures, including journal popularity (as measured by the number of Websites that link to
the journal Website), journal importance (as measured by Google’s page rank algorithm),
and journal prestige (as measured by journal editors) to rank order the journals that are
relevant to online educators. Using their classification scheme, five journals rank at the top,
including in order International Review of Research in Open and Distance Learning, Journal
of Asynchronous Learning Networks, eLearning Papers, Innovate: Journal of Online
Education, and The American Journal of Distance Education.
Outside of these publications, we were not able to identify studies that examined the
journals within the field of educational technology. Some of the older studies include
journals that are no longer in print or have changed names (Hannafin, 1991; Holcomb, Bray
& Dorr, 2003). For instance, the Journal of Instructional Development is no longer in print
and the Journal of Computing in Teacher Education has changed its name to the Journal of
Digital Learning in Teacher Education. Elbeck and Mandernach (2009) largely based their
classification on Web-analytics and to a lesser extent on the perceptions of professionals
within the field. Atkinson and McLoughlin (2008) provide a tier system but do not explain
the basis upon which those classifications are made. Put simply, more research is
necessary to investigate publishing within the field of educational technology.
Purpose
Publishing research articles plays an extremely important function for university faculty
members. Tenure and promotion decisions are greatly influenced by the perceived value of
publications. Further, emerging scholars in the field of educational technology need
guidance on where they should publish their research articles. Therefore, the purpose of our
survey is to answer two questions:
      What are the most academically prestigious and visible peer-reviewed publication
      venues in the field of educational technology?
      What factors influence one’s choice to publish in a journal in the field of educational
      technology?
Survey Method
Participants
were visiting professors, lecturers, graduate students, or others. Those classified as other
included adjunct professors, teachers, retired professors, and program chairs. Eighty-one
percent of the sample came from respondents at doctoral-granting universities. Though the
vast majority of the respondents were from the United States, other countries were
represented in the sample, including Finland, Australia, Greece, Portugal, and Oman.
        Position          n
Professor                9
Associate Professor      18
Assistant Professor      24
Visiting Professor       2
Post-Doctoral Associate 1
Lecturer                 1
Graduate Student         16
Other                    8
Instrument
This research necessitated the development of a survey that would (1) determine the most
academically prestigious and visible publication venues in the field of educational
technology, and (2) determine the most important factors relating to the choice of
publishing in a journal in the field of educational technology. The survey was split into three
sections: (1) background information, (2) factors relating to publication choice, and (3)
journals in the field. The background information section included variables like gender,
years in the field, academic classification, ethnicity, and research interests. The research
team compiled the factors relating to publication choice based on experience and the
literature (Bray, 2003; Carr-Chellman, 2006; Elbeck & Mandernach, 2009; Hannafin, 1991;
Holcomb, Bray, & Dorr, 2003; Price & Maushak, 2000). After interviewing three educational
technology faculty members, the factors were refined. The final list included 23 unique
items. The scale was a semantic differential from (1) not important to (5) very important.
This section had more than acceptable internal consistency reliability for these data at α =
.82.
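The internal consistency figures reported here (e.g., α = .82) are Cronbach's alpha values. As a reference point, the standard formula can be computed in a few lines; the sketch below uses toy data and is not the authors' actual computation:

```python
# Cronbach's alpha (standard formula):
#   alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
# where k is the number of items on the scale.

def cronbach_alpha(responses):
    """responses: list of respondents, each a list of item ratings."""
    k = len(responses[0])          # number of items

    def variance(xs):              # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([r[i] for r in responses]) for i in range(k)]
    total_var = variance([sum(r) for r in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy data: 4 respondents rating 3 factors on the 1-5 importance scale.
ratings = [
    [5, 4, 5],
    [4, 4, 4],
    [3, 2, 3],
    [5, 5, 4],
]
print(round(cronbach_alpha(ratings), 2))  # prints 0.91
```

Values near .8 or above, like those reported for both survey sections, are conventionally read as acceptable-to-strong internal consistency.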
The journals within the field section of the survey were compiled in four steps. First, the
journals listed in the study by Holcomb, Bray, and Dorr (2003) were included. Second, we
searched the Internet for related educational technology journals that were not included
within the list. Third, we used the Cabell (Cabell, 2007) listing of educational technology
journals to supplement our list. Finally, to assure the journals were peer-reviewed, we cross
referenced all journals using UlrichsWeb Global Serials Directory (2010) or the journal
Website. The final list included 59 unique journals related to the field of educational
technology. The scale ranged from 1 to 10 with 1 = Never heard of journal, 2 = Low
academic prestige, and 10 = High academic prestige. The section demonstrated acceptable
internal consistency reliability with a Cronbach alpha at α = .96. The final complete survey
was reviewed by four educational technology faculty members for clarity and usability and
was deemed acceptable for use.
Procedures
The instrument was made accessible in a Web-based format using LimeSurvey. The
researchers made arrangements to send the survey to three educational technology
listservs: the AECT members’ listserv, the ITFORUM listserv, and the AERA Special Interest
Group on Instructional Technology member listserv. Because the survey was sent to three
different listservs with cross membership, exact response rates cannot be calculated. The
data were collected in November of 2010, and a three-week window was left open for
respondents to complete the survey. Respondents of the survey were informed that the
purpose of the research was: (1) to advance the field of educational technology by
determining the most academically prestigious and visible publication venues in the field,
and (2) to determine the most important factors relating to the choice of publishing in a
journal in the field of educational technology. Finally, the data were analyzed using
descriptive statistics.
Results
Our first research question was “What are the most academically prestigious and visible
publication venues in the field of educational technology?” We answer this question by
evaluating several different criteria related to journals within the field of educational
technology. These criteria include the journal visibility, journal prestige, open access,
impact factor scores, and the acceptance rates of the journals.
Journal Visibility
Rank                         Journal                              Visibility (%)
4      TechTrends                                                          88.61
5      Journal of Computing in Higher Education                            87.34
6      Australasian Journal of Educational Technology                      84.81
7      Journal of Distance Education                                       84.81
8      Distance Education: An International Journal                        83.54
9      Association for the Advancement of Computing in Education Journal 82.28
10     Journal of Research on Technology in Education                      82.28
Journal Prestige
Rank                         Journal                         M   SD
1      Educational Technology Research and Development 8.63 2.38
2      British Journal of Educational Technology           7.52 2.51
3      Computers & Education                               6.59 2.89
4      Distance Education: An International Journal        6.05 2.76
5      The American Journal of Distance Education          6.05 3.17
6      Journal of Research on Technology in Education      6.03 3.09
7      Journal of Computing in Higher Education            5.92 2.62
8      Journal of Distance Education                       5.84 2.73
9      Journal of Educational Technology and Society       5.75 3.03
10     Cognition and Instruction                           5.68 3.18
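Statistics like those in the tables above can be derived directly from raw survey ratings. The sketch below uses made-up journals and ratings on the study's 1–10 scale; it assumes visibility is the share of respondents who rated a journal above 1 (i.e., had heard of it) and that the prestige mean is taken over all responses, which is an interpretation rather than the authors' documented procedure:

```python
from statistics import mean, stdev

# Hypothetical ratings on the study's 1-10 scale, where 1 = never
# heard of the journal; names and numbers are made up.
ratings = {
    "Journal A": [9, 10, 8, 9, 7, 1],
    "Journal B": [1, 1, 6, 7, 8, 1],
}

for name, rs in ratings.items():
    # Visibility: percentage of respondents who had heard of the journal.
    visibility = 100 * sum(r > 1 for r in rs) / len(rs)
    print(f"{name}: visibility {visibility:.2f}%, "
          f"prestige M={mean(rs):.2f}, SD={stdev(rs):.2f}")
```

Sorting journals by the visibility percentage reproduces a table like the first; sorting by the mean rating reproduces a table like the second.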
Open Access Journals
Open access journals have grown in popularity since the emergence of the World Wide Web.
Several of the journals in the field of educational technology are now open access.
Appendix A shows 22 open access journals related to the field of educational technology.
Notably, two of the top ten journals as measured by journal prestige are open access
journals: Journal of Distance Education and Journal of Educational Technology and Society.
In general, however, it would appear that traditional closed access journals command a
higher level of prestige than do open access journals in the field of educational technology.
Acceptance Rates
Acceptance rates are also an important consideration when evaluating a journal. We have
compiled the acceptance rates of journals listed in Cabell’s directory (Cabell, 2002a; Cabell,
2002b; Cabell, 2007). The results are shown in Appendix A. It appears that the lowest
acceptance rates for our journals are somewhere in the range of 11–20%. These journals
include Association for the Advancement of Computing in Education Journal, British Journal
of Educational Technology, Cognition and Instruction, Contemporary Educational
Psychology, Educational Technology Research and Development, Informing Science,
International Journal on E-Learning, International Review of Research in Open and Distance
Learning, Journal of Educational Computing Research, Journal of Educational Multimedia
and Hypermedia, Journal of Educational Technology and Society, Journal of Interactive
Online Learning, Journal of Research on Technology in Education, and The American Journal
of Distance Education.
Impact Factor Scores
Though impact factor scores have been critiqued within the domain of education (Togia &
Tsigilis, 2006), they still remain an important factor when evaluating the relative importance
of a journal. The problem within the field of educational technology is that only a handful of
our journals have impact scores calculated. Out of the 59 journals examined within this
study, only 14 have impact factor scores. These journals and their 2010 impact factor scores
are shown in Table 4 ordered by impact factor score. As can be gleaned, Computers &
Education and British Journal of Educational Technology have the highest impact factor
scores among the impact factor scored journals. Notably, the median impact factor for the
184 journals in the subject category “Education & Educational Research” is 0.649 (Web of
Knowledge, 2012). All the journals that we categorized as educational technology are well
above that score, with the exception of the Journal of Educational Computing Research.
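For readers unfamiliar with how the figure is derived, the standard two-year impact factor for a given year divides the citations received that year to articles published in the two preceding years by the number of citable items published in those two years. A small sketch with hypothetical counts:

```python
def two_year_impact_factor(citations, citable_items, year):
    """Impact factor for `year`: citations received in `year` to items
    published in the two preceding years, divided by the number of
    citable items published in those two years."""
    prev = (year - 1, year - 2)
    return sum(citations[y] for y in prev) / sum(citable_items[y] for y in prev)

# Hypothetical journal counts, for illustration only.
citations = {2008: 90, 2009: 150}     # citations received in 2010, by source year
citable_items = {2008: 70, 2009: 50}  # articles published in each year
print(two_year_impact_factor(citations, citable_items, 2010))  # prints 2.0
```

This also illustrates one common critique: the figure says nothing about how citations are distributed across articles, so a few highly cited papers can carry an entire journal.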
Survey respondents also had the option of providing additional journals in a free-form
response. Other journals included The Journal of the Learning Sciences; Educational
Technology (magazine); International Journal of Computer-Supported Collaborative
Learning; Journal of Science Education and Technology; Educational Researcher; IEEE
Spectrum; Journal of Computers in Mathematics and Science Teaching; Technology,
Pedagogy, and Education; Learning and Leading with Technology; and Journal of Learning
Design.
Our second research question centers on “What factors influence one’s choice to publish in
a journal in the field of educational technology?” The decision to publish in a specific journal
in educational technology might be influenced by several factors. These factors are
illustrated in Table 5 along with their relative importance as rated by individuals who
responded to the survey. The items are ordered by the mean responses to the scale on the
survey. According to the respondents, the four most important factors to consider when
publishing in a journal include the fit of the manuscript in the journal, the aims and intent of
the journal, the target audience of the journal, and the language of the journal. The least
important three factors include the publication frequency of the journal, the publisher of the
journal, and the price of the journal.
                                 Factor                                  M    SD
Fit of the manuscript in the journal                                    4.66 0.62
Aims and intent of journal                                              4.54 0.62
Target audience of journal                                              4.32 0.67
Language of the journal                                                 3.85 1.18
Speed of peer-review process for the journal                            3.81 0.89
Acceptance rate of journal                                              3.76 1.06
Accessibility of journal (e.g., open access)                            3.62 1.22
Ranking of the journal                                                  3.59 1.26
Indexing of the journal (e.g., SSCI)                                    3.54 1.14
Impact factor score of journal                                          3.46 1.26
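The M and SD columns above are ordinary sample statistics over the 1–5 rating scale. As a minimal sketch of how such figures are produced (the responses below are hypothetical, not the study's data):

```python
import statistics

# Hypothetical 1-5 Likert responses from ten respondents for one factor
responses = [5, 5, 4, 5, 4, 5, 5, 4, 5, 5]

mean = statistics.mean(responses)
sd = statistics.stdev(responses)  # sample standard deviation (n - 1 denominator)

print(f"M = {mean:.2f}, SD = {sd:.2f}")
```

Whether a survey reports the sample or population standard deviation is a small design choice, but it matters when comparing tables across studies.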
Survey respondents also had the option of providing additional factors in a free-form
response. Other factors included whether or not the journal is listed in Cabell’s directories,
the quality and timeliness of feedback provided by the journal, the journal's citation style
requirements (e.g., APA), the impact of journal on practice, the journal’s credibility to the
field, and word length or submission requirements.
Discussion of Results
Interpretation of our results must be viewed within the limitations of this study. This study
was based on an online survey sent to three leading educational technology listservs.
Because of the potential for cross-listings, response rates could not be calculated. The
survey was designed from a compilation of journals that may not represent all peer-
reviewed journals within the field. For example, we failed to include The Journal of the
Learning Sciences and Journal of Computers in Mathematics and Science Teaching, which
are arguably leading publications in the field. Also, our sample only included peer-reviewed
publications, so respectable publication outlets like Educational Technology magazine were
not included by design. This limits the generalizability of the results. Also, our sample
represents primarily university faculty members, and thus, does not represent the
practitioners within our field. Finally, the results are limited to the expert judgment and
candor of the respondents.
requires further investigation; however, it might be useful to see if the factors considered
important by scholars seeking to publish their own works may be connected to their choice
of journal.
Our results also provide some helpful contextual information about what factors influence
an individual to publish in a particular journal. Respondents suggest that factors like the fit
of the manuscript in the journal, the aims and intent of the journal, the target audience of
the journal, the language of the journal, and the speed of the peer-review process of the
journal are all important factors. Much less important to the respondents was the price of
the journal, the publisher of the journal, and the publication frequency of the journal. These
results suggest that several factors influence one’s choice to publish in a journal.
For scholars attempting to better understand the value of particular publications in the field
of educational technology, such findings provide a gauge for better assessing the broader
impact an educational technology scholar’s work has in the field. This is important for those
educational technology scholars seeking tenure in departments and colleges where
educational technology scholarship may not be well understood. Survey findings also offer
the publishers of educational technology journals feedback in terms of how their market
perceives their products. Such information is still useful for an educational technology
publication’s editorial and marketing departments.
An area of further research includes a deeper investigation into the role openness plays in
an academic publication’s perceived value to the field. Given the growth and adoption of
new digital technologies and open educational resources, open academic journals provide
easy access and broader dissemination opportunities for scholars in all fields of research.
Our results suggest that open access journals can still be ranked among the most
prestigious (Journal of Distance Education and Journal of Educational Technology and
Society). However, more empirical research is necessary to confirm our findings.
Application Exercises
          The article lists many places that you can publish your research. Find a
          journal/organization and do a little research online. What is the general
          mission of the organization? What is the procedure to get published?
References
Atkinson, R., & McLoughlin, C. (2008). Editorial: Blood, sweat, and four tiers revisited.
Australasian Journal of Educational Technology, 24(4). Retrieved from
http://www.ascilite.org.au/ajet/ajet24/editorial24-4.html
Bray, K. E. (2003). Perceived value of academic journals for academic prestige, general
reading, and classroom use: A study of journals in educational and instructional technology
(unpublished doctoral dissertation). University of North Texas, Denton, TX.
Hardgrave, B. C., & Walstrom, K. A. (1997). Forums for MIS scholars. Communications of
the ACM, 40(11), 119–124.
53–57.
Price, R. V., & Maushak, N. J. (2000). Publishing in the field of educational technology:
Getting started. Educational Technology, 40(4), 47–52.
Togia, A., & Tsigilis, N. (2006). Impact factor and education journals: A critical examination
and analysis. International Journal of Educational Research, 45(6), 362–379.
International Journal of Instructional Technology and Distance Learning    Yes   4.10   68.35   –        No
Journal of Educational Technology Systems                                  No    3.82   62.03   70%      No
Journal of Online Learning and Teaching                                    Yes   3.82   64.56   45%      No
Journal of Technology Education                                            Yes   3.78   64.56   –        No
International Journal on E-Learning                                        No    3.77   62.03   11–20%   No
Educational Media International                                            No    3.64   63.29   –        No
Electronic Journal of E-Learning                                           Yes   3.59   64.56   50%      No
Online Journal of Distance Learning Administration                         Yes   3.58   64.56   30%      No
Computers in the Schools                                                   No    3.54   64.56   40–50%   No
International Journal of Instructional Media                               No    3.53   60.76   –        No
Journal of Interactive Learning Research                                   No    3.47   56.96   –        No
Learning, Media, and Technology                                            No    3.46   63.29   –        No
Journal of Technology, Learning, and Assessment                            Yes   3.34   60.76   –        No
Electronic Journal for the Integration of Technology in Education          Yes   3.25   59.49   21–30%   No
Journal of Interactive Online Learning                                     Yes   3.19   56.96   11–20%   No
Education and Information Technologies                                     No    3.10   50.63   –        No
Interdisciplinary Journal of e-Learning and Learning Objects               Yes   3.06   50.63   11–20%   No
Journal of Interactive Media in Education                                  Yes   2.97   50.63   60%      No
Turkish Online Journal of Distance Education                               Yes   2.92   53.16   60–70%   No
Turkish Online Journal of Educational Technology                           Yes   2.86   50.63   35–45%   No
Journal of Interactive Instruction Development                             No    2.71   45.57   –        No
Journal of Educators Online                                                Yes   2.63   48.10   20%      No
Journal of Instruction Delivery Systems                                    No    2.41   43.04   –        No
    To learn more about what journal reviewers are looking for as they review your
    manuscript, see this series of videos from the Journal of the Learning Sciences
    [https://edtechbooks.org/-RaK].
Suggested Citation
                        Albert D. Ritzhaupt
                     Christopher D. Sessums
                       Margeaux C. Johnson
                                            45
Editor’s Note
We argue that high-quality publication outlets demonstrate three characteristics. First, they
are rigorous, i.e., discerning, critical, and selective in their evaluations of scholarship.
Second, they have influence on others in that they are read, cited, and used. Third, by being
prestigious, they are well known to other scholars and practitioners, increasing the prestige
of the authors they publish and bringing more light and attention to their work and their
institutions. These three criteria—rigor, influence, and prestige—have the potential to
create a more holistic assessment of the value of a body of scholarly work.
Rigor
High-quality journals are rigorous, meaning they are more critical in their reviews, are more
discerning about what they will accept and publish, and apply higher standards for judging
quality research than other journals. They question all aspects of an academic study,
including theoretical foundations, participant sampling, instrumentation, data collection,
data analysis, conclusion viability, and social impact. They make decisions about the quality
of research on its own merits: i.e., through blind review by distinguished and experienced
peers and editors. Being published in a rigorous journal lends credibility and acceptance to
the research because it indicates that the author(s) have successfully persuaded expert
scholars of the merits of the article.
When evaluating the rigor of a journal, authors often consider the acceptance rate as a key
indicator. However, judgments based solely on acceptance rates must be made with care
because journals calculate their rates differently. Additionally, a lower-tier journal may
receive lower-tier quality manuscripts and accept very few of them, resulting in a low
acceptance rate but still poor quality publications. Despite these issues, the journal’s
acceptance rate may be documented as one measure of rigor. Other indicators of rigor
might include a policy of double blind peer review, the number of reviewers, and the
expertise and skill of these reviewers and the editorial board, who determine how
discerning, rigorous, and selective the journal will be.
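To make the caveat about calculation methods concrete, the sketch below contrasts two common denominators a journal might use when reporting its acceptance rate. All figures are hypothetical:

```python
# Hypothetical figures for one journal in one year
submissions = 200                              # all manuscripts received
desk_rejected = 80                             # rejected by the editor without review
sent_to_review = submissions - desk_rejected   # manuscripts sent to peer review
accepted = 30                                  # manuscripts ultimately accepted

# Rate computed against all submissions received
rate_all = accepted / submissions

# Rate computed only against manuscripts that went out for peer review
rate_reviewed = accepted / sent_to_review

print(f"{rate_all:.0%} vs. {rate_reviewed:.0%}")  # 15% vs. 25%
```

The same journal can thus report a 15% or a 25% acceptance rate depending on whether desk rejections are counted, which is why rates should be compared only when the calculation method is known.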
Editors are of primary importance, as they resolve contradictory reviews and make final
determinations of scholarship quality.
Many indicators of rigor are already documented and ought to be considered
when evaluating the quality of a publication outlet. For example, acceptance rates, review
policies, and the number of reviewers may be found on the journal’s website or through
bibliographic sources such as Cabell’s Directories (http://www.cabells.com/). It is much
harder to document the rigor of the reviewers and editors, and this is ultimately a subjective
interpretation. Like all subjective decisions, the best method of verification would be to seek
opinions of other qualified scholars in the field to confirm or deny your own.
In collecting evidence of the rigor of a publication outlet, we believe the following questions
might be useful:
      How does the acceptance rate compare with other journals in this specific discipline?
      How is the acceptance rate calculated, if known?
      What type of peer review is used? Is it editorial, blind, or double-blind? How many
      reviewers are used to make decisions?
      What is known about the quality of the reviewers and editorial board? Are they
      recognizable to other experts in the field and known for their insights into the
      research? How rigorous would outside experts believe these reviewers and editors to
      be?
Influence
Influence refers to how extensively individual manuscripts and publications are referenced
by other publications and how much they contribute to the scholarly progress of a
discipline. In this article, we are referring only to influence on research and theory
development, not on actual practice. Undoubtedly influence on practitioners is an important
quality of good scholarship, as it could be argued that true impact is only felt on the
practitioner level. However, we do not address practitioner impact, because this framework
is focused on criteria for evaluating academic research and theory publications. We can
conceive of the possibility of another framework being developed to guide the evaluation of
how much influence an academic has on actual practices, with different evidence being
presented and analyzed, but that is beyond the scope of this article.
Some non-peer-reviewed outlets have greater influence than those that are peer reviewed.
For example, publication in a widely read and cited practitioner outlet can have high
influence. In addition, a Publish or Perish search reveals that some books are
more highly cited in Google Scholar than many top journals. Thus while peer review would
be a prime indicator of the rigor of a journal, non-peer-reviewed outlets may be able to show
high influence, indicating they still have value. This also shows the need to triangulate
findings from all three criteria.
In collecting evidence of the influence of a publication outlet, we believe the following
questions might be useful:
      Is the publication indexed in ISI or Scopus? If so, what is the impact rating (ISI) or
      citation count, h-index, and SCImago Journal Ranking (Scopus)?
      What are the impact ratings according to Publish or Perish? Here we believe it is
      useful to use the same time window as that used by ISI or Scopus. So for example, if
      you typically use the 5-year ISI Impact Factor, then it would be wise to also limit your
      Publish or Perish search criteria to the last five years to retrieve comparable statistics.
      What is the open-access policy of the publication outlet? Outlets that embrace open-
      access delivery have the potential to have more influence, as the articles are more
      easily found through Internet search engines. However, the open-access nature of a
      publication outlet is only an indicator that it has potential for greater influence, not
      that it has necessarily achieved this influence.
      What is the circulation of the publication outlet? This is also only an indicator of the
      potential for influence, as many journals are packaged and sold as bundles to libraries,
      increasing circulation but not necessarily influence. However, greater circulation does
      indicate the potential for higher viewership and greater influence.
      Is there any indication that the publication has influence on other scholars? For
      example, is the book widely adopted as a text for university courses? Is there evidence
      that the journal is frequently used to influence policy or other research?
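One of the metrics named above, the h-index, has a simple definition: the largest h such that at least h of an outlet's papers have at least h citations each. A minimal sketch (the citation counts are hypothetical):

```python
def h_index(citations):
    """Return the largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank          # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Hypothetical citation counts for ten articles from one journal
print(h_index([25, 18, 12, 9, 7, 6, 3, 2, 1, 0]))  # 6
```

Here six articles have at least six citations each, but no seven articles have seven or more, so h = 6. Tools such as Publish or Perish compute this figure from Google Scholar data in the same way.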
Prestige
Prestige is a qualitative judgment about the respect a scholar receives for publishing in a
particular outlet. Because it is more qualitative, it is more difficult to evaluate in a
promotion dossier or grant application and is perhaps largely a theoretical exercise where
scholars honestly question the perceived prestige of a journal where they are considering
publication. A possible indication of the prestige of a journal is whether other researchers
recognize the journal when asked and whether their intuitive perception is that the journal
is of high quality. For example, in the overall field of education, publishing in the Review of
Educational Research or the Review of Higher Education is highly regarded because these
are prestigious journals, sponsored by major professional organizations, and well known
among educational scholars from all disciplines.
More quantifiable and objective measures of prestige might be rigorous surveys of scholars
in a discipline to gauge their perception of a publication outlet. As an example, several
studies have surveyed researchers in educational technology about publications they
recognize, read, and respect (e.g., Holcomb, Bray, & Dorr, 2003; Orey, Jones, & Branch,
2010; Ritzhaupt, Sessums, & Johnson, 2011). These studies provide valuable information on
the relative prestige of a publication outlet. Other indicators of prestige may be whether the
publication outlet is officially sponsored by a large national or international professional
organization, whether the publisher is reputable, and whether the editor and editorial board
are well known and respected.
Often prestige alone is used to evaluate the quality of a journal, but this can be faulty since
journals rise and fall in relative quality and because prestige is often so subjective. Thus
many journals that were highly prestigious 10–20 years ago might still be well known even
though their rigor and influence have fallen, and new journals that are perhaps not yet well
known may still be publishing high-quality research. Prestige, then, can be only one
indicator of the quality of the journal to be considered in relation to the other indicators.
In making and then defending our own decisions about where to publish our work, we have
attempted to apply these criteria qualitatively—using the metrics and data to inform an
inductive decision based on evidence from all three categories. We have found that those
outside our field have found it easier to understand our choices because we can justify them
by providing data about the relative rigor, influence, and prestige of a particular publication
outlet in comparison with other publication outlets in the discipline. This framework has
also been helpful within our School of Education, where multiple departments are housed,
but where we often need to explain to each other the relative importance of different
publication outlets within our specific disciplines. As we sought a framework that would
encompass all of the scholarship being conducted within the School, the principles of rigor,
influence, and prestige have proven flexible enough to provide a common language that all
departments could use, even though the specific pieces of evidence important in each of
their disciplines were unique and nuanced.
The following are a few examples of how these criteria could be applied in describing a
variety of different publication outlets. Using publications in our own field, we demonstrate
how this framework might be used (see Table 1). We have masked the names of the journals
to focus our discussion on the framework and evaluation criteria, not the specific ranking of
individual journals.
The difficulty may come in scoring publications #2, #3, and #5. The rigor of #2 appears to
be fairly strong, but submissions are reviewed only by the editor. However, in relation to its peers, this
journal seems to have strong citation numbers. This particular journal is often left out of
consideration of measures of prestige because of its lack of blind peer review (Ritzhaupt et
al., 2011). However, the leaders in the field regularly use this publication outlet as a venue
for publishing new ideas and theories, and consequently this publication is one of the most
read in our field (Holcomb et al., 2003). Taken individually, each of the measures we used to
rate this publication could be problematic for an external review panel unfamiliar with our
field. Taken together, we might rate rigor as mediocre, influence as high, and prestige as high,
resulting in an upper, mid-tier publication.
Publication #3 paints a different picture. It has a respectably stringent acceptance rate, but
the number of times each article is cited in Google Scholar is low. This may be due to the
fact that this publication is viewed as a practitioner journal within our field; and, as such,
practitioners are more likely to apply the theories than they are to cite them. Also, in
addition to regular research articles, this journal publishes many non-research articles and
columns, geared towards informing the members of our professional association. These
shorter pieces are indexed in Google Scholar and likely bring down the overall ratio of
citations per paper. Finally, this particular journal enjoys high prestige as demonstrated in a
survey of important journals in the field, ranking in the top 10 overall. Combining these
criteria, our qualitative judgment would be to rate this as a lower, mid-tier publication.
Conclusions
We emphasize that these ideas constitute a proposed theoretical framework for how
scholars could make and justify, to those from other disciplines, decisions about where they
choose to publish their research. In practice, scholars would still need to engage various
sources of data and make sound and well-reasoned arguments for the quality of their
publication choices. Even though final judgments about journal quality remain a subjective
decision, the framework responds to several of the needs that we identified in current
efforts to evaluate the academic quality of publication venues. It is flexible enough to allow
for multiple and varied sources of data within the categories of rigor, influence, and
prestige. As such, the framework allows for the timely inclusion of new metrics as novel
ways of measuring academic quality emerge or evolve. The inclusion of multiple indicators
allows the framework to be applied to different disciplines. Finally, it is impossible to use
the framework while depending on a single metric as an indicator of quality, which may help
scholars avoid this dangerous trap. We do not advocate joining the many indicators into a
single metric as that would mask the diverse ways in which a publication contributes to
quality scholarship. We also emphasize that this framework provides a common language
that can benefit scholars in justifying their publication decisions and assist promotion
committees in knowing what questions to ask about a candidate’s publication record.
Instead of simply asking what a journal’s impact factor is, we hope that committees would
seek or request information on the rigor, influence, and prestige of a candidate’s publication
record, leading to a more holistic and accurate assessment.
We welcome discussion about whether these three criteria are the most useful and accurate
in evaluating educational technology publication outlets or whether additional criteria might
be added to the framework. Engaging in this discussion is critical. If we cannot clearly
articulate the criteria for determining the quality of our publication outlets, then others (i.e.,
promotion committees and funding agencies) will have to draw their own conclusions using
metrics and criteria that may be less useful or even inapplicable to our disciplines. Also, we
emphasize that we believe these criteria should be applied flexibly, qualitatively, and
intelligently in making decisions about scholarship quality. We do not recommend using
these criteria uncritically to generate a ranking of journals that “count” and “do not count”
since all of these data points can be skewed, manipulated, or changed from year to year.
Still, by intelligently triangulating multiple data points, we can make more holistic
judgments on the quality of publication outlets and share a terminology for discussing our
publication decisions.
Application Exercises
          Find an academic journal and use the framework from this chapter to assess
          its rigor, influence, and prestige. Based on its merits, would you consider the
          journal you have found to be a top-tier journal? Explain.
References
Corby, K. (2001). Method or madness? Educational research and citation prestige. Portal:
Libraries and the Academy, 1(3), 279–288. doi:10.1353/pla.2001.0040
Orey, M., Jones, S. A., & Branch, R. M. (2010). Educational media and technology yearbook.
Vol. 35 (illustrated ed.). New York, NY: Springer.
Ritzhaupt, A. D., Sessums, C., & Johnson, M. (2011, November). Where should educational
technologists publish? An examination of journals within the field. Paper presented at the
Association for Educational Communications and Technology, Jacksonville, FL.
Suggested Citation
                              Peter J. Rich
                            Richard E. West
He tweets @richardewest, and his research can be found on Google Scholar and
his website: http://richardewest.com.
                                            47
Networking at Conferences
Editor’s Note
    The following chapter is a combination of two AECT Cornerstone articles written for
    TechTrends: “An Academic Experience of a Lifetime!” by Jered Borup and
    “Internship Reflection” by Abigail Hawkins.
Charles Graham (faculty member at Brigham Young University) once stated that what
happens in conference hallways is often more valuable than what happens in the sessions.
When attending a conference, you can meet amazing people and form relationships you
never thought possible. The following networking strategies are for graduate students who
may feel peripheral and out of place at academic conferences. Our advice is simple:
insert yourself into the scene. We would like to share three ways that any and all graduate
students can do just that and make the most of their time at a conference: stand tall, shake
hands, and get organized.
Stand Tall
This isn’t an encouragement to improve your posture but to make the most of your
opportunities. There are four ways for graduate students to stand tall at conferences.
Be confident. The former Saturday Night Live comedian Al Franken had a recurring
character named Stuart Smalley. In every sketch, Stuart would look in the mirror and
confidently say, “I’m good enough, I’m smart enough, and doggone it, people like me!”
While we don’t advocate that you chant this affirmation while at a conference, you wouldn’t
be wrong if you did. It will not take you long before you realize that the organization values
graduate students and there is no reason not to confidently stand tall as a graduate student
knowing that “You’re good enough, you’re smart enough, and doggone it, people like you!”
Having that knowledge is critical to making the most of your time there.
Participate. It is easy for new graduate students to feel that they are not able to make a
meaningful contribution. This simply is not true. If possible, you should submit a research
proposal and present your work. If your research is not developed enough for a full-paper
presentation, submit it as a roundtable or poster presentation. If you don’t have anything to
present, you can still ask questions or make comments at the sessions you attend.
Apply for awards. Look for awards supported by the organizations that sponsor the
conference you are attending. There are likely several. We would recommend applying first
for the ones that are specifically for graduate students. You can also ask your advisor or
another faculty member familiar with the conference for advice on what awards you should
apply for.
Give service. There are lots of ways that graduate students can give service. You may want
to consider volunteering at the conference. It can be a good way to become familiar with the
organization and meet new people. You should also try to attend one or two division
meetings. At the division meeting, they will look for volunteers to help the division.
Reviewing presentation submissions can be a great way to serve the division and learn what
makes a good proposal.
Shake Hands
Sessions. Researchers love to talk about their research. After attending a session, stick
around and talk with those who are still buzzing. Listen. Ask questions. Share ideas.
Exchange cards. Become a part of the larger conversation and your research community.
You’ll find that some of the best conversations happen after the formal presentations are
over. Poster sessions and roundtables are also great opportunities to actively discuss
interesting topics.
Receptions. Each conference is different, but some organize receptions that are specifically
designed to help people get to know each other and network. Don’t miss them! If this is your
first time at the conference, it may help to go with an advisor or another faculty
member from your department. They can introduce you to new people until you feel
comfortable introducing yourself.
               Foundations of Learning and Instructional Design Technology
Activities. There are several planned activities to help you get out there and shake some
hands. Some are free with your registration, and others cost a little extra but are worth the
money. For instance, Jered made some of his best memories on a riverboat cruise.
Meals. It’s not uncommon for graduate students to try to eat cheap and save money during
their time at a conference. Money can be tight for graduate students, but being too frugal
can cost you. Worry more about who you are eating with than the cost of the meal. For
example, attend that pricey division luncheon. You’ll sit at a table with eight other people
interested in an area of research similar to your own. You will make friends, comment on
how horrible the food is, and learn the inner workings of the division. Remember the service
advice from earlier? After the lunch would be a great time to ask one of the division leaders
if there is any volunteer division work you could participate in during the year.
The job board. If you are on the job hunt, you should take a look at the job board. You can
post your vitae and see the jobs that are available. Typically the postings will have a contact
number. Don’t hesitate to call, text, or email the contacts for the jobs that you are interested
in to set up a time to talk at the conference.
Faculty. Don’t be afraid to ask for help from your faculty. They are, not surprisingly, more
familiar with the conference and other attendees and can introduce you to people they
know. After her second day at the conference, Abigail pinged Dr. David Wiley asking if he
would introduce her to people the next morning. He was more than accommodating and
introduced her to several individuals and potential employers. Similarly, Dr. Rick West
introduced her to several faculty members who were looking to hire.
Peers. While you’re making bold moves and shaking hands with big names in the field, be
inclusive by inviting other graduate students to join you for lunch. Introduce one another to
people you know. Fellow participants in the conference that Abigail attended, Heather
Leary, Eunjung Oh, and Nari Kim, all introduced Abigail to faculty from their departments.
Similarly, she introduced them to faculty from hers. It was a simple, kind, and easy way to
meet others through the use of a peer network.
Yourself. Do the uncomfortable. For instance, would you be horrified if we told you to invite
yourself to lunch with someone? It is a common mindset that one waits for an invitation.
However, it is completely normal at a conference to ask if people have lunch plans and if you
could join them (or if they would join you). So after lingering at a session and meeting
people with similar research interests, be bold and ask if they have lunch plans. Make the
first move. You’ll be surprised by the outcome.
Get Organized
Being unorganized is a sure way to miss great opportunities. We’ll look at three phases of
organization: before, during, and after.
Before the conference. Plan before you go. You should start organizing and preparing
long before the conference actually starts. First, identify the sessions that you would like to
attend. Remember, who is presenting is just as important as what they are presenting.
Ideally, you would be familiar with the presenters’ work and their ideas relevant to your
research. You can also contact individuals you would like to meet in advance and ask if you
could take them out for coffee or chat with them during a session break. If you are not sure
whom to meet, ask your advisor. Have questions prepared to ask about their research and
how it relates to your own. Second, clear your plate of your other responsibilities. You want
to avoid grading assignments or working on class assignments during the conference. Third,
get some business cards. You are probably thinking, “But I’m just a graduate student.” And?
If you are teaching or a research assistant, ask your department secretary if you can get
cards made with the university logo. Also, print some copies of your vitae and sample
publications. These are especially important if you are on the job hunt. Lastly, join the
Facebook groups for the assemblies and divisions that you care most about (especially any
that are for graduate students). It will help you get a feel for the community and be aware
of important events.
During the conference. It can be easy to get a little disorganized at the conference. Two
strategies may help. First, when you receive a business card (and pass out one of your own
at the same time), write on the back the person’s research area or employment, what you
were talking about, and anything that you would like to follow up on. It would also be
helpful to jot down one personal fact you can recall from the conversation. Second, carry a
pocket-sized notebook for note-keeping. If you don’t write down your ideas, you may forget
them.
After the conference. Don’t just put the business cards you collected or the notes that you
took in a drawer and forget about them. Instead follow up on the conversations that you
had, invite people to join your LinkedIn and other social networks, and actually read the
articles that you told yourself you would. It’s also a good idea to email those who helped you
at the conference and thank them.
Conclusion
If you are a graduate student who is considering attending a conference—do it! And
remember to stand tall, shake hands, and be organized.
Application Exercises
     After reading this article, find a professor or another graduate student who
     has attended a conference. Ask them about their advice for attending
     conferences.
     Reflect on how you would or will prepare to make the most of a conference.
     What would you bring? Who would you want to talk to?
Suggested Citation
Jered Borup
Abby Hawkins
Tanya Gheen
           VI. Preparing for an LIDT Career
Two of the most common questions students have when they enter an LIDT graduate
program are what kinds of careers will be available to them after graduation and how they
can prepare for those careers. This section focuses on answering those questions. There
are many more careers possible with an LIDT degree than those represented here, and
additional chapters may be added in the future. The next question is for you to answer:
What type of career do you want to have?
Editor’s Note
    Sugar, W., Brown, A., Daniels, L., & Hoard, B. (2011). Instructional design and
    technology professionals in higher education: Multimedia production knowledge
    and skills identified from a Delphi study. Journal of Applied Instructional Design,
    1(2), 30–46. Retrieved from https://edtechbooks.org/-PD
As Instructional Design and Technology (ID&T) educators, we have made considerable effort
in understanding the specific multimedia production knowledge and skills required of entry-
level professionals. Our previous studies (Sugar, Brown, & Daniels, 2009) documented
specific multimedia production skills, knowledge and software applications (e.g., Flash) that
ID&T students and subsequent graduates need to exhibit. As a result of these efforts,
differences can be readily distinguished between instructional designers working in
corporate settings and those working in higher education settings (Sugar, Hoard, Brown, &
Daniels, 2011). Kirschner, van Merrienboer, Sloep, and Carr (2002) observed that
instructional designers at higher education settings focus on identifying alternative
solutions for a particular course whereas instructional designers within a corporate training
setting are more customer-oriented. Larson and Lockee (2009) concurred with this
assessment by noting “differences in the requirements listed for business and industry
versus higher education jobs” (p. 2). Essentially, the organizational culture (e.g., shared
beliefs and values) within a corporation is radically different from that found within
a college or university setting. Since over 89% of our initial survey respondents (e.g., Sugar,
Brown, & Daniels, 2009) worked in colleges or universities, we decided to concentrate our
efforts exclusively on the multimedia production knowledge and skills of instructional
designers working in higher education settings.
The prominence of the instructional designer within higher education settings also has been
well documented (Shibley, Amaral, Shank, & Shibley, 2011). Incorporating a continuous
improvement process (Wolf, 2007), encouraging higher education faculty with innovative
reward and recognition structures (Bluteau & Krumins, 2008), and the importance of
interacting with faculty peers (Nicolle & Lou, 2008) are examples of current best practices
in facilitating successful technology adoption and integration. Considerable effort in
understanding how higher education faculty adopt e-Learning activities (e.g., MacKeogh
& Fox, 2009), Web 2.0 technologies (e.g., Samarawickrema, Benson, & Brack, 2010), as well
as faculty members’ perceptions of roles of Learning Content Management Systems (LCMS)
(e.g., Steel, 2009) have been recently initiated as well.
Purpose of Study
The intent of this study is to better comprehend the instructional designer’s role in higher
education settings. Specifically, we sought to interpret multimedia production knowledge
and skills required of Instructional Design and Technology professionals working in higher
education. In addition, since we noted a definite interrelationship between multimedia
production and instructional design skills in earlier studies (Sugar, Brown, & Daniels, 2009),
we also sought to understand the relationship between these two skill sets. To accomplish
this goal, we conducted a Delphi study, seeking the opinions and consensus of experienced
instructional designers who work in higher education.
Method
We determined that a Delphi research methodology was the best approach to address our
questions. In the early 1950s, “Project Delphi” was developed from an Air Force-sponsored
Rand Corporation study. This study sought to “obtain the most reliable consensus of opinion
of a group of experts . . . by a series of intensive questionnaires interspersed with controlled
opinion feedback” (Linstone & Turoff, 2002, p. 10). Delphi panelists remain anonymous to
each other in order to avoid the “bandwagon effect” and ensure individual panelists do not
dominate a particular decision (Linstone & Turoff, 2002). Ideally, the Delphi panel is
heterogeneous, clearly representing a wide selection of the targeted group. Since the
inception of Project Delphi, the Delphi technique has been a prescribed methodology for a
wide variety of content areas, including government planning, medical issues, and drug
abuse-related policy making (Linstone & Turoff, 2002). Several existing Instructional Design
and Technology research studies utilized the Delphi method to examine phenomena such as:
determining constructivist-based distance learning strategies for school teachers (Herring,
2004); understanding strategies that promote social connectedness in online learning
environments (Slagter van Tryon & Bishop, 2006); best practices for using technology in
high schools (Clark, 2006); optimal technology integration in adult literacy classrooms
(Dillon-Marable & Valentine, 2006); and forecasting how blended learning approaches can
be used in computer-supported collaborative learning environments (So & Bonk, 2010). The
Delphi method has also been used to identify priorities from a select group of experts on
topics that include K–12 distance education research, policies, and practices (Rice, 2009);
mobile learning technologies (Kurubacak, 2007); and educational technology research needs
(Pollard & Pollard, 2004).
Standards have also been determined from Delphi studies. Researchers used this method to
ascertain effective project manager competencies (Brill, Bishop, & Walker, 2006),
biotechnology knowledge and skills for technology education teachers (Scott, Washer, &
Wright, 2006), and assistive technology knowledge and skills for special education teachers
(Smith, Kelley, Maushak, Griffin-Shirley, & Lan, 2009).
The Delphi method provides researchers with the ability to systematically evaluate the
expert decision-making process within a prescribed set of phases. This process is
particularly advantageous for those participants or Delphi panelists who are in separate
physical locations (Linstone & Turoff, 1975), as our participants were.
Delphi Panel
For our Delphi study, fourteen Instructional Design and Technology professionals originally
agreed to participate. Ultimately, eleven of the fourteen original panelists completed all
three data collection phases of the study; three individuals stopped participating for various
personal reasons. The overall goal was to gather responses from a heterogeneous grouping
of panelists (see Table 1) representing higher education work environments in general. The
seven female and four male panelists work in a variety of higher education settings,
including two-year colleges, four-year universities, public institutions, and private
institutions. Eight of our panelists represent public institutions and three represent private
institutions. In addition, two panelists represent two-year community colleges and four
represent undergraduate-only institutions. Nine of our panelists work in administrative
positions (e.g., Director) and two of our panelists work as instructional designers for their
respective institutions. Ten panelists have worked in higher education settings for more than
ten years. The average amount of higher education work experience was over sixteen years.
The panelists are geographically diverse, representing western, mountain west, mid-west,
south, southeast, mid-Atlantic, and northeast regions of the United States. One panelist
works at a higher education institution in Switzerland.
Table 1. Delphi panelists

Gender | Position               | Years in higher education | Region        | Type of institution
Female | Instructional Designer | 10                        | West          | Public; 4-year degree; Undergraduate & graduate
Female | Instructional Designer | 12                        | Mountain West | Public; 4-year degree; Undergraduate & graduate
Female | Coordinator            | 4                         | Northeast     | Public; 4-year degree; Undergraduate & graduate
Female | Coordinator            | 27                        | Southeast     | Public; 2-year degree; Undergraduate
Female | Vice Provost           | 25                        | South         | Public; 4-year degree; Undergraduate & graduate
Male   | Director               | 29                        | Midwest       | Private; 4-year degree; Undergraduate
Male   | Chief Academic Officer | 20                        | South         | Public; 2-year degree; Undergraduate
Male   | Director               | 19                        | Southeast     | Private; 4-year degree; Undergraduate & graduate
Female | Director               | 14                        | Mid-Atlantic  | Public; 4-year degree; Undergraduate & graduate
Three Delphi data collection phases were completed during this study. During the first
round, panelists responded to the following three open-ended questions:
The purpose of these questions was to delineate specific multimedia production knowledge
and skills required of these professionals. The questions were open-ended in order to avoid
biasing our panelists’ responses (Linstone & Turoff, 1975). The panelists responded to these
questions via email.
With the intent of identifying emerging and recurring themes, three evaluators analyzed
the panelists’ responses using a category construction data analysis method as outlined by
Merriam (2009). Questionable items and themes were discussed among the three
evaluators; the evaluators reached consensus on all items. Particular themes from these
responses were identified. This initial set of themes was sent to the panelists for their
review. Each panelist had the opportunity to respond to the overarching group of themes
and the specific themes, and to add additional categories as well. All of these themes were
compiled into a summative questionnaire, and this questionnaire was then distributed
during the second round.
The intent of the questionnaire was to establish a quantitative appraisal of our panelists’
responses about each item and to seek a common set of responses to Instructional Design
and Technology graduates’ multimedia production knowledge and skills. The panelists rated
each questionnaire item with regard to the importance of each identified knowledge or skill,
and the panelists’ responses were compiled and distributed via email to each panel member.
Panelists were then given the opportunity to offer feedback about the questionnaire results
and make any corrections, as necessary.
During the third round, the eleven panelists reviewed the Round #2 ratings and were given
the opportunity to revise their own ratings. Five of the eleven panelists recommended minor
incremental changes to their original rankings. None of the eleven panelists made any
suggestions to either add another item or remove an existing item. Given this feedback, we
determined that these minor modifications indicated an apparent consensus among the
panelists.
Results
During the initial Delphi phase, the eleven panelists generated 289 unique statements
regarding the three aforementioned initial questions. From this first round of responses, 60
distinct multimedia knowledge and skills needed by Instructional Design and Technology
graduates were identified and organized into seven primary categories. This list of
categories was then sent back to our panelists for confirmation. Eight of the eleven panelists
recommended ten additional knowledge and skills for a total of 70 items.
Responses were rated on a scale of -2 to 2, with -2= unnecessary, -1= not important, 0=
somewhat important, 1= important, 2= essential.
The panelists also reacted to the seven categories. Four original categories (Visual and
Graphic Design, Instructional Design and Pedagogy, Communication and Collaboration, and
Delivery and Project Management) did not receive any feedback or edits and were approved.
The panelists commented on the other three original categories: Basic Production, Specific
Software Tool, and Online. Upon review of these comments, these categories were renamed
Production, Applications, and Online Applications respectively. We distinguished between
applications (e.g., Flash) that can create instruction for online settings as well as non-online
settings, and applications (e.g., Dreamweaver) that exclusively create instruction for online
settings.
In summary, Delphi panelists’ responses were organized into seven categories: Production
(10 items), Applications (12 items), Online Applications (15 items), Visual and Graphic
Design (6 items), Instructional Design and Pedagogy (15 items), Communication and
Collaboration (4 items), and Delivery and Project Management (8 items). See Appendix for a
listing of these categories and corresponding items.
During the next Delphi phase, our eleven panelists ranked these seventy items on the
following scale: Essential, Important, Somewhat important, Not important, Unnecessary.
Accordingly, we assigned a 2 to -2 Likert scale to these five response options, where Essential items
received 2 points, Important items received 1 point, Somewhat important items received 0
points, Not important items received -1 point, and Unnecessary items received -2 points.
Thus, the top score any item could receive would be 22 points (i.e., all 11 panelists deemed
this item to be Essential) and the lowest score that an item could receive would be -22
points (i.e., all 11 panelists deemed this item to be Unnecessary). This rating system also
provides the ability to weight and counterweight individual panelists’ responses about a
particular item. For example, if a panelist rated one item as Important (1 point) and another
panelist rated the same item as Not important (-1 point), the item would receive a combined
score of zero points and would be considered as Somewhat important.
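The scoring arithmetic described above can be sketched in a few lines of Python. The labels and weights come from the text; the sample ratings passed in at the end are hypothetical, chosen only to illustrate the calculation.

```python
# Scoring scheme from the study: each of the 11 panelists rates an item on a
# five-point scale, and each label maps to a weight from -2 (Unnecessary) to
# +2 (Essential).
WEIGHTS = {
    "Essential": 2,
    "Important": 1,
    "Somewhat important": 0,
    "Not important": -1,
    "Unnecessary": -2,
}

def item_score(ratings):
    """Combined score for one item: the sum of all panelists' weights.
    With 11 panelists, the range is -22 (all Unnecessary) to 22 (all Essential)."""
    return sum(WEIGHTS[r] for r in ratings)

def item_mean(ratings):
    """Mean rating for one item (the M values reported in the Results)."""
    return item_score(ratings) / len(ratings)

# Opposing ratings cancel out, as in the example above: one Important (+1)
# and one Not important (-1) combine to 0, i.e., Somewhat important overall.
print(item_score(["Important", "Not important"]))  # 0
```

For instance, ten Essential ratings and one Important rating yield a combined score of 21 and a mean of 21/11 ≈ 1.91, the same magnitude as the top-ranked item in the Results.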
The average scores for all of the seventy items ranged from M = 1.91 to M = -.4 (see
Appendix). The 15 top-ranked items that received a 1.45 average or higher are found in
Table 2. The top two items, Communication (M = 1.91, SD = .30) and Social skills (M= 1.73,
SD = .65) were within the Communication and Collaboration category. Three production
items, Web Design Basics (M = 1.64, SD = .51), Video Production (M = 1.45, SD = .52), and
Screencasting (M = 1.45, SD = .69) were included in this top-ranked list. The item, Visual
communication and visualization theories (M = 1.60, SD = .70), was the fourth highest-
ranked item and Microsoft Office Suite (M = 1.55, SD = .52) was the fifth highest-ranked
item. Four of the fifteen Instructional Design and Pedagogy items and three of the eight
Delivery and Project Management items also appeared in this top-ranked list.
Learning Content Management Systems (LCMS) (M = 1.45, SD = 1.21) also was in this top
ranking list. The eleven bottom-ranked items that received a .36 average or lower are found
in Table 3. Five Online applications (XML, Online quiz tools, Online plug-ins, Contribute, and
Google Forms/Survey Monkey) were located in this list of items. Three Production items
(Photography, Animation, and Programming) and three Applications items (Garageband,
Final Cut Pro, and Green screen) received an average of 0 or lower.
In Table 4, the percentage of importance ratings is listed for each category. Over sixty
percent of the items (63.8%) across the seven categories received an “Important” (M
> 1) to “Essential” (M < 2) ranking. All the Visual and Graphic Design (n=6) items were
within this range. Fourteen of the fifteen Instructional Design and Pedagogy items received
“Important to Essential” ratings; SCORM received an average score lower than 1 (M = .73,
SD = .91). Three of the four Communication and Collaboration items also received
“Important to Essential” ratings. Public presentation skills received an average score lower
than 1 (M = .91, SD = .94). All but one Delivery and Project Management item (n=7) also
received “Important to Essential” ratings.
Sixty percent of the Production items (n=6) received an “Important” (M > 1) to “Essential”
(M < 2) rating (see Table 4). A majority of the Delphi panelists categorized Web design
basics (M = 1.64, SD = .51), Video production (M = 1.45, SD = .52), Screencasting (M =
1.45, SD = .69), Audio production (M = 1.36, SD = .67), Images production (M = 1.36, SD =
.67), and Basic HTML commands (M = 1.09, SD = 1.10), as “Important” to “Essential”
items (see Table 5). The remaining four Production items either received a “Somewhat
important” (M > 0) to “Important” (M < 1) ranking (i.e., Desktop publishing and
Photography) or received a “Not important” (M > -1) to “Somewhat important” (M < 0)
ranking (i.e., Animation and Programming skills).
Only 25% of the Application items (n=3) received an “Important” (M > 1) to “Essential” (M
< 2) rating (see Table 6). Two of these three applications are generic with respect to
multimedia production: Microsoft Office suite (M = 1.55, SD = .52) and Major operating
systems (M = 1.00, SD = 1.08). The other Application item is the overall Adobe software
suite (M = 1.09, SD = .94). The remaining nine Application items either received a
“Somewhat important” (M > 0) to “Important” (M < 1) ranking (i.e., Audacity, Flash,
Photoshop, Acrobat, iMovie, Fireworks, and Garageband) or received a “Not important” (M
> -1) to “Somewhat important” (M < 0) ranking (i.e., Final Cut Pro and Green screen).
Similar to the Application items, there is disagreement among the panelists regarding the
importance of particular online applications. As shown in Figure 2, the panel was almost
evenly split on two applications: Camtasia and Online plug-ins. Six panelists perceived
Camtasia as either an Important or an Essential multimedia production item, whereas five
panelists perceived Camtasia as either Somewhat important or Not important. Five
panelists perceived Online plug-ins as either an Important or an Essential multimedia
production item, whereas six panelists perceived these tools as either Somewhat important,
Not important, or Unnecessary.
Discussion
In considering these results, the Delphi panelists identified specific multimedia production
skills and knowledge needed by entry-level Instructional Design and Technology (ID&T)
professionals who work in higher education settings. These skills and knowledge include the
following: generalized multimedia production knowledge and skills, emphasis of online
learning skills, and the interrelationship between multimedia production and instructional
design skills. After describing these skills and knowledge, we discuss how these results have
influenced our own respective curricular practices, as well as anticipate future research
studies that would provide additional understanding of how best to educate instructional
designers working in higher education settings.
The Delphi panelists clearly reached consensus that ID&T graduates need to be well-
versed in a number of general multimedia production skills. Visual design principles,
video production, and audio production skills all were ranked high and were considered
Essential by a majority of the panelists. Conversely, more advanced and specialized
technologies (e.g., programming and green screen technology) are not as important and
were ranked as Unnecessary. Also, there is a conclusive preference among the panelists
regarding online learning applications and skills. Web design basics, online course
pedagogy, screencasting, and LCMS skills all were ranked as Essential.
In addition to these essential multimedia production skills, the panelists’ rankings indicate
an inter-relationship between instructional design skills and multimedia production skills.
Even though panelists were asked about ID&T graduates’ multimedia production knowledge
and skills, eighty percent of the items from the Instructional design and pedagogy category
(e.g., Knowledge of learner characteristics, Determining the appropriate delivery venue for
particular content area, etc.) were ranked as Essential. Furthermore, Communication skills
and Social skills were ranked first and second, respectively. This finding implies that ID&T
entry-level professionals need a robust combination of general multimedia production skills
and knowledge and overall instructional design skills and knowledge.
Implications
As Instructional Design and Technology faculty members, we were intrigued to receive
these results from our panelists and are now considering curricular revisions for our
respective courses. The results from our study indicate that multimedia production items
cannot be taught in isolation and should not be linked to a particular software application.
In previous semesters, our respective multimedia production courses defaulted to a
particular software application (e.g., Flash, Authorware, Director, etc.). Our students
now use “lowest common denominator,” computer-based instruction applications (e.g.,
PowerPoint) to teach particular computer-based instruction methodologies (e.g., tutorial).
Our respective students are introduced to innovative technologies (e.g., Prezi), but the
emphasis is not solely on the particular authoring tool, but on how to integrate this tool into
overall, existing instructional modules. To highlight the interrelationship between
multimedia production and instructional design skills, our students are now required to
complete instructional design reports when creating a multimedia production project. We
view these projects as instructional design “experiments” and students complete “lab
reports” with each project.
The panelists’ respective rankings and results also indicate additional areas to explore with
regard to ID&T graduates’ overall multimedia production and instructional design skills
and knowledge. Inquiry into the changing role of the instructional designer with respect to
these two skill sets, such as Schwier and Wilson’s (2010) recent study, should take place. A
more in-depth understanding of what Willis (2009) refers to as process instructional design,
such as a study on best practices in collaboration between instructional designers
and clients, is encouraged as well. In addition, case studies on how instructional designers
effectively balance multimedia production and instructional design skills should be
developed. These case studies could be used as instructional tools to teach novice
instructional designers best practices in integrating multimedia production skills within an
overall instructional design project.
In summary, the results from this Delphi study indicate that Instructional Design and
Technology professionals working in higher education settings need to be educated about
overall multimedia production skills and how these skills interrelate to their set of
instructional design skills. As Instructional Design and Technology educators, we look
forward to considering innovative and effective approaches to our respective curricula and
to continuing this dialogue with other Instructional Design and Technology educators.
References
Al-Qirim, N. (2011). Determinants of interactive white board success in teaching in higher
education institutions. Computers & Education, 56(3), 827–838.
Archambault, L., Wetzel, K., Foulger, T. S., & Williams, M. (2010). Professional development
2.0: Transforming teacher education pedagogy with 21st century tools. Journal of Digital
Learning in Teacher Education, 27(1), 4–11.
Barczyk, C., Buckenmeyer, J., & Feldman, L. (2010). Mentoring professors: A model for
developing quality online instructors and courses in higher education. International Journal
on E-Learning, 9(1), 7–26.
Bluteau, P., & Krumins, M. (2008). Engaging academics in developing excellence: Releasing
creativity through reward and recognition. Journal of Further & Higher Education, 32(4),
415–426.
Brill, J. M., Bishop, M. J., & Walker, A. E. (2006). The competencies and characteristics
required of an effective project manager: A web-based Delphi study. Educational Technology
Research & Development, 54(2), 115–140.
Clark, K. (2006). Practices for the use of technology in high schools: A Delphi study. Journal
of Technology & Teacher Education, 14(3), 481–499.
Conole, G., Galley, R., & Culver, J. (2011). Frameworks for understanding the nature of
interactions, networking, and community in a social networking site for academic practice.
International Review of Research in Open & Distance Learning, 12(3), 119–138.
Darwin, A., & Palmer, E. (2009). Mentoring circles in higher education. Higher Education
Research & Development, 28(2), 125–136.
Delacruz, E. (2009). Old world teaching meets the new digital cultural creatives.
International Journal of Art & Design Education, 28(3), 261–268.
Dinsmore, D., Alexander, P., & Loughlin, S. (2008). The impact of new learning
environments in an engineering design course. Instructional Science, 36(5/6), 375–393.
Donato, E., Hudyma, S., Carter, L., & Schroeder, C. (2010). The evolution of WebCT in a
baccalaureate nursing program: An Alice in Wonderland reflection. Journal of Distance
Education, 24(3). Retrieved from http://www.jofde.ca/index.php/jde/article/view/702/1163.
El-Hussein, M., & Cronje, J. C. (2010). Defining mobile learning in the higher education
landscape. Journal of Educational Technology & Society, 13(3), 12–21.
Kear, K., Woodthorpe, J., Robertson, S., & Hutchison, M. (2010). From forums to Wikis:
Perspectives on tools for collaboration. Internet and Higher Education, 13(4), 218–225.
Kirschner, P., Carr, C., van Merrienboer, J., & Sloep, P. (2002). How expert designers
design. Performance Improvement Quarterly, 15(4), 86–104.
Larson, M., & Lockee, B. (2009). Preparing instructional designers for different career
environments: A case study. Educational Technology Research & Development, 57(1), 1–24.
Linstone, H. A., & Turoff, M. (Eds.). (1975). The Delphi method: Techniques and
applications. Reading, MA: Addison-Wesley.
Linstone, H. A., & Turoff, M. (Eds.). (2002). The Delphi method: Techniques and
applications. Retrieved from https://edtechbooks.org/-nQ
MacKeogh, K., & Fox, S. (2009). Strategies for embedding e-learning in traditional
universities: Drivers and barriers. Electronic Journal of e-Learning, 7(2), 147–153.
Nicolle, P. S., & Lou, Y. (2008). Technology adoption into teaching and learning by
mainstream university faculty: A mixed methodology study revealing the “How, When, Why,
and Why not”. Journal of Educational Computing Research, 39(3), 235–265.
Pollard, C., & Pollard, R. (2004). Research priorities in educational technology: A Delphi
study. Journal of Research on Technology in Education, 37(2), 145–160.
Renes, S., & Strange, A. (2011). Using technology to enhance higher education. Innovative
Higher Education, 36(3), 203–213.
Rice, K. (2009). Priorities in K-12 distance education: A Delphi study examining multiple
perspectives on policy, practice, and research. Journal of Educational Technology & Society,
12(3), 163–177.
Samarawickrema, G., Benson, R., & Brack, C. (2010). Different spaces: Staff development
for Web 2.0. Australasian Journal of Educational Technology, 26(1), 44–49.
Schwier, R. A., & Wilson, J. R. (2010). Unconventional roles and activities identified by
instructional designers. Contemporary Educational Technology, 1(2), 134–147.
Scott, D. G., Washer, B. A., & Wright, M. D. (2006). A Delphi study to identify recommended
biotechnology competencies for first-year/initially certified technology education teachers.
Journal of Technology Education, 17(2), 44–56.
Shibley, I., Amaral, K. E., Shank, J. D., & Shibley, L. R. (2011). Designing a blended course:
Using ADDIE to guide instructional design. Journal of College Science Teaching, 40(6),
80–85.
Slagter van Tryon, P. J., & Bishop, M. J. (2009). Theoretical foundations for enhancing social
connectedness in online learning environments. Distance Education, 30(3), 291–315.
Smith, D. W., Kelley, P., Maushak, N. J., Griffin-Shirley, N., & Lan, W. Y. (2009). Assistive
technology competencies for teachers of students with visual impairments. Journal of Visual
Impairment & Blindness, 103(8), 457–469.
So, H., & Bonk, C. J. (2010). Examining the roles of blended learning approaches in
Computer-Supported Collaborative Learning (CSCL) environments: A Delphi study. Journal
of Educational Technology & Society, 13(3), 189–200.
Stav, J., Nielsen, K., Hansen-Nygård, G., & Thorseth, T. (2010). Experiences obtained with
integration of student response systems for iPod Touch and iPhone into e-Learning
environments. Electronic Journal of e-Learning, 8(2), 179–190.
Steel, C. (2009). Reconciling university teacher beliefs to create learning designs for LMS
environments. Australasian Journal of Educational Technology, 25(3), 399–420.
Sugar, W., Brown, A., & Daniels, L. (2009). Identifying entry-level multimedia production
competencies and skills of instructional design and technology professionals: Results from
the 2009–2010 biennial survey. Presented at the annual conference of the Association for
Educational Communications and Technology (AECT), Louisville, Kentucky.
Sugar, W., Hoard, B., Brown, A., & Daniels, L. (2011). Identifying multimedia production
competencies and skills of Instructional Design and Technology professionals: Results from
recent job postings. Presented at the annual conference of the Association for Educational
Communications and Technology (AECT), Jacksonville, Florida.
Suggested Citation
    Sugar, W., Brown, A., Daniels, L., & Hoard, B. (2018). What Are the Skills of an
Instructional Designer? In R. E. West, Foundations of Learning and Instructional
    Design Technology: The Past, Present, and Future of Learning and Instructional
    Design Technology. EdTech Books. Retrieved from
    https://edtechbooks.org/lidtfoundations/skills_instructional_designer
William Sugar
Abbie Brown
Lee Daniels
Brent Hoard
                 Final Reading Assignment
Now that you are concluding this book, you should know . . . that you still know very little
about the field of Learning and Instructional Design Technology. This is a “meta” field after
all! One of the best pieces of advice I received as a student was to “read everything.” As you
progress in your studies, you will need to focus your reading specifically on the body of
literature influencing your own work. However, take time to also read broadly, because
often we need to step outside of our narrow research and design agendas to spark our
creativity. The following are some recommended readings for you (with link addresses
provided for those reading this book in PDF form), although any person in the field will have
a different list—so ask them what they have read that influenced them, and you will be led
down a fruitful path.
             Book Author Information
Richard E. West
He tweets @richardewest, and his research can be found on Google Scholar and
his website: http://richardewest.com.
West, R. E. (2018). Foundations of Learning and Instructional Design
Technology: The Past, Present, and Future of Learning and Instructional Design
Technology (1st ed.). EdTech Books. Retrieved from
https://edtechbooks.org/lidtfoundations
         CC BY: This book is released under a CC BY license, which means that you
         are free to do with it as you please as long as you properly attribute it.