Downloaded on: 07 March 2015, at 16:43
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered
office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK
Communication Education
Publication details, including instructions for authors and
subscription information:
http://www.tandfonline.com/loi/rced20
To cite this article: Caleb T. Carr, Paul Zube, Eric Dickens, Carolyn A. Hayter & Justin
A. Barterian (2013) Toward A Model of Sources of Influence in Online Education: Cognitive
Learning and the Effects of Web 2.0, Communication Education, 62:1, 61–85, DOI:
10.1080/03634523.2012.724535
Taylor & Francis makes every effort to ensure the accuracy of all the information (the
“Content”) contained in the publications on our platform. However, Taylor & Francis,
our agents, and our licensors make no representations or warranties whatsoever as to
the accuracy, completeness, or suitability for any purpose of the Content. Any opinions
and views expressed in this publication are the opinions and views of the authors,
and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content
should not be relied upon and should be independently verified with primary sources
of information. Taylor and Francis shall not be liable for any losses, actions, claims,
proceedings, demands, costs, expenses, damages, and other liabilities whatsoever
or howsoever caused arising directly or indirectly in connection with, in relation to or
arising out of the use of the Content.
This article may be used for research, teaching, and private study purposes. Any
substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing,
systematic supply, or distribution in any form to anyone is expressly forbidden. Terms &
Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions
Communication Education
Vol. 62, No. 1, January 2013, pp. 61–85
To explore the integration of education processes into social media, we tested an initial
model of student learning via interactive web tools and theorized three sources of
influence: interpersonal, intrapersonal, and masspersonal. Three-hundred thirty-seven
students observed an online lecture and then completed a series of scales. Structural
equation modeling supported several individual hypotheses and partially supported the
overall model. Findings indicated that instructor credibility has a significant positive
effect on content area knowledge, whereas social identification with online colearners
has a negative effect on learning outcomes. Findings are discussed with respect to both
theoretical and practical implications of the integration of interactive media as a
classroom resource.
Caleb T. Carr (Ph.D., Michigan State University) is an Assistant Professor at Illinois State University. Paul Zube
(M.A., State University of New York–Albany) is a Visiting Assistant Professor at Ferris State University. Eric
Dickens (M.A., Michigan State University) is a Visiting Assistant Professor at Monmouth College. Carolyn
A. Hayter (M.A., Michigan State University) and Justin A. Barterian (M.A., Michigan State University) are
Ph.D. students in the Department of Counseling, Educational Psychology, & Special Education at Michigan
State University. An earlier version of this study was presented at the 2012 annual meeting of the National
Communication Association in Orlando, FL. The authors are grateful to two anonymous reviewers and the
editor for their assistance refining this manuscript. We also thank Glenn Hansen for his statistical analysis
assistance as well as John Banas and his graduate teaching assistants for aiding in data collection. Caleb T. Carr
can be contacted at ctcarr@ilstu.edu.
formal classroom context. Faculty, student groups, and scholars can post content on
sites such as YouTube from which any user can draw knowledge (Young, 2008). The
pace with which educators are adapting and integrating new technologies into and
beyond classrooms elicits the question of how these technologies are affecting the
learning process.
We based this study on the need to better understand communication and
education in interactive online settings. Given the complex interactions, multilevel
processes, and evolving technologies involved in education, a unifying theory of
online learning was difficult to articulate. However, guided by the assumption that
many learning processes are fundamental regardless of medium (Clark, 1994), our
model drew heavily from earlier educational models, replicating previous research
to extend findings to a novel educational context. Moreover, the model considered
several factors unique to emergent media (including the influence of anonymous
third parties) on both learning and its antecedents. We developed and tested this
model not to explain the online educational process holistically, but rather as a
starting point from which future scholarship may draw to conceptualize and explain
the process of online education as it shifts from the brick-and-mortar classroom to
the click-and-mortar interactive web classroom.
traditional classrooms (Dutton, Cheong, & Park, 2004; Harrington, Gordon, &
Schibik, 2004), and sometimes supplement in-class interactions with online
discussion boards. Initial studies of mediated education concluded that media’s
increasing penetration into educational contexts would not change the process of
educating nor the educational experience for students (e.g., Clark, 1994). As such,
most Web 1.0 research focused solely on how to adapt extant educational practices
for online delivery.
Yet, to quote Harasim (2000, p. 41), ‘‘Shift happens.’’ Advances in Internet access
and online interaction have given rise to the more social context of Web 2.0 (O’Reilly,
2005), opening new possibilities for changes in the way online content is delivered by
instructors and interpreted by students. Although precise definitions and operationalization of Web 2.0 are difficult to identify (Burke, 2009), Web 2.0 tools are
typically conceptualized as those exploiting the social, interactive, and participatory
nature of the web (Walther et al., 2011). These social web tools allow users,
rather than programmers alone, to create the content of the website through
interaction. A current exemplar of the social web is Facebook, a service that, if
Cognitive Learning
Education researchers have identified several types of learning, including affective,
behavioral, and cognitive learning (e.g., Bloom, 1956; Gagné, 1972; Krathwohl, 2002).
Affective learning focuses on students’ perceptions toward teacher communication
and course content (Gagné, 1972; Pogue & Ahyun, 2006), and behavioral learning
focuses on behavioral modification such as adolescents’ social skills or operant
conditioning in animals (Shuell, 1986). Though both affective and behavioral learn-
ing outcomes are important in educational practices, the present research focuses
on cognitive learning. Bloom (1956) defined cognitive learning as the ‘‘recall or
recognition of knowledge and the development of intellectual abilities and skills’’
(p. 7). Practically, cognitive learning has been conceptualized as the ability of an
individual to retain and understand information (Edwards, Edwards, Shaver, &
Oaks, 2009). Given the prominence of knowledge retention and standardized test-
ing as learning outcomes (Dumont, 1996; Oner, 1995), recall is typically used as an
operationalization of the construct of cognitive learning, and it was the focus of
the present work. Decades of research have focused on cognitive learning outcomes,
with recent lines of research and academic journals focusing on cognitive learning
as it migrates to the Internet (e.g., Rovai, Wighting, Baker, & Grooms, 2009; Yang,
Richardson, French, & Lehman, 2011). Yet, research into learning processes within
the emergent social web requires a reexamination of cognitive learning, as traditional
sources of influence (i.e., teachers and students themselves) may alter communicative and learning processes by altering interactions with instructors and students'
experiences and providing additional sources of influence such as anonymous
others.
those findings have yet to be confirmed in online, social environments. The purpose
of this study is to retest some of those previously established relationships and to
examine a conceptual model of learning in the Web 2.0 environment. We proffer
cognitive learning via Web 2.0 is broadly influenced by three sources of influence:
interpersonal, intrapersonal, and masspersonal communication. Given the different
communicative processes involved in these three sources, it may be inappropriate for
a single theory to unify and articulate these influences. Consequently, in the following
sections we address these sources, briefly establish their proposed influence on
students’ cognitive learning (guided by the construct of instructor credibility, the SIDE
model, and the masspersonal perspective, respectively), and derive hypotheses.
Others’ social identities. Offline, the influence of individuals other than teachers in
classroom settings has been empirically supported, usually referring to classmates.
For example, network analyses of traditional classrooms have demonstrated that
gains in classroom performance by semester’s end are higher in classrooms with
Testing a Model of Online Learning 67
argued that when individuals interact with visual anonymity (Lea, Spears, & deGroot,
2001; Lee, 2006) as is often the case online, they identify and relate based on their
social characteristics and traits rather than using idiosyncratic similarities with
particular individuals (Postmes, Spears, Sakhel, & deGroot, 2001; Reicher et al., 1995).
Depersonalized individuals relate based on ingroup/outgroup dynamics (Douglas &
McGarty, 2001), which leads to greater valuation of the social ingroup’s attitudes
and norms and, in turn, to potentially greater social influence by group members.
In many Web 2.0 services (e.g., YouTube), users post pseudonymous
messages often indicating social self-categorization or personality traits without
providing personally identifying or individuating information, similar to instant
message and personal email user names (Bechar-Israeli, 1995; Heisler & Crabill, 2006).
In the absence of personal cues to distinguish others online, students are forced to rely
on others’ social identities to guide perceptions and affect influence.
Social identification positively influences an individual’s perception of an instructor
and engagement in traditional classroom settings (Edwards et al., 2007, 2009). First,
greater identification with colearners’ social identity leads to a student’s perception of
classroom community and control, which is likely to increase perceptions of instructor
credibility. Thus, we posited the following hypothesis:
H4: Increased social identification with online commenters of an online lecture leads
students to increased perceived instructor credibility.
Second, social identification with colearners has been shown to increase perceptions
of the classroom as a community (Mazer, Murphy, & Simonds, 2007). As students
identify with peers, they feel more engaged in the learning process and consequently
demonstrate greater learning affect. Therefore, students feeling more a part of the
classroom ingroup are expected to feel greater engagement in the online course.
H5: Increased social identification with online commenters of an online lecture leads
students to increased educational affect.
68 C. T. Carr et al.
Social identification with online others is also expected to facilitate cognitive
learning. In a comparison of staff versus peer tutoring programs, Moust and Schmidt
(1994) detected differences in students’ social identification with tutors, finding that
peer tutors were able to relate more readily to students’ experiences and educational
challenges. Differences in course and content engagement were attributed, in part,
to student’s identification with peer tutors’ experiences and interpretation of course
content. As students demonstrate greater social identification in social learning
environments (i.e., perceive themselves as part of the classroom ingroup), they will
likely be guided by normative behavior to engage in class content, increasing learning
outcomes (De Simone, Oka, & Tischer, 1999; Paulus, Horvitz, & Shi, 2006). Because
students perceive themselves as part of the Web 2.0 classroom, they may be more
involved and therefore demonstrate more cognitive learning in an online session.
H6: Increased social identification with online commenters of an online lecture leads
students to increased cognitive learning.
others online may influence online learners via the comments they contribute to the
course discussion. The ability to leave feedback in the form of comments that are
copresent with the original media content is unique to social media tools. The nature
of comments posted online is as varied as their foci and includes user-generated
statements regarding a target individual (Walther, Van Der Heide, Westerman, &
Tong, 2008), system-generated amalgamations of user feedback such as rating systems
(Lampe, 2006), and metareviews of reviewers (Resnick, Kuwabara, Zeckhauser, &
Friedman, 2000). In this study, we focused on comments that address the quality of
and engagement with media content, such as comments posted to a YouTube video.
A cursory interaction with YouTube.com quickly illustrates that comments are
common and vary broadly in quality and topicality within the web tool.
The positivity of an online comment may be expected to influence a student’s
attitudes toward course content. Although user-generated comments on message
content can reflect interactive exchanges (i.e., discussion between commenters), often
comments reflect reactive exchanges whereby individuals respond to the central
message of the online content or video clip. Walther and colleagues (Walther, Carr,
et al., 2010; Walther, DeAndrea, Kim, & Anthony, 2010) have suggested that these
reactive exchanges shape the way individuals exposed to the central message and
resultant comments interpret and perceive the content. Indeed, the positivity of
comments in social media can influence others’ attitudes (Lee & Youn, 2009; Walther,
Carr, et al., 2010). Consequently, we predict the valence of a comment about an
online lesson can influence students’ perceptions of that lesson, so that a positive
comment (e.g., ‘‘This is a great lecture’’) may result in students’ feeling more positive
toward the content, while a negative comment (e.g., ‘‘This is a terrible lecture’’) may
result in students feeling less positive toward the content.
H7: The positivity of a comment addressing the content of an online lecture is
positively related to the educational affect of students observing the comment.
Method
Participants
Three-hundred thirty-seven participants were drawn from several sections of a
single undergraduate communication survey course at a large Midwestern univer-
sity. Participants took part in this research as part of regular classroom activity,
Stimulus Materials
Video lecture. A video lecture was created with the aid of a department faculty
member familiar with the course content. Participants confirmed that they had had
no previous contact with the video instructor by negative responses to the item,
‘‘Have you ever met or had a class with the instructor in the video?’’ The video
instructor, a casually dressed, middle-aged white male seated in a faculty office in
front of a shelf of textbooks, presented a simulated 8-minute lecture addressing
elements of nonverbal communication modeled to be delivered as a typical course
lecture. This video lecture content was held constant across conditions and uploaded
to YouTube, a popular Web 2.0 site. The experimental manipulations consisted of
simulated user comments (see next section) that appeared under the video. These
pages were then copied to a local server, and the underlying source code was modified
so that although the hyperlinks were visible, they did not actually navigate the user
to other sites. The video was active and began playing automatically. In all other
respects, the stimuli appeared to be actual YouTube content, although it is not certain
that all subjects believed they were actually interacting with YouTube.com.
User comments. Video comments were adopted from stimuli used by Edwards et al.
(2009), in their test of the effect of instructor-related comments designed to produce
low and high expectations of learning in a course. In positively valenced conditions,
the lecture video was followed by two comments: ‘‘You’ll learn a lot about nonverbal
communication from this video. It gives great tips. Watching it has helped me make
sense of a lot of the readings,’’ and, ‘‘It was easy to learn so much from this lecture.
I still remember everything that was covered. You can imagine how easy it was to pass
the next exam.’’ In negatively valenced conditions, the lecture video was followed by
two comments: ‘‘You’ll learn nothing about nonverbal communication from this
video. It gives worthless tips. Watching it hasn’t helped me make sense of a lot of
the readings,’’ and, ‘‘It wasn’t easy to learn much from this lecture. I can’t remember
a single thing that was covered. You can imagine how hard it was to pass the next
exam.’’
Across conditions, the two messages were posted by pseudonymous users of like
social identification: Commenters were either both ingroup members or both
outgroup members.
Procedures
Course instructors informed all enrolled students via email and in-class announce-
ments that, in anticipation of the difficulty of upcoming class content, a video had
been made available that students could use to preview course content before reading
or discussing it in class. Moreover, students were informed they would take a brief
questionnaire and practice exam following the video lecture to prepare them for an
upcoming exam. An email including the announcement and link to the study was
sent to participants at least one week before course content was covered in class.
Upon clicking the link, participants were directed to a website that randomly
redirected participants’ browsers to one of four conditions. In each condition, parti-
cipants first watched a video lecture on nonverbal communication. The video was
uniform across conditions, although the commenter and valence of the comments
posted below the video differed based on the experimental treatment condition. After
watching the video, participants were directed to a survey instrument which included
measures and demographic information. After completing the survey measures,
participants completed a 10-item multiple choice quiz developed by course staff to
evaluate their retention of course content presented in the video lecture. After com-
pleting the quiz, all participants were automatically redirected to a single site which
provided quiz answers and collected identifying information to award class credit.
Measures
Several items were included in the survey instrument to assess the variables of interest
in this research. Social identification was assessed using a previously validated 5-item
social identification scale (Wang, 2007; Wang, Walther, & Hancock, 2009). Using
7-point Likert-type items with endpoints of 1 (strongly disagree) and 7 (strongly
agree), participants indicated their agreement with statements including, ‘‘The com-
menters belong to a similar social group as me,’’ and, ‘‘The commenters and I are
part of the same social group.’’ The mean of item responses was used to assess social
identification with comment posters, with higher means indicating greater social
identification. The scale demonstrated strong reliability, Cronbach’s α = .82.
Affect was measured using a 4-item subset of McCroskey’s (1994) Affective
Learning Scale focusing on affect toward content. Items asked respondents to respond
to their perceptions of class content using 7-point semantic differential items with
anchor points including, Bad/Good, Valuable/Worthless, Unfair/Fair, and Positive/
Negative. The mean of item responses was used to assess affect toward content
presented in the video, with higher means indicating greater educational affect. The
scale demonstrated strong reliability, α = .88.
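Both scales were scored as item means, with internal consistency assessed by Cronbach’s α. As an illustration only (the response data below are hypothetical, not the study’s), the computation can be sketched in a few lines:

```python
def cronbach_alpha(items):
    # items: one list of responses per scale item (columns = respondents)
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_var_sum = sum(var(it) for it in items)
    # total score per respondent
    totals = [sum(col) for col in zip(*items)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Hypothetical responses to a 4-item, 7-point scale from five respondents
items = [
    [6, 5, 2, 7, 4],
    [6, 4, 3, 7, 5],
    [5, 5, 2, 6, 4],
    [7, 5, 3, 7, 5],
]
alpha = cronbach_alpha(items)
# scale score = mean of the items, one per respondent
scale_scores = [sum(col) / len(items) for col in zip(*items)]
```

With the study’s actual item responses, the same computation would yield the reported α values of .82 and .88.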
Recalling the importance of standardized testing in education, often operationa-
lized as knowledge retention (Dumont, 1996; Oner, 1995), and following previous
research (Edwards et al., 2009; Frymier & Houser, 1999), cognitive learning outcomes
were measured using a 10-item multiple choice quiz. Specifically, items assessed the
knowledge domain of Bloom’s (1956) taxonomy, asking participants to recall specific
terminology and statistics presented in the video lecture using multiple-choice and
true–false items for which there was only one right answer. The sum of correct
responses was used as an interval-level indicator of knowledge recall that could range
from zero to ten, with higher scores indicating greater cognitive learning. Participants
averaged 7.89 (SD = 1.93) correct responses. A postlecture test allowed between-subjects tests for differences in the dependent measure based on exposure to different
experimental stimuli while controlling for prior knowledge through random assignment to conditions (Keppel & Wickens, 2004), allowing attribution of observed
differences to experimental treatments.
The 10-item cognitive learning measure demonstrated moderate but acceptable
reliability (Kuder–Richardson 20 = .63) according to guidelines from Subkoviak
(1988). However, the KR-20 should be evaluated cautiously, as it is sensitive to
several factors such as the grouping of questions or the difficulty of the test (Kuder
& Richardson, 1937). Consequently, additional standards were utilized to assess the
reliability and validity of the dependent measure. First, the KR-20 value, though
lower than traditional reported reliability measures, is commensurate with previous
research utilizing novel post-test measures of recall such as subject-material tests
(e.g., Chamorro-Premuzic & Furnham, 2003; De Grez, Valcke, & Roozen, 2009;
Ellis, 2000). Second, Saupe (1961) noted that most common classroom tests, such
as the one used in this experiment, have a mean score of around 70 percent of
questions answered correctly. The present results map well onto this heuristic, with
79% of questions answered correctly (SD = 19%), suggesting an externally valid
cognitive learning measure. Finally, the results of the postlecture test were compared
against the final course grades in course sections from which participants were
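For reference, the KR-20 coefficient used above is computed from item difficulties and total-score variance. A minimal sketch follows, with hypothetical 0/1 quiz data rather than the study’s responses; total-score variance is taken with the population divisor, a common convention for KR-20:

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomously scored items."""
    k = len(responses[0])                  # number of quiz items
    n = len(responses)                     # number of examinees
    # proportion answering each item correctly
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    pq = sum(pi * (1 - pi) for pi in p)    # summed item variances
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n  # total-score variance
    return (k / (k - 1)) * (1 - pq / var)

# Hypothetical 10-item quizzes from four examinees
data = [
    [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 1, 1, 0, 0, 0],
]
reliability = kr20(data)
```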
Data Analysis
To test hypothesized relationships, we used LISREL 8.80 to conduct structural
equation modeling (SEM) with maximum likelihood estimates. Before conducting
the analysis, data were examined to assess the validity of statistical assumptions
for SEM. Measurement error variance was addressed by first examining data for
outliers. Six participants were identified as providing outlying data as assessed by
Mahalanobis D², and their responses were excluded from analysis, leaving n = 331.
Next, a chi-square test (328.866, p < .001) revealed multivariate skewness and
kurtosis, and therefore nonnormality. As maximum likelihood estimation assumes a
normal distribution, data were normalized (du Toit, du Toit, Mels, & Cheng, 2007)
to meet statistical assumptions.
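Outlier screening by Mahalanobis distance flags cases far from the multivariate centroid. A simplified two-variable sketch follows; real screening would use all model variables and a χ² cutoff, and the data here are hypothetical:

```python
def mahalanobis_d2(points):
    # points: list of (x, y) observations; returns squared distance per case
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # sample covariance terms (divisor n - 1)
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    det = sxx * syy - sxy ** 2
    out = []
    for x, y in points:
        dx, dy = x - mx, y - my
        # quadratic form with the inverted 2x2 covariance matrix
        d2 = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
        out.append(d2)
    return out

d2 = mahalanobis_d2([(1, 2), (2, 1), (3, 4), (5, 3), (4, 6)])
```

Cases with unusually large D² values relative to the rest of the sample would be the candidates for exclusion.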
Finally, the hypothesized relationships were estimated using a combination of
a latent composite and a hybrid model (cf. Stephenson & Holbert, 2003). As cogni-
tive learning was modeled as an endogenous (i.e., influenced by other variables in
the model) latent variable, its error term was specified by first fixing the path from
the latent construct to its observed variable (i.e., exam score) to 1.0, and then error
variance of the observed index was fixed to [(1 − reliability) × variance], or 1.38, to
‘‘reflect the proportion of variance in the index attributable to measurement error’’
(Stephenson & Holbert, 2003, p. 335). Instructor credibility, educational affect, and
social identification with poster were modeled with the hybrid approach, specifying
relationships between the scale items and their respective latent concepts (Table 1).
Combining these latent composite and hybrid models allowed evaluation of the
hypotheses as well as the proposed model, while accounting for measurement error
and variance of the constructs.
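The fixed error term above follows directly from the quiz reliability (KR-20 = .63) and standard deviation (SD = 1.93) reported in the Measures section; a quick arithmetic check:

```python
reliability = 0.63          # KR-20 of the 10-item quiz
sd = 1.93                   # standard deviation of quiz scores
# latent-composite error variance: (1 - reliability) * observed variance
error_variance = (1 - reliability) * sd ** 2
# 0.37 * 3.7249 = 1.378..., i.e., the 1.38 fixed in the model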
Table 1 Nonstandardized Coefficients of Latent Variable Indicators in Structural Equation Model

Indicator item            Instructor    Educational    Social Identification    Cognitive
                          Credibility   Affect         with Commenter           Learning
Credibility 1                 .49
Credibility 2                 .51
Credibility 3                 .58
Credibility 4                 .21
Credibility 5                 .41
Affect 1                                   1.07
Affect 2                                    .77
Affect 3                                    .78
Affect 4                                   1.07
Social Identification 1                                      .65
Social Identification 2                                     1.06
Social Identification 3                                      .98
Social Identification 4                                      .47
Social Identification 5                                      .97
Social Identification 6                                     1.03
Results
The initial structural model demonstrated good model fit, χ²(112, n = 331) = 222.33,
p < .001, NNFI = .96, CFI = .97, RMSEA = .055 (90% CI: .044–.065). Although the
probability associated with the chi-squared statistic was lower than desired, the statistic
is problematic to interpret with large samples (Stephenson & Holbert, 2003) as were
present in this study. The non-normed fit index (NNFI), comparative fit index (CFI),
and root mean square error of approximation (RMSEA) are more robust statistics
less influenced by sample size, and demonstrated good fit (Byrne, 2007) of the over-
all model. Figure 2 illustrates the model, including standardized effects, and Table 2
provides correlations among key study variables.
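RMSEA, one of the fit indices reported above, can be recovered from the model chi-square, its degrees of freedom, and the sample size; the sketch below uses one common formulation (dividing by n − 1, as LISREL does) together with the reported values:

```python
import math

def rmsea(chi2, df, n):
    # Root mean square error of approximation from a model chi-square
    return math.sqrt(max((chi2 / df - 1) / (n - 1), 0.0))

fit = rmsea(222.33, 112, 331)   # recovers the reported value of about .055
```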
We used SEM to test individual hypotheses in addition to the omnibus model.
Instructor credibility significantly predicted cognitive learning, so that participants
who perceived the teacher in the video as more credible scored higher on the post-test, β = .32, p < .001, supporting H1. Instructor credibility significantly predicted
educational affect, so that participants who perceived the teacher in the video as more
credible also perceived the course content as more valuable and engaging, β = .33,
p < .001, supporting H2. Educational affect did not significantly predict cognitive
learning, so that there was no significant effect of a student's perceived value on
post-test performance, β = .02, ns, and thus H3 was rejected. Social identification
with commenters did not significantly predict perceptions of instructor credibility,
so that there was no significant effect of a student perceiving the commenter was of
a similar social group on the perceived trustworthiness and competency of the
presenter, β = .01, ns, not supporting H4. Social identification with commenters
increased educational affect, so that participants who perceived themselves as closer
to commenters’ ingroup felt the content addressed in the lecture was more valuable,
supporting H5. Social identification with commenters also significantly predicted
cognitive learning, but in the opposite direction as predicted, so that
as participants identified more with a commenter’s social group, they performed
more poorly on the post-test, β = −.25, p < .01, not supporting H6. Comment
positivity did not significantly influence educational affect, so that comments
favorable toward course material did not lead to more favorable attitudes regarding
course material, β = .04, ns, and thus H7 was rejected. Comment positivity
significantly influenced perceived instructor credibility, so that participants exposed
to positive comments assessed the lecturer as more trustworthy and competent,
β = .13, p < .05, supporting H8. Finally, comments favorable toward course material
led to greater social identification with the commenter and comments negative
toward course material led to reduced social identification with the commenter,
β = .15, p < .05, supporting H9.
The model additionally implies several indirect effects not tested in the SEM analysis.
As direct effects of variables may be affected in the presence of a mediating variable
(Baron & Kenny, 1986), Sobel (1982) tests assessed whether the mediator affected
the influence of an antecedent on the dependent variable. Although bootstrapping is
often recommended as a technique for testing indirect effects, Sobel tests allowed testing
for indirect effects based on the β-values and standard errors provided by the SEM, rather
than conducting separate regression equations.

Table 2 Correlations Among Key Study Variables

Variable                                 1      2      3
1. Instructor Credibility                –
2. Educational Affect                   .33*    –
3. Social Identification with Poster    .01    .23*    –
4. Cognitive Learning                   .49*   .10    .37*

*p < .001.

Of the five possible indirect effects,
three mediators were identified with a Sobel test. The effect of social identification
on cognitive learning was mediated by participants’ perceptions of instructor credibility (z = 2.55, p < .05); the effect of comment positivity on educational affect was
mediated by participants’ perceptions of instructor credibility (z = 2.06, p < .05); and
the effect of comment positivity on educational affect was mediated by participants’
perceptions of social identification with the commenter (z = 2.11, p < .05). Educational affect did not mediate the relationship between social identification with
commenter and cognitive learning (z = .22, p = .83), and social identification with the
commenter did not mediate the effect of comment positivity on instructor credibility
(z = .18, ns).
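The Sobel statistic used here divides the indirect effect (the product of the two constituent paths) by its pooled standard error. A sketch follows, with purely hypothetical path coefficients and standard errors, since the study’s standard errors are not reported in this excerpt:

```python
import math

def sobel_z(a, se_a, b, se_b):
    # a: predictor -> mediator path; b: mediator -> outcome path
    # se_a, se_b: their respective standard errors
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

# Hypothetical values for illustration only
z = sobel_z(a=0.33, se_a=0.08, b=0.32, se_b=0.09)
```

An |z| greater than 1.96 would indicate a significant indirect effect at p < .05.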
Discussion
Taken together, these results depict an interesting and somewhat surprising model
for online education, reinforcing some previous findings in educational research while
revealing novel processes and sources of influence in interactive media. Specifically, the
and intrapersonal effects may be divorced from interpersonal and intergroup effects.
As such, it seems that comments to online lectures may serve as interpersonal, rather
than intrapersonal, sources of attitudinal influence.
Moreover, the finding that instructor credibility and social identification mediate the
relationship between comment positivity and educational affect contributes to
understanding the process of engaging learners in online social contexts. Consistent
with previous research, positively valenced comments regarding course content more
positively influenced both instructor credibility (Edwards et al., 2009) and social
identification (Herring et al., 2002), but without any direct effect on educational
affect. Yet in both cases, students may experience cognitive dissonance (Festinger,
1957) when exposed to positive comments about the course or instructor while
possessing negative perceptions of the course, and vice versa. Therefore, the mediating
effects of instructor credibility and social identification on educational affect may be
explained as individuals’ intrapersonal attitudes toward course content are aligned
with others’ statements.
Future Research
The educational process is complex, and we would be naïve to believe this model
comprehensively encapsulates the totality of online learning, regardless of its explana-
tory power. The model is admittedly limited, focused primarily on the psychosocial
effects of various sources of influence frequently occurring online. Yet, this limited
focus provides a baseline on which future research agendas may build and extend
to include additional influences, including those intrinsic, extrinsic, and systemic. For
example, future research may more narrowly study the intrapersonal effects of
individual students, such as technological self-efficacy and previous social media
use (LaRose, Mastro, & Eastin, 2001), on learning outcomes via social media. As an
additional example, this research focused on the communicative effects of messages
and their senders on student receivers; however, additional studies into usability and
accessibility could consider design implications of the content delivery system on
student experience. YouTube, an interface and experience commonly
associated with entertainment yet used here to deliver educational stimuli, may not
be the ideal interactive tool for educators and students. Though this work presents
an initial model of interactive web-based learning, it is intended to be neither
comprehensive nor conclusive, and as such we encourage future research to push
beyond the foundational model to explore additional dimensions and outcomes
of online learning.
As an initial framework, this study contains shortcomings that future research
may seek to overcome to further validate, extend, or even refute its findings. Perhaps
most important for future research to address is its single-exposure design. Utilizing
only one video lecture may not have allowed individuals to fully engage in the
online learning experience as would occur in online classes or in on-ground classes
heavily supplemented with social media. Consequently, future research with
longitudinal educational exposures may afford greater environmental validity.
Additionally, future research may seek to validate this model with other outcomes
equally relevant in modern education practices.
Conclusion
In this research, we proposed and tested an initial model of online education via
social media. As more educators and institutions seek to design and host online
educational programs (Koehler & Mishra, 2005; Koehler, Mishra, Hershey, & Peruski,
2004), this model affords researchers an initial framework to understand how the
learning process may occur when education exists in an environment beyond the
sterile and controllable walls of brick-and-mortar classrooms. This model offers
scholars in the fields of communication and education an initial conceptualization
of critical factors influencing student learning via social media, specifically the
influences of instructors and online others. As educators increasingly turn to these
emergent web tools to enhance, supplement, and deliver educational content, this
research provides an initial model from which future work may draw.
References
Allen, I. E., & Seaman, J. (2011). Going the distance: Online education in the United States, 2011.
Newburyport, MA: Sloan Consortium.
Allen, M., Witt, P. L., & Wheeless, L. R. (2006). The role of teacher immediacy as a motivational
factor in student learning: Using meta-analysis to test a causal model. Communication
Education, 55, 21–31. doi: 10.1080/03634520500343368
Anderson, C. (2008). The long tail: Why the future of business is selling less of more. New York, NY:
Hyperion Books.
Applebee, A. N., Langer, J. A., Nystrand, M., & Gamoran, A. (2003). Discussion-based approaches
to developing understanding: Classroom instruction and student performance in middle and
high school English. American Educational Research Journal, 40, 685–730. doi: 10.3102/
00028312040003685
Asch, S. E. (1946). Forming impressions of personality. The Journal of Abnormal and Social
Psychology, 41, 258–290. doi: 10.1037/h0055756
Asterhan, C. S. C., & Eisenmann, T. (2011). Introducing synchronous e-discussion tools in
co-located classrooms: A study on the experiences of "active" and "silent" secondary school
students. Computers in Human Behavior, 27, 2169–2177. doi: 10.1016/j.chb.2011.06.011
Bach, S., Haynes, P., & Lewis-Smith, J. (2007). Online learning and teaching in higher education.
Buckingham, U.K.: Open University Press.
Baldwin, T. T., Bedell, M. D., & Johnson, J. L. (1997). The social fabric of a team-based MBA
program: Network effects on student satisfaction and performance. Academy of Management
Journal, 40, 1369–1397. doi: 10.2307/257037
Baron, R. M., & Kenny, D. A. (1986). The moderator–mediator variable distinction in social
psychological research: Conceptual, strategic, and statistical considerations. Journal of
Personality and Social Psychology, 51, 1173–1182. doi: 10.1037/0022-3514.51.6.1173
Bechar-Israeli, H. (1995). From <Bonehead> to <cLoNehEAd>: Nicknames, play, and identity
on Internet Relay Chat. Journal of Computer-Mediated Communication, 1(2). doi: 10.1111/j.
1083-6101.1995.tb00325.x
Berlo, D. K. (1960). The process of communication. New York, NY: Rinehart & Winston.
Bloom, B. S. (1956). A taxonomy of educational objectives: Handbook 1: The cognitive domain.
New York, NY: Longmans Green.
Burke, M. (2009). The semantic web and the digital library. Aslib Proceedings: New Information
Perspectives, 61, 316–322. doi: 10.1108/00012530910959844
Burke, S. C., Snyder, S., & Rager, R. C. (2009). An assessment of faculty usage of YouTube as a
teaching resource. The Internet Journal of Allied Health Sciences and Practice, 7(1), 1–8.
Byrne, B. (2007). Structural equation modeling with AMOS: Basic concepts, applications, and
programming. New York, NY: Taylor & Francis.
Carr, C. T. (2010, June). The diametrics and modality of SIDE: A review and extension. Paper
presented at the annual meeting of the International Communication Association, Singapore.
Carr, C. T., Vitak, J., & McLaughlin, C. (in press). Strength of social cues in online impression
formation: Expanding SIDE research. Communication Research. doi: 10.1177/0093650211430687
Chamorro-Premuzic, T., & Furnham, A. (2003). Personality predicts academic performance:
Evidence from two longitudinal university samples. Journal of Research in Personality, 37,
319–338. doi: 10.1016/S0092-6566(02)00578-0
Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and
Development, 42, 21–29. doi: 10.1007/BF02299088
De Grez, L., Valcke, M., & Roozen, I. (2009). The impact of an innovative instructional intervention
on the acquisition of oral presentation skills in higher education. Computers & Education, 53,
112–120. doi: 10.1016/j.compedu.2009.01.005
De Simone, C., Oka, E. R., & Tischer, S. (1999). Making connections efficiently: A comparison of
two approaches used by college students to construct networks. Contemporary Educational
Psychology, 24, 55–69. doi: 10.1006/ceps.1998.0983
Douglas, K. M., & McGarty, C. (2001). Identifiability and self-presentation: Computer-mediated
communication and intergroup interaction. British Journal of Social Psychology, 40, 399–416.
doi: 10.1348/014466601164894
du Toit, S., du Toit, M., Mels, G., & Cheng, Y. (2007). LISREL for Windows: PRELIS user’s guide.
Lincolnwood, IL: Scientific Software International.
Dumont, R. (1996). Teaching and learning in cyberspace. IEEE Transactions on Professional
Communication, 39, 192–204. doi: 10.1109/47.544575
Dutton, W. H., Cheong, P. H., & Park, N. (2004). The social shaping of a virtual learning
environment: The case of a university-wide course management system. Electronic Journal
of e-Learning, 2(1), 69–80.
Edwards, A., Edwards, C., Shaver, C., & Oaks, M. (2009). Computer-mediated word-of-mouth
communication on RateMyProfessors.com: Expectancy effects on student cognitive and
behavioral learning. Journal of Computer-Mediated Communication, 14, 368–392. doi: 10.
1111/j.1083-6101.2009.01445.x
Edwards, C., Edwards, A., Qing, Q., & Wahl, S. T. (2007). The influence of computer-mediated
word-of-mouth communication on student perceptions of instructors and attitudes toward
learning course content. Communication Education, 56, 255–277. doi: 10.1080/0363452070
1236866
Ellis, K. (2000). Perceived teacher confirmation: The development and validation of an instrument
and two studies of the relationship to cognitive and affective learning. Human Communication
Research, 26, 264–291. doi: 10.1111/j.1468-2958.2000.tb00758.x
Festinger, L. (1957). A theory of cognitive dissonance. Stanford, CA: Stanford University Press.
Finn, A. N., Schrodt, P., Witt, P. L., Elledge, N., Jernberg, K. A., & Larson, L. M. (2009). A meta-
analytical review of teacher credibility and its association with teacher behaviors and student
outcomes. Communication Education, 58, 516–537. doi: 10.1080/03634520903131154
Frymier, A. B., & Houser, M. L. (1999). The revised learning indicators scale. Communication
Studies, 50, 1–12. doi: 10.1080/10510979909388466
Gagné, R. M. (1972). Domains of learning. Interchange, 3(1), 1–8. doi: 10.1007/BF02145939
Gonzales, A. L., & Hancock, J. T. (2009). Identity shift in computer-mediated environments. Media
Psychology, 11, 167–185. doi: 10.1080/15213260802023433
Harasim, L. (2000). Shift happens: Online education as a new paradigm in learning. The Internet
and Higher Education, 3, 41–61.
LaRose, R., Mastro, D., & Eastin, M. S. (2001). Understanding Internet usage: A social-cognitive
approach to uses and gratifications. Social Science Computer Review, 19, 395–413. doi: 10.
1177/089443930101900401
Lea, M., Spears, R., & de Groot, D. (2001). Knowing me, knowing you: Effects of visual anonymity
on self-categorization, stereotyping and attraction in computer-mediated groups. Personality
and Social Psychology Bulletin, 27, 526–537. doi: 10.1177/0146167201275002
Leckart, S. (2012, March 20). The Stanford education experiment could change higher learning
forever. Wired, 20.
Lee, E.-J. (2006). When and how does depersonalization increase conformity to group norms in
computer-mediated communication? Communication Research, 33, 423–447. doi: 10.1177/
0093650206293248
Lee, M., & Youn, S. (2009). Electronic word of mouth (eWOM): How eWOM platforms influence
consumer product judgement. International Journal of Advertising, 28, 473–499. doi: 10.2501/
S0265048709200709
Martin, M. M., Chesebro, J. L., & Mottet, T. P. (1997). Students' perceptions of instructors' socio-
communicative style and the influence on instructor credibility and situational motivation.
Communication Research Reports, 14, 431–440. doi: 10.1080/08824099709388686
Mazer, J. P., Murphy, R. E., & Simonds, C. J. (2007). I'll see you on "Facebook": The effects of
computer-mediated teacher self-disclosure on student motivation, affective learning, and
classroom climate. Communication Education, 56, 1–17.
Weimann, & S. Pingree (Eds.), Advancing communication science: Merging mass and inter-
personal processes (pp. 110–134). Newbury Park, CA: Sage.
Reicher, S. D., Spears, R., & Postmes, T. (1995). A social identity model of deindividuation
phenomena. European Review of Social Psychology, 6, 161–198. doi: 10.1080/14792779443
000049
Resnick, P., Kuwabara, K., Zeckhauser, R., & Friedman, E. (2000). Reputation systems.
Communications of the ACM, 43(12), 45–48. doi: 10.1145/355112.355122
Rodriguez, J. I., Plax, T. G., & Kearney, P. (1996). Clarifying the relationship between teacher
nonverbal immediacy and student cognitive learning: Affective learning as the central causal
mediator. Communication Education, 45, 293–305. doi: 10.1080/03634529609379059
Rovai, A. P., Wighting, M. J., Baker, J. D., & Grooms, L. D. (2009). Development of an instrument
to measure perceived cognitive, affective, and psychomotor learning in traditional and
virtual classroom higher education settings. The Internet and Higher Education, 12(1), 7–13.
doi: 10.1016/j.iheduc.2008.10.002
Russo, T. C., & Koesten, J. (2005). Prestige, centrality, and learning: A social network analysis of
an online class. Communication Education, 54, 254–261. doi: 10.1080/03634520500356394
Saupe, J. L. (1961). Some useful estimates of the Kuder-Richardson formula number 20 reliability
coefficient. Educational and Psychological Measurement, 21, 63–71. doi: 10.1177/00131644
6102100105
Schrodt, P., Witt, P. L., Turman, P. D., Myers, S. A., Barton, M. H., & Jernberg, K. A. (2009). Instructor
credibility as a mediator of instructors' prosocial communication behaviors and students'
learning outcomes. Communication Education, 58, 350–371. doi: 10.1080/03634520902926851
Sherblom, J. C., Withers, L. A., & Leonard, L. G. (2009). Communication challenges and
opportunities for educators using Second Life. In C. Wankel & J. Kingsley (Eds.), Higher
education in virtual worlds: Teaching and learning in Second Life (pp. 29–46). Bingley, U.K.:
Emerald Group.
Shuell, T. J. (1986). Cognitive conceptions of learning. Review of Educational Research, 56, 411–436.
doi: 10.3102/00346543056004411
Sobel, M. E. (1982). Asymptotic confidence intervals for indirect effects in structural equation
models. Sociological Methodology, 13, 290–312. doi: 10.2307/270723
Stephenson, M. T., & Holbert, R. L. (2003). A Monte Carlo simulation of observable versus latent
variable structural equation modeling techniques. Communication Research, 30, 332–354.
doi: 10.1177/0093650203030003004
Walther, J. B., Liang, Y. J., DeAndrea, D. C., Tong, S. T., Carr, C. T., Spottswood, E. L., & Amichai-
Hamburger, Y. (2011). The effect of feedback on identity shift in computer-mediated
communication. Media Psychology, 14, 1–26. doi: 10.1080/15213269.2010.547832
Walther, J. B., Van Der Heide, B., Westerman, D., & Tong, S. T. (2008). The role of friends' behavior
on evaluations of individuals' Facebook profiles: Are we known by the company we keep?
Human Communication Research, 34, 28–49. doi: 10.1111/j.1468-2958.2007.00312.x
Wang, Z. (2007, November). The assessment of interpersonal attraction and group identification
in virtual groups. Paper presented at the annual meeting of the National Communication
Association, Chicago.
Wang, Z., Walther, J. B., & Hancock, J. T. (2009). Social identification and interpersonal
communication in computer-mediated communication: What you do versus who you are in
virtual groups. Human Communication Research, 35, 59–85. doi: 10.1111/j.1468-2958.2008.
01338.x
Yang, D., Richardson, J. C., French, B. F., & Lehman, J. D. (2011). The development of a content
analysis model for assessing students' cognitive learning in asynchronous online discussions.
Educational Technology Research and Development, 59, 43–70. doi: 10.1007/s11423-
010-9166-1
Young, J. R. (2008, January 9). Thanks to YouTube, professors are finding new audiences.
The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/Thanks-to-
YouTube-Professo/381/
Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, S. (2005). What makes the difference? A practical analysis
of research on the effectiveness of distance education. The Teachers College Record, 107,
1836–1884. doi: 10.1111/j.1467-9620.2005.00544.x