“Out of my control”: science undergraduates report mental health concerns and inconsistent conditions when using remote proctoring software
International Journal for Educational Integrity volume 19, Article number: 22 (2023)
Abstract
Efforts to discourage academic misconduct in online learning environments frequently include the use of remote proctoring services. While these services are relatively commonplace in undergraduate science courses, open questions remain about students’ remote assessment environments and their concerns related to remote proctoring services. Using a survey distributed to 11 undergraduate science courses engaging in remote instruction at three American, public, research-focused institutions during the spring of 2021, we found that the majority of undergraduate students reported testing in suboptimal environments. Students’ concerns about remote proctoring services were closely tied to technological difficulties, fear of being wrongfully accused of cheating, and negative impacts on mental health. Our results suggest that remote proctoring services can create and perpetuate inequitable assessment environments for students, and additional research is needed to determine whether these services are effective at their intended purpose of preventing cheating. We also advocate for continued conversations about the broader social and institutional conditions that can pressure students into cheating. While changes to academic culture are difficult, these conversations are necessary for higher education to remain relevant in an increasingly technological world.
Introduction
The number of online course offerings has steadily increased over the past two decades, and the mass transition to remote instruction during the COVID-19 pandemic reinvigorated discussions about academic integrity in virtual spaces (Arbaugh 2014; Castro 2019; Eaton 2020; Gamage et al. 2020; Kentnor 2015; Picciano 2006). Early investigations into the impacts of remote proctoring services exposed inequities and concerns arising from these services, including racial inequity, discrimination against those with disabilities, harms to mental health, and both physical and virtual privacy concerns (Barrett 2021; Feathers 2021; Gin et al. 2021; Patil & Bromwich 2020; Woldeab & Brothen 2019). Students have previously indicated that they are wary of remote proctoring services, yet these services were used prior to the pandemic and will likely continue to be used as remote instruction gains popularity (Alessio & Messinger 2021; Butler-Henderson & Crawford 2020; Langenfeld 2020; Milone et al. 2017; Nigam et al. 2021; Weiner & Hertz 2017).
The increased use of remote proctoring software in higher education courses presents an opportunity to explore these services, especially as they relate to student course experiences, student mental health, and the characteristics of students’ remote assessment environments. Meaders et al. (2020) found that asking students about their concerns revealed useful knowledge and possible ways to improve student experiences. We anticipate that identifying concerns related to remote proctoring services can reveal trends that instructors and institutions can address to create more inclusive and comfortable course experiences.
We sought to identify characteristics of students’ assessment environments and to determine student concerns specifically related to remote proctoring services. We explored three research questions: 1) In what physical environments are students taking their online, remote exams? 2) What concerns do students have related to remote proctoring of course assessments? 3) Do instructors and students assume different rates of cheating during in-person and remote exams? Using these research questions as a guide, we reviewed the literature related to online learning, remote proctoring services, and student mental health, emphasizing the interplay of these factors on undergraduate students’ course experiences; this review resulted in a conceptual framework for the study. Prior work that addresses students’ “mental health” does not regularly provide clear definitions for the term. For the purposes of our study, we considered “mental health” as defined by the American Psychological Association: “a state of mind characterized by emotional well-being, good behavioral adjustment, relative freedom from anxiety and disabling symptoms, and a capacity to establish constructive relationships and cope with the ordinary demands and stresses of life” (American Psychological Association, n.d.). We addressed these research questions through a multi-institutional survey of both students and instructors. All institutions involved in this study were in the United States.
Literature review & conceptual framework
While online learning has become commonplace in higher education courses, this form of instruction is distinct from the emergency remote teaching experienced during the pandemic. Emergency remote teaching is characterized by an expedited response to emergency scenarios, little training in instructional strategies and available resources, and lack of student intent to enroll in remote courses (Eaton 2020; Ferri et al. 2020; Hodges et al. 2020). Conversely, online learning is an intentional instruction method in which faculty members are sometimes trained in effective pedagogical techniques and supported by their institution, and students anticipate the course modality prior to enrollment (Eaton 2020; Hodges et al. 2020; Prince et al. 2020). Neither online learning nor emergency remote teaching is confined to the current era of remote learning, as these strategies have long been used to reach more potential students and to respond to emergency situations (Kentnor 2015; Picciano 2006). The pandemic necessitated emergency remote teaching, but our sample does not fall within this category of instruction. The courses involved in our data collection were delivered remotely in Spring 2021 but without an emergency transition, so we refer to this teaching method as remote instruction with students engaging in online learning. The pandemic provided ample opportunity to explore our research questions, but the scope of our data and analyses could apply to online learning and remote instruction beyond the context of COVID-19.
The switch to emergency remote instruction during the pandemic revived efforts and discussions about how best to maintain academic integrity in online undergraduate courses, no matter the context (Eaton 2020; Gamage et al. 2020). Academic integrity is valued in higher education because it promotes self-efficacy, reinforces good habits for future behavior, and fosters a fair environment for all students (Gilmore et al. 2015; Macfarlane et al. 2014). Several studies suggest that academic dishonesty may be more likely to occur in online learning environments but, conversely, that these behaviors may be easier to recognize and less likely to contribute to academic success and progress (Arnold 2016; Eaton 2020; Stuber-McEwen et al. 2009; Watson & Sottile 2010). The literature also indicates that some potential benefits of remote courses can be negated by suboptimal remote assessment environments in which students are subjected to frequent distractions (Beatty et al. 2022; Fask et al. 2014; Hollister & Berenson 2009).
To date, research has focused on in-person assessment environments and their impacts on student performance, but less is known about the role of remote assessment environments. One important element of remote proctoring is the need for a quiet, distraction-free environment. This is not always feasible, and those who are unable to find quiet environments may be susceptible to distractions (Driessen et al. 2020). In general, the literature indicates that undergraduate students struggle with distractions and multitasking in both traditional and remote environments, and students’ ability to adapt to virtual learning varies by both personal traits and the quality of their individual environments (Gonzáles-Gutierrez et al. 2022; May & Elder 2018; Wu 2015; Wu & Cheng 2019). Students also tolerate self-produced distractions (e.g., texting on a cell phone) more than external distractions (e.g., noises produced by others), and specifically, external distractions impact learning and subsequent academic performance on assessments (Blasiman et al. 2018; Drozdenko et al. 2012). Furthermore, greater variation in course-wide assessment grades has been identified in online learning environments, but researchers attribute these results to uncontrolled aspects of remote assessment environments instead of increases in cheating behavior (Hollister & Berenson 2009).
Prior work has shown that undergraduate science students value certain learning environments but incorrectly predict what their courses will be like (Hassel & Ridout 2018; Kuh et al. 2006; Lowe & Cook 2003). Students often predict more contact with instructors, smaller class sizes, and lighter workloads than actually experienced in their courses (Akiha, 2018; Hassel & Ridout 2018; Lowe & Cook 2003, Meaders et al. 2019). The misalignment between expectations of the course and the actual learning environment can impact course grades and attrition rates (Eagan et al. 2014; Geisinger & Raman 2013; Lisberg & Woods 2018; Watkins & Mazur 2013). The emergency remote teaching environment spurred by the COVID-19 pandemic likely created further dissonance between students’ expectations of their courses and the reality of virtual instruction. Understanding the role of different factors, such as remote proctoring and students’ assessment environments, on these perceptions is important to work towards positive course experiences and meeting learning outcomes.
Without an obvious consensus on the best strategies to maintain academic integrity in remote courses, many instructors and institutions have relied on remote proctoring services, such as Proctorio, Respondus LockDown Browser, and HonorLock, to discourage cheating in undergraduate science courses (Langenfeld 2020; Nigam et al. 2021). Remote proctoring services monitor examinees’ computers and physical spaces, including recording the computer screen, the physical environment, audio, and eye movement, to detect cheating behavior. However, the effectiveness and reliability of these services have been questioned, and it is often unclear how effective these services are at discouraging and detecting cheating (Barrett 2021; Bergmans et al. 2021; Nigam et al. 2021). These services may discriminate against students of color and students with disabilities, whether by necessitating limited movement, failing to recognize darker skin, or perpetuating high-stakes assessment environments (Feathers 2021; Gin et al. 2021; Kolski & Weible 2018; Patil & Bromwich 2020; Woldeab & Brothen 2019). The students most likely to be discriminated against by remote proctoring services are those who already face additional stresses in higher education, such as People of Color, students with mental illness, and those with a physical, developmental, or learning disability (Lisnyj et al. 2021). Combined with their unknown reliability, the use of remote proctoring services raises ethical questions about their impact on students and about the role of variable factors, such as environments, in performance and learning outcomes.
Generally, undergraduate students struggle with distractions in both traditional and online environments, and the ability to adapt to a remote space depends on individual circumstances, like students’ traits and environments, and on structural components of the course, like pedagogy and available resources (Blasiman et al. 2018; Drozdenko et al. 2012; May & Elder 2018; Wu 2015; Wu & Cheng 2019). While there has been some exploration into the role of remote assessment environments on performance (Beatty et al. 2022; Fask et al. 2014), only a few studies have alluded to the impacts of remote proctoring on mental health (Chaudhry et al. 2022; Eaton & Turner 2020; Kharbat & Abu Daabes 2021), and data from an American context are even more limited. Eaton and Turner’s (2020) rapid review of the remote proctoring literature explicitly calls for further investigation into the relationship between remote proctoring and student mental health.
Based on the interactions between student experiences, remote proctoring services, online learning, and remote assessment environments, we developed a conceptual framework that illustrates the relationships among these topics that are hypothesized by the literature (Fig. 1). Our conceptual framework suggests that the widespread transition to emergency remote teaching has expanded discussions about academic integrity in online spaces and, combined with virtual environments, drives the use of remote proctoring services. We used this conceptual framework to guide our research questions, incorporate background knowledge, and contextualize our findings in the broader literature (Luft et al. 2022). Because the impact of remote proctoring services on student course experiences and perceptions is unknown, we sought to explore students’ views about these services. We characterized the specific features of students’ remote assessment environments and students’ concerns about remote proctoring services. To address our three research questions, we developed and distributed three surveys to 1) recruit instructors and gain insight into how they use remote proctoring services in their courses and then 2) gather insights from the students in these courses about their experiences. At the conclusion of our manuscript, we incorporate our findings into our conceptual framework to showcase hypotheses that require further investigation.
Methods
Survey development
We used three surveys as part of this study. Our primary source of data was a survey given to students in biology courses that engaged in some form of remote instruction during the period of data collection. We also surveyed the course instructors before and after surveying their students to gain contextual information about the courses. The first instructor survey was given to course instructors to identify courses to participate in the study. Before the semester started, faculty members were asked about their remote proctoring practices and their perceptions of the percent of students who were cheating in their courses. The second instructor survey was distributed after the semester ended, along with a report containing aggregated, anonymized data from their own courses for reflection purposes.
The student survey questions were developed by the authors, several of whom are faculty members and some of whom are part of a Research Coordination Network supported by the National Science Foundation, called EDU-STEM. EDU-STEM is composed of faculty members, postdoctoral scholars, and graduate students from a variety of institutions across the United States who are dedicated to researching inclusive instruction (Thompson et al. 2020). We initially developed questions to reflect the concerns and experiences that the authors heard from undergraduate students enrolled in their courses. We also reviewed literature related to cheating and to the use of proctoring services to inform question wording, add additional questions, and remove any questions that were vague or uninformative. After initial development, several undergraduate students who were members of the authors’ research groups were asked to review the questions and provide feedback. This feedback was incorporated into the final version of the student survey. Student surveys were administered after students completed at least one remotely proctored assessment in the course, and surveys remained open for 5–10 days depending on the course. All surveys are provided in Additional file 1: Appendix A and were distributed via Qualtrics (Qualtrics, 2021). This work was determined exempt from full review by the University of Minnesota Institutional Review Board (STUDY00012384), the University of South Alabama Institutional Review Board (#1544421-5), and the Auburn University Institutional Review Board (AU 18–349 EP 1811).
Participant context and recruitment
Data for this study were collected in the Spring semester of 2021. Instructors and students were recruited from EDU-STEM-member institutions and from the authors’ own institutions and departments via invitation email (Thompson et al. 2020). This resulted in 11 participating undergraduate courses from three public, research-focused institutions across the country, including two very high research activity universities and one high research activity university (Carnegie Foundation for the Advancement of Teaching 2021). These STEM courses included introductory and upper-level courses in biology, and instructors reported using Proctorio, Respondus LockDown Browser, or both as the remote proctoring service for their courses. Students were recruited from these courses through email, and some, but not all, instructors incentivized student participation with a small amount of course extra credit (< 1% of total course points). Students who did not want to participate in the study but did want to receive course credit were able to select “No, I do not consent to participate in this study” and still receive the extra credit. The ten course instructors themselves did not receive any incentive for participation aside from receiving aggregated, anonymized data from their own course. Student participants were asked to complete survey questions relating to remote assessment environments, experiences with remote proctoring services, concerns about remote proctoring services, and demographic information. The survey instructions did not provide definitions for any terms used within (e.g., “concerns”) as we wanted to remain open to students’ interpretations within the long answer responses. We provide student respondents’ demographic information in Additional file 1: Appendix B.
Data collection and analysis
After collecting survey responses, we cleaned the data. Specifically, unnecessary information, such as completion time and location information, was removed from the dataset, and identifying information was replaced with a unique number identifier for each participant. When there were duplicate responses, the more complete response was kept, and duplicates across courses were removed. When two responses from the same participant were similarly complete, we randomly selected one response to retain. Additionally, responses were removed if students were under 18 years old, did not consent to participating in the study, or only completed the identifying questions for course credit; this left a total of 375 responses (full data on the number of participants and the number of potential participants are in Additional file Table 1E). The number of responses to each individual question varies throughout, as participants could skip any portion of the survey, and these data are reported as (n = total number of respondents to the question, percent of respondents who gave the referenced response). One question asked participants to report the extent to which they were concerned about elements of remote proctoring services on a three-point Likert scale; to capture more complete data, only respondents who addressed all ten concerns were included in our analysis of this question, totaling 264 responses (Additional file 1: Appendix C).
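Purely as an illustration of the cleaning rules above, the following minimal Python sketch applies the same exclusion and de-duplication criteria to a hypothetical response table. The column names (participant_id, email, age, consent, and so on) are our own placeholders, not the study’s actual variables, and this is not the authors’ actual pipeline.

```python
# Minimal sketch of the cleaning rules described above.
# Column names are hypothetical placeholders, not the study's variables.
import pandas as pd

def clean_responses(raw: pd.DataFrame, seed: int = 0) -> pd.DataFrame:
    df = raw.copy()

    # Drop unnecessary metadata such as completion time and location.
    df = df.drop(columns=["duration_sec", "latitude", "longitude"], errors="ignore")

    # Replace identifying information with a unique numeric identifier.
    df["participant_id"] = pd.factorize(df["email"])[0]
    df = df.drop(columns=["email"])

    # Exclude respondents under 18 or without consent.
    df = df[(df["age"] >= 18) & (df["consent"] == "yes")]

    # Keep the most complete response per participant; the initial shuffle
    # makes the tie-break random when two responses are similarly complete.
    df["n_answered"] = df.notna().sum(axis=1)
    df = (
        df.sample(frac=1, random_state=seed)
          .sort_values("n_answered", ascending=False, kind="stable")
          .drop_duplicates(subset="participant_id", keep="first")
          .drop(columns=["n_answered"])
    )
    return df
```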
For qualitative analyses of open-ended questions, every response was kept regardless of the completion of the rest of the survey due to limited participation. Two coders (AKL and AKP) reviewed all responses using open coding techniques (Gibbs 2018; Hemmler et al. 2020; Saldaña 2021; Stemler 2004) for recurrent ideas, then created a codebook to capture process codes for each item. The coders then independently coded each response and adjusted the codebook as needed in an iterative process. Finally, a consensus was reached for each response. After consensus, codes with fewer than three responses were removed from the codebook and added to the “other” category to retain a moderate number of the most useful codes. This resulted in removing one code from the assessment environment codebook and three codes from the top concern codebook. The final assessment environment codebook contained 19 codes, while the final top concern codebook contained 25 codes. All codes, definitions, and examples can be found in Additional file 1: Appendix D. Quotes in these tables and in the manuscript have been lightly edited for grammar and clarity and have been selected to represent the range of responses while protecting participant identities.
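One codebook rule above, folding codes applied to fewer than three responses into an “other” category, is mechanical enough to express in code. The sketch below is purely illustrative: it assumes each response has already been assigned a list of codes, and none of the names come from the authors’ materials.

```python
# Illustrative sketch: fold codes applied to fewer than `min_count` responses
# into an "other" category (hypothetical data, not the authors' codebook).
from collections import Counter

def prune_codes(coded_responses, min_count=3):
    counts = Counter(code for codes in coded_responses for code in codes)
    rare = {code for code, n in counts.items() if n < min_count}
    return [
        sorted({"other" if code in rare else code for code in codes})
        for codes in coded_responses
    ]

coded = [["noise", "pets"], ["noise"], ["internet"],
         ["noise", "internet"], ["internet"]]
print(prune_codes(coded))
# "pets" appears only once, so it collapses into "other":
# [['noise', 'other'], ['noise'], ['internet'], ['internet', 'noise'], ['internet']]
```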
We employed descriptive statistics to capture trends in student concerns, experiences, and environments. We constructed figures in JMP 15.2 and used the same software to perform two-sample t-tests, assuming unequal variance, to analyze both student and instructor participants’ estimations of cheating (JMP, Version 15.9; SAS Institute Inc., Cary, NC, 2021).
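For readers without access to JMP, an equivalent unequal-variance (Welch’s) two-sample t-test can be reproduced with open-source tools. This sketch uses SciPy with placeholder values, not the study data; equal_var=False selects Welch’s test, which uses the Welch-Satterthwaite approximation for the degrees of freedom.

```python
# Welch's (unequal-variance) two-sample t-test, analogous to the JMP analysis.
# The arrays below are placeholders, not the study's data.
import numpy as np
from scipy import stats

remote = np.array([20.0, 35.0, 10.0, 50.0, 15.0])    # estimated % cheating, remote exams
in_person = np.array([10.0, 5.0, 20.0, 0.0, 15.0])   # estimated % cheating, in-person exams

t_stat, p_value = stats.ttest_ind(remote, in_person, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```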
Results
To better understand undergraduate science students’ experiences with remote proctoring services, we collected information on (1) the characteristics of their remote instruction and assessment environments, (2) the most concerning aspects of using remote proctoring services, and (3) how instructors and students vary in their estimation of students cheating during in-person or remotely proctored exams.
Characteristics of assessment environments
We asked participants to report the number of other individuals in their environments when completing coursework or assessments. Participants were able to select a number ranging from 0 to 5+. Most participants reported living with at least one other individual, with the most common range encompassing one to three other individuals in the same space (Fig. 2).
We also asked participants to elaborate on their remote assessment environments in an open-ended survey question. We prompted respondents to include details such as distractions, the quality of technology used to complete assessments, and any relevant information about the physical space itself. The majority of participants reported testing in suboptimal conditions (n = 305, 53.4%). Few participants reported having sufficient environments (n = 305, 16.7%), and roughly a third did not qualify their assessment environment (n = 305, 29.8%) (Additional file 1: Appendix D). The most common suboptimal conditions included noise and distractions from other individuals in the space (n = 305, 44.3%), and poor internet quality (n = 305, 21.3%) (Additional file 1: Appendix D). For example, one respondent reported, “My internet is good but there are often distractions throughout the entirety of my exam. This includes: my dog, family members talking, trucks and cars passing by, children screaming outside (playing),” while another stated, “Our internet quality is low and having four roommates means that there is always noise.” Additional summary data about the codes and representative quotes about respondents’ remote assessment environments can be found in Additional file 1: Appendix D.
Concerns related to remotely proctored exams
We asked participants to sort ten concerns about remote proctoring services on a three-point Likert scale with the categories “Not concerned,” “Somewhat concerned,” and “Very concerned.” Participants placed each concern in a box labeled by level of concern within the survey, but participants did not have to sort each item. Of those who sorted all ten concerns, students reported being “Very concerned” about being wrongfully flagged for cheating by the proctoring software (n = 264, 68.3%), followed by having a technological difficulty (n = 264, 58.0%) and being wrongfully flagged for cheating by the professor (n = 264, 53.1%) (Fig. 3). The reported concerns for all 375 responses, including those that did not sort all concern elements, can be found in Additional file 1: Appendix C. While respondents were able to write in additional concerns to include on the scale, none chose to do so.
After sorting the concern items, we asked participants to identify their top concern related to remote proctoring services and to describe what aspects made it most concerning. The most common responses were the possibility of being wrongfully accused of cheating (n = 285, 74.1%) and encountering technological difficulties (n = 285, 31.3%) (Additional file 1: Appendix D). Participants also reported dealing with emotional distress when using proctoring software (n = 285, 21.4%), commonly citing increased feelings of stress, anxiety, and general worry when completing remotely proctored exams. Each of these ideas is described below.
Academic integrity
Themes about cheating were prevalent throughout our data, and being wrongfully flagged for cheating was the most commonly coded item when respondents elaborated on their top concerns (Additional file 1: Appendix D). We asked students and instructors to estimate the percent of students cheating on exams, both proctored in-person and remotely, to explore the perception of cheating and the influence of proctoring services on these behaviors. Averaging across responses, student respondents (n = 342) estimated that 21.2% of their peers were cheating during remotely proctored exams, while respondents (n = 323) estimated that 11.6% were cheating during in-person exams. The difference in means was statistically significant according to a two-sample t-test assuming unequal variance, after checking that the means were approximately normally distributed (t(587) = 7.54, p < 0.001) (Fig. 4). Additional information about students’ perceptions of cheating by institution can be found in Additional file 1: Appendix E. Compared to student respondents, instructors estimated lower percentages of students cheating in both proctoring environments. On average, instructors estimated that 12.2% of students were cheating during remotely proctored exams compared to 5.6% during in-person exams, but this difference was not statistically significant, likely due to the small sample size (t(18) = 1.47, p = 0.175) (Additional file 1: Appendix E).
When asked to elaborate on their top concern about remote proctoring services, participants frequently mentioned cheating, such as concerns about being wrongfully accused of cheating or their classmates cheating. The fear of being wrongfully accused of cheating was closely tied to a perceived lack of control over the remote assessment environment and/or professors’ decisions about cheating, as exemplified by this response, “Since I take most of my exams in my dorm, I'm constantly worried that the proctoring software will flag my test for cheating because of someone screaming, or talking in the hallway, or coming into view of the camera. My testing environment is so out of my control, that at any moment I think I'll be kicked out of the exam for supposed cheating.”
Concerns about being wrongfully flagged for cheating ranged from short-term impacts, such as being locked out of the exam, to long-term, wide-reaching consequences. For example, one participant responded, “My biggest fear all year has been being falsely accused of cheating either by the proctoring software or a professor. Cheating is a serious accusation that can cause serious long-term effects and essentially ruin a student's entire future.” In addition to these concerns, a small number of participants (n = 8) indicated that they were concerned or frustrated by classmates potentially cheating, and these concerns were largely focused on the possible impact on the course curve or individual grades. One respondent replied, “I hear [from] friends, peers, and social media that most people do use their phones and other things to cheat on exams, and it really worries me about not doing as well.” Furthermore, another respondent questioned the efficacy of remote proctoring services, concisely summarizing, “If a student is going to cheat, neither in person nor online proctored tests will stop them.”
While concerns about being wrongfully flagged for cheating were common among respondents of all demographic groups, students who self-identified as having a disability, medical, and/or mental health concern (n = 22) frequently connected their experiences with heightened concerns about remote proctoring services. (Note that respondents provided this information unprompted, so this may be a significant underestimate of the students in our sample who self-identify in this way.) These concerns were linked to behaviors, like body movements or wandering gazes, that may be seen as dishonest during proctored exams. One respondent stated, “As someone with ADHD, I often wander my gaze and just look around at random things in order to think, and I am always worried that I will get flagged for looking around.” Respondents with disabilities reported feeling that the concern about being accused of cheating negatively impacted their ability to perform on exams, with one participant emphasizing, “I have been diagnosed with anxiety, and this type of testing makes it so much harder for me to succeed in my classes. A lot of the time during these timed tests, my mind is not able to focus on the exam itself because [of potential distractions]. All of these kinds of thoughts overtake my ability to complete the exam to the best of my ability.”
Encountering technological difficulties
Our results showed that technological difficulties were a common concern, aligning with previous research on online learning and remote proctoring services (Beaudoin et al. 2009; Kauffman 2015; Milone et al. 2017; Rasheed et al. 2020). As such, we asked participants to report their previous experiences with technology and remotely proctored exams. The most commonly reported difficulties included an internet connection that was too slow (n = 259, 37%) and proctoring software that was unable to detect the student’s face (n = 259, 21%) (Fig. 5). Respondents commonly reported concerns about technological difficulties when using remote proctoring services that were grounded in negative past experiences (Fig. 3). One respondent stated, “I have a pretty old laptop that [the proctoring software] sometimes does not like to deal with. My biggest concern is having some sort of technical issue with my laptop or the [proctoring] software that causes me to have problems with my exam.” Another participant simply shared, “Internet quality is unreliable and out of my control.”
Emotional distress
We identified an emergent theme of reported emotional distress when using remote proctoring services (n = 285, 21.4%; Additional file 1: Appendix D), though the degree of distress and its impact varied among respondents. We defined the theme of emotional distress as students reporting negative impacts on mental health, either long-term or in the moment. Some students identified new concerns, and others mentioned the compounding effects of existing mental health disorders and testing. For example, one participant referenced their existing mental health and the negative impact of proctoring, stating, “I need to fidget and move when doing anything; asking me not to is just making my testing anxiety worse.” Some participants indicated generalized concerns about being remotely proctored, while others more acutely expressed their emotional distress while using proctoring services. One student said, “I am constantly scared while taking proctored exams on [the proctoring software] that I am going to be wrongfully accused and then have my grade in that class, my GPA, or my career affected by something a software program misidentified.” This response also exemplifies the wide-reaching, perceived impacts of being wrongfully flagged for cheating. While the source, intensity, and duration of emotional distress varied, the theme of negative impacts on mental health spurred by proctoring services was present throughout the survey responses.
Emotional distress was closely linked to students who identified as having a disability, medical, and/or mental health concern. Of the 22 students who mentioned their disability, 15 also emphasized the negative effects on mental health of using remote proctoring services during assessments. One respondent stated, “As someone with mental health problems, sometimes being forced to focus on looking like you aren't cheating, when you aren't, is super overwhelming.” As before, emotional distress is closely linked to the fear of being wrongfully flagged for cheating, increasing feelings of stress and worry. Students with disabilities indicated that the remote proctoring services directly impacted their ability to effectively complete assessments in remote environments. One participant stated, “[Thinking through every action and movement] makes my anxiety act up and impairs my ability to reason/think through my test. In fact, I usually finish [proctored] tests as early as possible in order to experience this for as little time as possible.”
Instructor role in remote proctoring perceptions
Finally, students were asked to indicate whether their instructors explained how and why remote proctoring services were used in the course and to elaborate on the impact of these explanations on their experiences. Most respondents indicated that their instructors did explain both how and why a proctoring service was used in the course (Table 1). Of those who indicated that their instructor explained how the software was used, the majority reported that the explanation had no impact on how they felt about using remote proctoring in the course. Even with instructor explanations, students reported feeling uncertain about the software itself, a concern highlighted in the quote, “I don't know how the software works to flag things as cheating.”
Many instructors indicated their intent to better explain the reasoning behind using remote proctoring services in a reflection survey after viewing their own course’s results. Of the six instructors who responded to the reflection survey, four stated that they would reflect on how they communicate their reasoning to students and dedicate more time to explaining this rationale to students. For example, one instructor stated, “I think a major change I will make, given these results, is to think about how I explain the purpose of [a proctoring software] to students, as well as how I will use it to ensure the exam is fairly administered.” Instructors indicated that they would use these strategies to assuage student concerns, especially related to being wrongfully flagged for cheating.
Discussion
We discovered that many undergraduate science students complete coursework and assessments in suboptimal environments and hold legitimate concerns about remote proctoring services. Together, these factors can create negative course experiences and worsen mental health. Suboptimal assessment environments were characterized by noise, distractions, and inconsistent internet quality (Figs. 2 and 5). Respondents identified the possibility of being wrongfully accused of cheating and encountering technological difficulties as the most concerning aspects of using remote proctoring services (Fig. 3). Finally, participants noted negative emotional impacts, such as increased stress and test anxiety, when completing assessments in this format. These results indicate that remote proctoring services may impact student course experiences and mental health, potentially exacerbating inequalities in assessment scores and experiences between students (Fig. 6). Here, we contextualize the main themes that emerged from our findings with existing literature and make recommendations for instructors and future research in this area.
Variable environments perpetuate unequal educational experiences
The majority of respondents reported testing in suboptimal environments characterized by high levels of noise, distractions, and lack of quality internet access (Fig. 2, Additional file 1: Appendix D). Participants also identified these themes as concerns for using remote proctoring services, specifically mentioning uncontrollable elements of their environments such as external distractions and noise (Fig. 3, Additional file 1: Appendix D). These results indicate that students are testing in unique remote environments, each with its own characteristics and challenges, and suggest that the impacts of these testing environments are inextricably linked to proctoring software. The impact of remote proctoring services appears to be mediated by students’ remote assessment environments, and instructors should thoughtfully consider the variability among environments, and how they impact students, when choosing to use remote proctoring services.
Unequal learning and assessment environments may be contributing to disparities in educational experiences, impacting students’ learning gains and course grades (Hollister & Berenson 2009; May & Elder 2018). While environmental distractions may impact students differently during in-person assessments, in-person testing at least places all students in the same environment. For example, student performance on an exam taken in a common lecture hall will not be subject to disparate impacts of internet access or noisy housemates. This benefit only exists, however, if individual students can access accommodations, such as extended time (Gregg 2012).
The quality of remote assessment environments depends on elements outside of students’ control, and currently, there are few effective strategies to ensure equitable conditions for all students. Because our results show that students are experiencing unequal, suboptimal assessment environments, we urge instructors to reconsider the use of high-stakes testing in their remote learning courses. High-stakes testing already disadvantages specific demographic groups, such as women and underserved students, due to increased test anxiety and stereotype threat (Ballen et al. 2017; Cotner & Ballen 2017). Therefore, implementing more formative or mixed-assessment methods may both reduce assessment performance disparities and eliminate the need for remote proctoring services altogether. By relying on other assessment strategies (e.g., group participation, low-stakes quizzes and homework assignments, formative assessments) to evaluate student learning, instructors would not need to rely on high-stakes testing as a measure of student learning outcomes. Decreased reliance on high-stakes testing and remote proctoring services would also remove concerns about remote testing environments. To reduce the burden of variable testing environments, instructors could also guide their students towards resources, such as private study rooms or dedicated remote testing locations, that would mitigate concerns about the environment. Students may not be aware of these resources, and instructors play a key role in informing their students about potential alternatives.
The culture and conversations of cheating
Respondents identified being wrongfully flagged for cheating as the most concerning element of remote proctoring and were particularly concerned about uncontrollable aspects of the environment, such as movements or loud noises (Additional file 1: Appendix D). This result aligns with findings from Chaudhry et al. (2022), who interviewed 14 Canadian students and also reported that fear of being mis-flagged for cheating increased students’ stress during exams. Simultaneously, participants believed that their peers were more likely to be cheating during remotely proctored exams (Fig. 4). It is important to note that this finding is based on students’ perceptions rather than actual differences in the rate of cheating and that students were unable to directly witness their peers’ cheating due to the remote nature of the exams.
These results imply that students do not believe that remote proctoring services dissuade cheating behaviors. When combined with proctoring services’ inconclusive ability to detect cheating (Barrett 2021; Bergmans et al. 2021), remote proctoring services may not be an effective strategy to maintain academic integrity and may even contribute to unequal course experiences. To date, research has failed to show that cheating occurs more often in remote environments, regardless of remote proctoring usage (Fask et al. 2014; Hollister & Berenson 2009; Stuber-McEwen et al. 2009; Watson & Sottile 2010). Fask et al. (2014) and Hollister and Berenson (2009) used statistical modeling to identify cheating in remote environments, and both groups were unable to attribute exam score variation to cheating. Given the inability to confirm increased cheating in remote environments, the negative impacts on students’ mental health and experiences must be weighed against the unlikely scenario of consistently identifiable academic misconduct.
The frequent theme of cheating raises important questions about the social and institutional factors that influence student behaviors. While the literature about academic dishonesty largely focuses on individual traits, research has shown that social factors, like peer achievement and pressure for academic success, influence the desire to engage in cheating behavior (Krou et al. 2021; Wilkinson 2009). Institutional and academic structures are also frequently cited as pressures to cheat, such as the need to maintain certain grades to receive scholarships (Krou et al. 2021; Passow et al. 2006). The abundance of external pressures likely contributes to students’ concerns about cheating (e.g., Genereux & McLeod 1995; Butler et al. 2022), and we advocate for discussion centering on the external culture that influences cheating behavior before implementing remote proctoring services. From an instructor’s perspective, creating a culture where cheating is discouraged through the use of low-stakes test formats, alternative forms of assessment, peer accountability, and lessened pressure for academic success could eliminate the need for third-party management of academic integrity. Additionally, clear communication with students could mitigate these concerns, such as emphasizing that instructors, not software, ultimately make decisions about academic misconduct. Addressing cheating and its related pressures is a difficult discussion and task, but it is necessary to review assessment strategies to successfully educate students in online environments.
Implications for student mental health
In addition to concerns about being wrongfully flagged for cheating, many students referenced negative emotional experiences while using remote proctoring services (Additional file 1: Appendix D). These experiences ranged from acute feelings of stress and anxiety to compounding effects with previous mental health conditions. Students frequently reported increased levels of worry, stress, and test anxiety when using remote proctoring services (Additional file 1: Appendix D). Given the alarming rates of worsened student mental health, especially for women and students of color (Lee et al. 2021; Ketchen Lipson et al. 2015; Scherer & Leshner 2021), remote proctoring services may unnecessarily heighten emotional distress (Chaudhry et al. 2022; Eaton & Turner 2020). Previous research also indicates that test anxiety disproportionately affects students with disabilities, women, and students of color (Ballen et al. 2017; Cotner et al. 2020; Salehi et al. 2019; Salehi et al. 2021; Thames et al. 2015; von der Embse et al. 2018; Woods et al. 2010; Whitaker Sena et al. 2007). General student mental health is also closely tied to test anxiety in higher education. Test anxiety is described as a negative emotional or physiological response to evaluative situations and has been shown to negatively impact academic performance and general intent to pursue a major (Barrows et al. 2013; Cassady & Johnson 2002; England et al. 2017; Kolski & Weible 2018; Sommer & Arendasy 2014). However, test anxiety does not fully encompass the negative mental health impacts participants reported experiencing while using remote proctoring services (Chaudhry et al. 2022). Some student participants perceived this heightened stress as a significant factor in their assessment performance, which may be further evidence that these services are a barrier to equal educational experiences and sufficiently representative course grades for many students.
Other considerations and future directions
There are several considerations to be mindful of when reviewing these data and results. First, a limited number of institutions were surveyed, and all institutions were in the United States; as such, our findings are limited to this context. All three universities are classified as having high to very high research activity (Carnegie Foundation for the Advancement of Teaching 2021). Therefore, our sample could not capture differences in students’ experiences by institution type, such as at community colleges and other teaching-focused institutions. Future research should aim to capture experiences outside of the scope of research-intensive institutions (Thompson et al. 2020). Second, the institutions surveyed in this study also have predominantly white student populations, and our sample did not reveal conclusive differences by demographic groups. However, the absence of these trends does not mean they do not exist, and future research should aim to include diverse student populations to explore the interplay between identity and experience. Third, because we were primarily interested in students’ perceptions and experiences, we defined very few terms for the participants. As a result, participants may have interpreted words such as “concern” differently. Finally, the role of the pandemic should be considered when assessing students’ concerns. The COVID-19 pandemic exacerbated the student mental health crisis and increased student stress levels (Correia et al. 2022; Lee et al. 2021), and student concerns about remote proctoring software and their remote assessment environments may have been amplified by the broader stressful circumstances of the world.
Additionally, our survey did not specifically collect systematic demographic data about disability aside from scenarios where students voluntarily self-identified in their responses. Interactions between disabilities and distance learning could greatly impact students’ experiences and their specific concerns about remote proctoring software. While literature about test anxiety and online proctoring exists (Cassady & Johnson 2002; Barrows et al. 2013; Kolski & Weible 2018; Stowell & Bennett 2010), less is known about the intersection of test anxiety, remote proctoring anxiety, and learning disabilities. Students with learning disabilities, such as attention deficit hyperactivity disorder, already report higher rates of test anxiety in traditional evaluative settings (von der Embse et al. 2018; Woods et al. 2010; Whitaker Sena et al. 2007) and face discrimination during in-person science instruction (Gin et al. 2022; Hutcheon & Wolbring 2012; Lee 2011), so exploring this relationship may reveal inequities in students’ environments and experiences perpetuated by the use of remote proctoring services.
Another area for future investigation centers on the role of instructors in discussing remote proctoring services and remote learning. While most students reported that their instructors explained how and why they used remote proctoring services, students did not report increased positive feelings towards them, with most reporting no impact on their opinion and some reporting feeling worse (Table 1). If instructors feel that remote proctoring is necessary to maintain academic integrity, identifying the best ways to introduce and discuss remote proctoring could mitigate students’ concerns and better equip instructors to address students’ experiences. Instructors may already be explaining their rationale to students, but standardizing and assessing the content of these explanations, and how they are delivered, could modify students’ attitudes towards remote proctoring. An additional hypothesis worth exploring is whether instructors’ explanations can lead students to believe that more cheating is occurring: by discussing cheating and remote proctoring, are instructors inadvertently making students believe that more cheating is occurring?
Finally, understanding student perceptions of grading and cheating in their courses could provide additional context to our results. Of the students who reported being concerned about their classmates cheating, some cited concerns about the impact of others on their own performance, grade, and class distribution. Characterizing students’ perceptions of curving, grade distributions, and the aftermath of breaching academic integrity could provide context to why students equate remote proctoring services with concerns of being wrongfully accused of cheating. While it is difficult to design studies to assess actual cheating, we could explore cheating from an instructor perspective, asking how often and under what conditions proctoring services actually aid instructors in identifying cheating situations.
Conclusions
Our research aimed to characterize the remote assessment environments of undergraduate science students and to understand their concerns and perceptions about remote proctoring services in their courses. We discovered that the majority of students report testing in suboptimal environments, and that many hold concerns about experiencing technological difficulties and being wrongfully flagged for cheating when using remote proctoring services. While we collected data in the spring of 2021 during the COVID-19 pandemic, we believe that these data are relevant to all online learning environments. Before implementing remote proctoring services as a means to maintain academic integrity, we caution that these services may negatively impact course experiences and student mental health and may contribute to unequal educational experiences. While it is difficult to address the social and institutional pressures that contribute to cheating behaviors, it is necessary to reevaluate undergraduate assessment strategies to educate students in an evolving online world.
Availability of data and materials
To protect identities of participants and faculty, data will not be made publicly available. However, limited access may be granted via request to the corresponding author.
Abbreviations
- STEM: Science, Technology, Engineering, and Math
- EDU-STEM: Equity and Diversity in Undergraduate STEM
- COVID-19: Coronavirus disease 2019
References
Akiha K, Brigham E, Couch BA, Lewin J, Stains M, Stetzer MR, Vinson EL, Smith MK (2018) What Types of Instructional Shifts Do Students Experience? Investigating Active Learning in Science, Technology, Engineering, and Math Classes across Key Transition Points from Middle School to the University Level. Front Educ 2:68. https://doi.org/10.3389/FEDUC.2017.00068/BIBTEX
Alessio HM, Messinger JD (2021) Faculty and Student Perceptions of Academic Integrity in Technology-Assisted Learning and Testing. Front Edu 6:629220. https://doi.org/10.3389/feduc.2021.629220
Arbaugh JB (2014) What Might Online Delivery Teach Us About Blended Management Education? Prior Perspectives and Future Directions. J Manag Educ 38(6):784–817. https://doi.org/10.1177/1052562914534244
Arnold IJM (2016) Cheating at online formative tests: Does it pay off? Internet High Educ 29:98–106. https://doi.org/10.1016/j.iheduc.2016.02.001
Ballen CJ, Salehi S, Cotner S (2017) Exams disadvantage women in introductory biology. PLoS One. 12(10):e0186419. https://doi.org/10.1371/journal.pone.0186419
Barrett L (2021) Rejecting Test Surveillance in Higher Education. 2022 Michigan State Law Review 675. https://doi.org/10.2139/ssrn.3871423.
Barrows J, Dunn S, Lloyd CA (2013) Anxiety, Self-Efficacy, and College Exam Grades. Univers J Educ Res 1(3):204–208. https://doi.org/10.13189/ujer.2013.010310
Beatty AE, Esco A, Curtiss ABC, Ballen CJ (2022) Students who prefer face-to-face tests outperform their online peers in organic chemistry. Chem Educ Res Pract 23(2):464–474. https://doi.org/10.1039/D1RP00324K
Beaudoin MF, Kurtz G, Eden S (2009) Experiences and Opinions of E-Learners: What Works, What are the Challenges, and What Competencies Ensure Successful Online Learning. Interdiscip J e-Skills and Lifelong Learning 5:275–289
Bergmans L, Boual N, Luttikhuis M, Rensink A (2021) On the efficacy of online proctoring using Proctorio. Proceedings of the 13th International Conference on Computer Supported Education 1:279–290. https://doi.org/10.5220/0010399602790290
Blasiman RN, Larabee D, Fabry D (2018) Distracted students: A comparison of multiple types of distractions on learning in online lectures. Scholarsh Teach Learn Psychol 4(4):222–230. https://doi.org/10.1037/stl0000122
Butler-Henderson K, Crawford J (2020) A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers & Education 159:104024
Carnegie Foundation for the Advancement of Teaching (2021) The Carnegie Classification of Institutions of Higher Education, Menlo Park, California.
Cassady JC, Johnson RE (2002) Cognitive test anxiety and academic performance. Contemp Educ Psychol 27(2):270–295. https://doi.org/10.1006/ceps.2001.1094
Castro R (2019) Blended learning in higher education: Trends and capabilities. Educ Infor Technol 24(4):2523–2546. https://doi.org/10.1007/s10639-019-09886-3
Chaudhry K, Mann A, Assal H, Chiasson S (2022) “I didn’t even want to turn my head because I was so scared of the prof.”: Student Perceptions of e-Proctoring Software. USENIX Symposium on Usable Privacy and Security (SOUPS). https://www.sce.carleton.ca/faculty/assal/assets/pdfs/Chaudhry2022_posterAbstract_soups.pdf. Accessed 30 June 2023
Correia KM, Bierma SR, Houston SD, Nelson MT, Pannu KS, Tirman CM, Cannon RL, Clance LR, Canterbury DN, Google AN, Morrison BH, Henning JA (2022) Education Racial and Gender Disparities in COVID-19 Worry, Stress, and Food Insecurities across Undergraduate Biology Students at a Southeastern University. J Microbiol Biol Educ 23(1):e00224-21. https://doi.org/10.1128/JMBE.00224-21
Cotner S, Ballen CJ (2017) Can mixed assessment methods make biology classes more equitable? PLoS One. 12(12):e0189610. https://doi.org/10.1371/journal.pone.0189610
Cotner S, Jeno LM, Walker JD, Jørgensen C, Vandvik V (2020) Gender gaps in the performance of Norwegian biology students: the roles of test anxiety and science confidence. Int J STEM Educ 7:1-10. https://doi.org/10.1186/s40594-020-00252-1
Driessen E, Beatty A, Stokes A, Wood S, Ballen C (2020) Learning principles of evolution during a crisis: An exploratory analysis of student barriers one week and one month into the COVID-19 pandemic. Ecol Evol 10(22):12431–12436. https://doi.org/10.1002/ECE3.6741
Drozdenko R, Tesch F, Coelho D (2012) Learning Styles and Classroom Distractions: A Comparison of Undergraduate and Graduate Students. Am Soc of Bus Behav Sci 19(1):268–277
Eagan K, Hurtado S, Figueroa T, Hughes BE (2014) Examining STEM pathways among students who begin college at four-year institutions. National Academy of Sciences. https://scholarworks.montana.edu/xmlui/bitstream/handle/1/15115/Hughes_NAS_white_2014.pdf?sequence=1. Accessed 8 Mar 2023.
Eaton SE (2020) Academic Integrity During COVID-19: Reflections from the University of Calgary. Int Stud Educ Admin 48(1):80–85
Eaton SE, Turner KL (2020) Exploring academic integrity and mental health during COVID-19: Rapid review. J Contemp Educ Theory Res. https://doi.org/10.25656/01:21034
England BJ, Brigati JR, Schussler EE (2017) Student anxiety in introductory biology classrooms: Perceptions about active learning and persistence in the major. PLoS One. 12(8):e0182506. https://doi.org/10.1371/journal.pone.0182506
Fask A, Englander F, Wang Z (2014) Do Online Exams Facilitate Cheating? An Experiment Designed to Separate Possible Cheating from the Effect of the Online Test Taking Environment. J Acad Ethics 12(2):101–112. https://doi.org/10.1007/S10805-014-9207-1/TABLES/3
Feathers T (2021, April 8). Proctorio Is Using Racist Algorithms to Detect Faces. Vice. https://www.vice.com/en/article/g5gxg3/proctorio-is-using-racist-algorithms-to-detect-faces. Accessed 8 Mar 2023
Ferri F, Grifoni P, Guzzo T (2020) Online Learning and Emergency Remote Teaching: Opportunities and Challenges in Emergency Situations. Societies 10(4):86. https://doi.org/10.3390/soc10040086
Gamage KAA, de Silva EK, Gunawardhana N (2020) Online Delivery and Assessment during COVID-19: Safeguarding Academic Integrity. Educ Sci 10(11):301. https://doi.org/10.3390/educsci10110301
Geisinger BN, Raman DR (2013) Why They Leave: Understanding Student Attrition from Engineering Majors. Int J Eng Educ 29(4):914–925
Genereux RL, McLeod BA (1995) Circumstances surrounding cheating: A questionnaire study of college students. Res High Educ 36:687–704. https://doi.org/10.1007/BF02208251
Gibbs GR (2018) Analyzing qualitative data. Sage, pp 53–73
Gilmore J, Maher M, Feldon D (2015) Prevalence, prevention, and pedagogical techniques: academic integrity and ethical professional practice among STEM students. Handbook of Academic Integrity. Springer, Singapore, pp 1–16
Gin LE, Guerrero FA, Brownell SE, Cooper KM (2021) COVID-19 and undergraduates with disabilities: challenges resulting from the rapid transition to online course delivery for students with disabilities in undergraduate STEM at large-enrollment institutions. CBE Life Sci Educ 20(3):ar36. https://doi.org/10.1187/cbe.21-02-0028
Gin LE, Pais D, Cooper KM, Brownell SE (2022) Students with Disabilities in Life Science Undergraduate Research Experiences: Challenges and Opportunities. CBE Life Sci Educ 21(2):ar32. https://doi.org/10.1187/cbe.21-07-0196
Gonzáles-Gutierrez V, Alvarez-Risco A, Estrada-Merino A, Anderson-Seminario MDLM, Mlodzianowska S, Del-Aguila-Arcentales S, Yáñez JA (2022) Multitasking Behavior and Perceptions of Academic Performance in University Business Students in Mexico during the COVID-19 Pandemic. Int J Ment Health Promot 24(4):565–581. https://doi.org/10.32604/ijmhp.2022.021176
Gregg N (2012) Increasing Access to Learning for the Adult Basic Education Learner with Learning Disabilities. J Learn Disabil 45(1):47–63. https://doi.org/10.1177/0022219411426855
Hassel S, Ridout N (2018) An Investigation of First-Year Students’ and Lecturers’ Expectations of University Education. Front Psychol 8:2218. https://doi.org/10.3389/fpsyg.2017.02218
American Psychological Association (n.d.) Mental health. In: APA dictionary of psychology. https://dictionary.apa.org/mental-health. Accessed 30 Jun 2023
Hemmler VL, Kenney AW, Langley SD, Callahan CM, Gubbins EJ, Holder S (2020) Beyond a coefficient: an interactive process for achieving inter-rater consistency in qualitative coding. Qual Res 22(2):194–219. https://doi.org/10.1177/1468794120976072
Henderson M, Chung J, Awdry R, Mundy M, Bryant M, Ashford C, Ryan K (2022) Factors associated with online examination cheating. Assess Eval High Educ 1–15. https://doi.org/10.1080/02602938.2022.2144802
Hodges C, Moore S, Lockee B, Trust T, Bond A (2020) The Difference Between Emergency Remote Teaching and Online Learning. EDUCAUSE Review. Boulder, CO.
Hollister KK, Berenson ML (2009) Proctored Versus Unproctored Online Exams: Studying the Impact of Exam Environment on Student Performance. Decis Sci J Innov Educ 7(1):271–294. https://doi.org/10.1111/j.1540-4609.2008.00220.x
Hutcheon EJ, Wolbring G (2012) Voices of “disabled” post-secondary students: Examining higher education “disability” policy using an ableism lens. J Divers High Edu 5(1):39. https://doi.org/10.1037/A0027002
JMP, Version 15.9 (2021). SAS Institute Inc., Cary, NC, 1989–2021.
Kauffman H (2015) A review of predictive factors of student success in and satisfaction with online learning. Res Learn Technol 23:26507. https://doi.org/10.3402/rlt.v23.26507
Kentnor H (2015) Distance Education and the Evolution of Online Learning in the United States. Curric Teach Dialogue 17(1):21–34
Ketchen Lipson S, Gaddis SM, Heinze J, Beck K, Eisenberg D (2015) Variations in Student Mental Health and Treatment Utilization Across US Colleges and Universities. J Am Coll Health 63(6):388–396. https://doi.org/10.1080/07448481.2015.1040411
Kharbat FF, Abu Daabes AS (2021) E-proctored exams during the COVID-19 pandemic: A close understanding. Educ Inf Technol 26(6):6589–6605
Kolski T, Weible J (2018) Examining the Relationship Between Student Test Anxiety and Webcam Based Exam Proctoring. Online J Distance Learning Adm 21(3):1–15
Krou MR, Fong CJ, Hoff MA (2021) Achievement Motivation and Academic Dishonesty: A Meta-Analytic Investigation. Educ Psychol Rev 33:427–458. https://doi.org/10.1007/s10648-020-09557-7
Kuh GD, Kinzie JL, Buckley JA, Bridges BK, Hayek JC (2006) What matters to student success: A review of the literature, vol 8. National Postsecondary Education Cooperative, Washington, DC
Langenfeld T (2020) Internet-Based Proctored Assessment: Security and Fairness Issues. Educ Meas-Issues Pra 39(3):24–27. https://doi.org/10.1111/emip.12359
Lee A (2011) A Comparison of Postsecondary Science, Technology, Engineering, and Mathematics (STEM) Enrollment for Students With and Without Disabilities. Career Dev except Individ 34(2):72–82. https://doi.org/10.1177/0885728810386591
Lee J, Solomon M, Stead T, Kwon B, Ganti L (2021) Impact of COVID-19 on the mental health of US college students. BMC Psychol 9(1):1–10. https://doi.org/10.1186/s40359-021-00598-3
Lisberg A, Woods B (2018) Mentorship, mindset and learning strategies: an integrative approach to increasing underrepresented minority student retention in a STEM undergraduate program. J STEM Educ 19(3)
Lisnyj KT, Pearl DL, McWhirter JE, Papadopoulos A (2021) Exploration of Factors Affecting Post-Secondary Students’ Stress and Academic Success: Application of the Socio-Ecological Model for Health Promotion. Int J Environ Res Public Health 18(7):3779
Lowe H, Cook A (2003) Mind the Gap: Are students prepared for higher education? J Furth High Educ 27(1):53–76. https://doi.org/10.1080/03098770305629
Luft JA, Jeong S, Idsardi R, Gardner G (2022) Literature reviews, theoretical frameworks, and conceptual frameworks: An introduction for new biology education researchers. CBE Life Sci Educ. 21(3):rm33
Macfarlane B, Zhang J, Pun A (2014) Academic integrity: a review of the literature. Stud High Educ 39(2):339–358. https://doi.org/10.1080/03075079.2012.709495
May KE, Elder AD (2018) Efficient, helpful, or distracting? A literature review of media multitasking in relation to academic performance. Int J Educ Technol High Educ 15(1):13. https://doi.org/10.1186/s41239-018-0096-z
Meaders CL, Toth ES, Lane AK, Shuman JK, Couch BA, Stains M, Stetzer MR, Vinson E, Smith MK (2019) “What will I experience in my college STEM courses?” An investigation of student predictions about instructional practices in introductory courses. CBE Life Sci Educ. 18(4):ar60. https://doi.org/10.1187/cbe.19-05-0084
Meaders CL, Lane AK, Morozov AI, Shuman JK, Toth ES, Stains M, Stetzer MR, Vinson E, Couch BA, Smith MK (2020) Undergraduate Student Concerns in Introductory STEM Courses: What They Are, How They Change, and What Influences Them. J STEM Educ Res 3(2):195–216. https://doi.org/10.1007/s41979-020-00031-1
Milone AS, Cortese AM, Balestrieri RL, Pittenger AL (2017) The impact of proctored online exams on the educational experience. Curr Pharm Teach Learn 9(1):108–114. https://doi.org/10.1016/j.cptl.2016.08.037
Nigam A, Pasricha R, Singh T, Churi P (2021) A Systematic Review on AI-based Proctoring Systems: Past, Present and Future. Educ Inf Technol (Dordr) 26(5):6421–6445. https://doi.org/10.1007/s10639-021-10597-x
Passow HJ, Mayhew MJ, Finelli CJ, Harding TS, Carpenter DD (2006) Factors Influencing Engineering Students’ Decisions To Cheat By Type Of Assessment. Res High Educ 47(6):643–684. https://doi.org/10.1007/s11162-006-9010-y
Patil A, Bromwich JE (2020) How It Feels When Software Watches You Take Tests. The New York Times. https://www.nytimes.com/2020/09/29/style/testing-schools-proctorio.html. Accessed 8 Mar 2023
Picciano AG (2006) Online Learning: Implications for Higher Education Pedagogy and Policy. J Thought 41(1):75–94
Prince M, Felder R, Brent R (2020) Active Student Engagement in Online STEM Classes: Approaches and Recommendations. Adv Eng Educ 8(4):1–25
Qualtrics (2021). Provo, UT. 2005–2022. https://www.qualtrics.com
Rasheed RA, Kamsin A, Abdullah NA (2020) Challenges in the online component of blended learning: A systematic review. Comput Educ. 144:103701. https://doi.org/10.1016/j.compedu.2019.103701
Saldaña J (2021) The Coding Manual for Qualitative Researchers. 4th ed. SAGE
Salehi S, Cotner S, Azarin SM, Carlson EE, Driessen M, Ferry VE, Harcombe W, McGaugh S, Wassenberg D, Yonas A, Ballen CJ (2019) Gender Performance Gaps Across Different Assessment Methods and the Underlying Mechanisms: The Case of Incoming Preparation and Test Anxiety. Front Educ 4:107. https://doi.org/10.3389/feduc.2019.00107
Salehi S, Berk SA, Brunelli R, Cotner S, Creech C, Drake AG, Fagbodun S, Hall C, Hebert S, Hewlett J, James AC, Shuster M, St. Juliana JR, Stovall DB, Whittington R, Zhong M, Ballen CJ (2021) Context Matters: Social Psychological Factors That Underlie Academic Performance across Seven Institutions. CBE Life Sci Educ 20(4):ar68. https://doi.org/10.1187/cbe.21-01-0012
Scherer LA, Leshner AI (2021) Mental health, substance use, and wellbeing in higher education: supporting the whole student. National Academies Press, Washington, DC
Sommer M, Arendasy ME (2014) Comparing different explanations of the effect of test anxiety on respondents’ test scores. Intelligence 42:115–127. https://doi.org/10.1016/j.intell.2013.11.003
Stemler SE (2004) A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability. Pract Assess Res Eval 9(1):4. https://doi.org/10.7275/96jp-xz07
Stowell J, Bennett D (2010) Effects of Online Testing on Student Exam Performance and Test Anxiety. J Educ Comput Res 42(2):161–171. https://doi.org/10.2190/EC.42.2.b
Stuber-McEwen D, Wiseley PA, Hoggatt S (2009) Point, Click, and Cheat: Frequency and Type of Academic Dishonesty in the Virtual Classroom. Online J Distance Learning Adm 12(3):1–10
Thames AD, Panos SE, Arentoft A, Byrd DA, Hinkin CH, Arbid N (2015) Mild test anxiety influences neurocognitive performance among African Americans and European Americans: identifying interfering and facilitating sources. Cultur Divers Ethnic Minor Psychol 21(1):105–113. https://doi.org/10.1037/a0037530
Thompson SK, Hebert S, Berk S, Brunelli R, Creech C, Drake AG, Fagbodun S, Garcia-Ojeda ME, Hall C, Harshman J, Lamb T, Robnett R, Shuster M, Cotner S, Ballen CJ (2020) A call for data-driven networks to address equity in the context of undergraduate biology. CBE Life Sci Educ 19(4):1–12. https://doi.org/10.1187/cbe.20-05-0085
von der Embse N, Jester D, Roy D, Post J (2018) Test anxiety effects, predictors, and correlates: A 30-year meta-analytic review. J Affect Disord 227:483–493. https://doi.org/10.1016/j.jad.2017.11.048
Watkins J, Mazur E (2013) Retaining students in science, technology, engineering, and mathematics (STEM) majors. J Coll Sci Teach 42:36–41
Watson GR, Sottile J (2010) Cheating in the digital age: Do students cheat more in online courses? Online J Distance Learning Adm 13(1)
Weiner JA, Hurtz GM (2017) A comparative study of online remote proctored versus onsite proctored high-stakes exams. J Appl Test Technol 18(1):13–20
Whitaker Sena JD, Lowe PA, Lee SW (2007) Significant predictors of test anxiety among students with and without learning disabilities. J Learn Disabil 40(4):360–376. https://doi.org/10.1177/00222194070400040601
Wilkinson J (2009) Staff and student perceptions of plagiarism and cheating. Int J Teach Learn High Educ 20(2):98–105
Woldeab D, Brothen T (2019) 21st Century Assessment: Online Proctoring, Test Anxiety, and Student Performance. Int J E-Learning & Distance Educ 34(1):1–10
Woods K, Parkinson G, Lewis S (2010) Investigating access to educational assessment for students with disabilities. Sch Psychol Int 31(1):21–41. https://doi.org/10.1177/0143034310341622
Wu JY (2015) University students’ motivated attention and use of regulation strategies on social media. Comput Educ 89:75–90. https://doi.org/10.1016/j.compedu.2015.08.016
Wu JY, Cheng T (2019) Who is better adapted to learning online within the personal learning environment? Relating gender differences in cognitive attention networks to digital distraction. Comput Educ 128:312–329. https://doi.org/10.1016/j.compedu.2018.08.016
Acknowledgements
We thank our participants, both students and instructors, for their time and perspectives, and the EDU-STEM Network for its funding support and community.
Funding
This work was supported by the EDU-STEM Network (NSF Award #1919462). Funding supported publication fees and undergraduate student support.
Author information
Contributions
AKL and AP led data collection, analysis, and writing of the manuscript. All authors contributed to project design, including development of the research questions and surveys, and to manuscript editing. CJB, EPD, JAH, BG and CW also contributed to data collection and management.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Additional file 1:
Appendix A. Survey Content. Appendix B. Supplemental Figures & Tables. Appendix C. Additional Data: Remote Proctoring Concerns. Appendix D. Codebooks. Appendix E. Perceptions of Cheating by Institution.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
About this article
Cite this article
Pokorny, A., Ballen, C.J., Drake, A.G. et al. “Out of my control”: science undergraduates report mental health concerns and inconsistent conditions when using remote proctoring software. Int J Educ Integr 19, 22 (2023). https://doi.org/10.1007/s40979-023-00141-4