Vol. 36, No. 1, 2021
Abstract: Recent media stories have reported that online webcam-based exam proctoring has wrongly flagged students for cheating, causing tremendous anxiety and frustration and thus disadvantaging students. This study assesses whether online webcam-based exam proctoring in the age of COVID-19 disadvantages students (particularly non-white students and those of differing ethnic and socio-economic status), and whether worry about being wrongly flagged for cheating may affect students’ exam performance. This survey-based study was conducted with 237 undergraduate students enrolled at a public land-grant research university in the upper Midwest region of the United States, all of whom took their exams through Proctorio. Our study supports – as is widely reported by the media – the claim that students experience anxiety and fear of being wrongly flagged during online proctoring. However, we show that students’ anxiety about online proctoring is associated with their general level of anxiety; this correlation with “trait” anxiety supports our previous study. We further find that worry over being wrongly flagged did not directly impede students’ exam performance. We discuss how students and faculty alike face challenges, especially those who had not used online webcam exam proctoring prior to the COVID-19 stay-at-home directives. For faculty, the challenge is not only adapting to an unfamiliar teaching environment that requires new technologies, but also being expected to use webcam-based online proctoring for high-stakes exams. An in-depth look is needed at the kind of support students and faculty will need as online proctoring continues into the future. Furthermore, the academic world in general, and US colleges and universities in particular, should initiate a conversation on how best to regulate this industry so that students and institutions are well served.
Keywords: online, exam, proctoring, anxiety, students, performance
Résumé: Des histoires récemment publiées par les médias ont rapporté que les surveillances d'examens en ligne par webcams ont fait croire à tort que des étudiants avaient triché, ce qui a provoqué une anxiété et une frustration énormes pénalisant ainsi ces derniers. Cette étude évalue si la surveillance d'examens en ligne par webcam, à l'ère du COVID-19, désavantage les étudiants (en particulier ceux qui ne sont pas blancs et ceux qui ont un statut ethnique et socio-économique différent), et si l'inquiétude d'être injustement signalé pour tricherie peut affecter les performances des étudiants aux examens. Cette étude repose sur une enquête menée auprès de 237 étudiants de premier cycle inscrits dans une université publique de recherche dans la région supérieure du Midwest des États-Unis, qui ont passé leurs examens par le biais de Proctorio. Notre étude confirme - comme l'ont largement rapporté les médias - que les étudiants éprouvent de l'anxiété et de la crainte d'être signalés à tort lors de la surveillance en ligne. Cependant, nous montrons que l'anxiété des étudiants concernant le contrôle en ligne est associée à leur niveau général d'anxiété ; cette corrélation avec l'anxiété comme "caractéristique" confirme notre étude précédente. Nous constatons également que l'inquiétude liée au fait d'être signalé à tort ne nuit pas directement à la performance des étudiants aux examens. Nous discutons de la façon dont les étudiants et les professeurs font face à des défis, en particulier ceux qui n'avaient pas utilisé la surveillance d'examen en ligne par webcam avant les directives de rester à la maison. Pour les professeurs, il s'agit non seulement de s'adapter à un environnement d'enseignement inconnu qui nécessite de nouvelles technologies, mais aussi de devoir utiliser la surveillance en ligne par webcam pour des examens à forts enjeux. Il est nécessaire d'examiner en profondeur le type de soutien dont les étudiants et les professeurs ont besoin pour utiliser la surveillance en ligne à l'avenir. En outre, le monde universitaire en général, et les collèges et universités américains en particulier, devraient entamer une conversation sur la meilleure façon de réglementer cette industrie afin que les étudiants et les institutions soient bien desservis.
Mots-clés: en ligne, examen, surveillance, anxiété, étudiants, performance
This work is licensed under a Creative Commons Attribution 3.0 Unported License.
Online learning has been growing steadily for decades. In 2020, however, the COVID-19 pandemic made online education nearly the only educational venue – particularly for higher education. Even prior to 2020, online learning was an “integral part of contemporary education in the United States” (Woldeab & Brothen, 2019, p. 2). The most recent (2017) data available from the U.S. Department of Education indicate that almost 9.8 million students (or 48.4%) were either enrolled exclusively in an online program or taking some of their courses online. Of the total 20.1 million students enrolled, over 17.1 million were undergraduates, and of these, 4.5 million (or 26%) were enrolled exclusively in online programs or took some of their courses online (Ginder et al., 2019).
Indeed, online education in the U.S. was expected to continue growing in 2018 and 2019. In fact, Lederman (2019) stated that from 2016 to 2018, online offerings grew on average by 2% annually. However, following the start of the COVID-19 pandemic and its stay-at-home directives, higher education institutions across the country closed their doors and — overnight — moved completely online. This was regardless of students’ preferences, access to technology or the internet, or the skills needed to succeed in an online teaching and learning environment. Just as office work structures are likely to be permanently altered by this pandemic, this rapid reliance on online education will have long-lasting impacts.
To augment the synchronous components of teaching, higher education institutions turned to video conferencing platforms, sending the use of products such as Zoom and Microsoft Teams through the roof. To put this into perspective, Zoom CEO Eric S. Yuan shared in his April 1, 2020 blog post that Zoom went from 10 million daily meeting participants at the end of 2019 to 200 million by March 2020. And this is only one of many video conferencing platforms higher education institutions are using for live online class meetings. Indeed, migration to an online environment because of the COVID-19 pandemic presented greater challenges for some educational programs. In the medical education field, for instance, where interpersonal interaction and collaboration are central to the curriculum, programs have turned to online video conference platforms to “increase a sense of connectedness among medical students” (Anderi et al., 2020, p. 1). And of course, the challenges reach far beyond the medical field: a whole range of practical, hands-on, technical, experiential, or trades-based fields have struggled to move programs online or have attempted to build hybrid models from scratch. For example, in her March 2021 report in BestColleges, Anne Dennon highlights the shortcomings of trying to learn certain trades by theory alone, noting that many programs have had to postpone the hands-on parts of their training until campuses reopen (Dennon, 2021).
The debate over the quality of online education compared to in-person instruction has been explored extensively — see, for example, the recent systematic meta-analytic review in Woldeab et al. (2020), which found no significant difference in quality between the two delivery methods. It is also well established that, in addition to technological skills, self-directedness, motivation, and time management are key to success in online learning; students who lack these characteristics struggle with online classes.
Most recently, we have seen several studies citing student dissatisfaction with online courses in the age of COVID-19, though this finding may result from students taking online courses because they have no choice. For example, Means and Neisler (2020), drawing on a random sample of 1,000 students who moved from the physical classroom to a completely online environment during the Spring 2020 semester, concluded that undergraduate students generally had trouble staying on task and remaining motivated. However, these authors also noted that participants did not attribute their challenges to the quality of education they received. Alawamleh et al. (2020) concluded that students prefer in-person classes in part due to a lack of motivation and to a “feeling of isolation” that can result from the online teaching environment.
Even before 2020, it was natural to expect that the factors driving the steady growth of online learning would drive similar growth in online proctoring. In fact, alongside the growth of online learning, webcam-based online proctoring has also been on the rise (see our previous work: Woldeab & Brothen, 2019; Woldeab et al., 2017). However, after the COVID-19 pandemic started and higher education institutions closed their in-person offerings, those same institutions had no choice but to rely increasingly on webcam proctoring, especially for high-stakes exams. Online proctoring companies filled the void, and their use by higher education institutions soared during the pandemic. For example, Chin (2020) reported that over 500 higher education institutions in the U.S. use Examity (https://www.examity.com/), one of the fastest-growing online webcam-proctoring services. In the institution where the present study was conducted, unique users of Proctorio (https://proctorio.com/) increased by 42% in 2019–20 compared to the previous academic year, an increase directly attributable to the pandemic changes mandated mid-semester in 2020.
What makes this use of online exams different, therefore, is that students who feel confident with neither the technologies involved nor the exam environment itself are left with no other option. In our comprehensive 2017 study of 865 undergraduate students conducted over two consecutive semesters (504 of whom completed both the pre- and post-surveys), some 52% reported lacking necessary skills such as the “expertise to set up, use, and/or navigate any technological aspects of the exam environment” (Woldeab et al., 2017, p. 151). More recently, Chin (2020) reported that research participants who took their exams through the Examity web-based proctoring service described their experiences as uncomfortable and intrusive. Such experiences can be stressful for exam takers.
One of the main media pieces highlighting these problems is Drew Harwell’s 2020 article in The Washington Post, “Cheating-Detection Companies Made Millions During the Pandemic. Now Students are Fighting Back.” Based on interviews with college students who had experienced online proctoring in the US, the report highlighted student anxiety and frustrations, and drew attention to the fact that students of different ethnic and socio-economic status may also be disadvantaged by online proctoring (Harwell, 2020).
According to Morgan (2020), the online exam proctoring environment is immensely challenging for end users to understand and navigate. There are many vendors providing multiple levels of service; these different services are often not clearly understood by users, and many companies do not provide information about their products and processes on their websites. For example, “some vendors offer products that exist in more than one segment of the market, …a vendor can offer live proctoring and some sort of automated solution under the same label, … which creates an overlap in their market segments” (p. 4).
Although the online exam proctoring environment consists of multiple levels of service, most feature “passive video surveillance,” which uses artificial intelligence and biometrics, live proctoring, or both (Morgan, 2020). In our case, research participants took their exams through Proctorio, which uses passive video surveillance (or record-and-review). Among other techniques, the platform “uses gaze-detection, face-detection and computer-monitoring software to flag students for any ‘abnormal’ head movement, mouse movement, eye wandering, computer window resizing, tab opening, scrolling, clicking, typing, and copies and pastes” (Harwell, 2020, para. 6).
Similar to online education overall, the online exam proctoring industry has been growing rapidly. What is lacking amid this fast growth, however, is an understanding of how well it meets the needs of institutions and end users. The purpose of the current study, therefore, is to understand whether online webcam-based proctoring disadvantages students as reported in the popular media — by, for example, wrongly flagging exam takers for cheating — and to understand how this may affect their actual performance.
The overarching research question guiding the present study is: does online exam proctoring disadvantage students? To address the overarching question, we consider these four secondary questions:
For a comprehensive review of the literature on online webcam proctoring, exam anxiety, and student performance, as well as the broader literature on trait test anxiety and student preferences, we refer readers to our previous works on this topic: “21st Century Assessment: Online Proctoring, Test Anxiety and Student Performance” (2019) and “Under the watchful eye of online proctoring” (2017). Regardless of whether the testing environment is online or a face-to-face classroom, test taking can be stressful — for some more than others. In our 2017 study exploring students’ experiences with online webcam-based proctoring, we found that such exams can induce anxiety, especially for students taking online webcam-proctored exams for the first time (Woldeab et al., 2017). This led us to look further into the topic, focusing mainly on online proctoring exam anxiety and student performance. We also assessed whether students who exhibit high test anxiety also report difficulties with online webcam-based proctoring, and whether online webcam-based proctoring (in that case, ProctorU) induces higher levels of test anxiety resulting in lower student performance.
That follow-up study (Woldeab & Brothen, 2019) was conducted with 631 undergraduate students attending a public land-grant research university in the upper Midwest region of the United States. While 44 of the 631 students who participated completed their final exam via ProctorU and served as the experimental group, the remaining 587 served as the control group and completed their final exam in a testing center with live proctors. Pre- and post-surveys were developed by the researchers to assess exam takers’ experiences with online web-based proctoring, while the Westside Test Anxiety Scale, developed by Driscoll (2007), was deployed to assess participants’ trait test anxiety — “the tendency to be anxious in any evaluative situation” (Hong & Karstensson, 2002, p. 349).
Among other findings, our analysis revealed “that the greater relationship for trait test anxiety and poorer final exam performance among the ProctorU students was mostly restricted to those with high anxiety scores” (Woldeab & Brothen, 2019, p. 7). In short, among students who took their final exam via ProctorU, the relationship between trait test anxiety and poor final exam performance was primarily restricted to those who scored high on the Westside anxiety scale.
To address the research questions considered in this study, we examined the survey responses of 237 undergraduate students enrolled at a public land-grant research university in the upper Midwest region of the United States. The data were collected during the Fall 2020 semester, and the students who completed the survey consented to their data being used in the study. All participants were enrolled in a large, online, introductory course and took their exams individually via Proctorio, a webcam-based online proctoring service (https://proctorio.com/). The class followed the Personalized System of Instruction (Kulik et al., 1990) plan, in which students read their textbook assisted by a study guide and took a series of three mastery quizzes for each chapter. At three points during the semester, students took a short midterm exam (20 multiple-choice questions) administered by Proctorio. Students could take a practice final exam as many times as they liked to gauge their level of preparation. Data collection took place toward the end of the fall semester, just prior to the final examination period. Unlike participants in our previous research on this topic — who had a choice between taking their exams in a computerized testing center or through ProctorU — the only exam venue available to participants in the present study was Proctorio, due in part to the COVID-19 pandemic and stay-at-home directives. Three hundred thirty-two students finished the class by taking the final exam, also administered by Proctorio.
We provided research participants with a consent form, and participation in the study was completely voluntary. Among other things, the consent form clearly detailed the purpose of the study and the risks and benefits of taking part. The survey was conducted through Qualtrics® Core XM™ (https://www.qualtrics.com/), an online survey tool that includes a mobile interface. The consent form was the first page of the survey, and survey takers were required to select either "I have read the above consent form and asked any questions I may have. I consent to participate," or "I do not consent to participate (you will be exited from the survey)." Only those students who consented were able to proceed with the survey; this selection served as their electronic signature for participating in the study. To encourage and ensure adequate participation, the faculty teaching the course agreed to award two points to students who completed the survey.
First, to assess students’ experiences with online webcam-based proctoring, research participants completed a 9-item survey. Among other things, this survey asked whether students had been wrongly flagged — or had heard of others being wrongly flagged — for cheating on online-proctored exams, and how this may have affected their performance on subsequent online exams.
Second, to assess research participants’ trait anxiety, we deployed the Westside Test Anxiety Scale, developed by Driscoll (2007). The Westside Scale consists of 10 items and is meant to assess anxiety impairment on a five-point scale ranging from “5 - extremely or always true” to “1 - not at all or never true.” Six items address “performance impairments related to cognitive symptoms of anxiety, i.e., lack of attentiveness, poor memory, or worry” (Woldeab & Brothen, 2019, p. 5), while the other four items assess worry and dread. The scale contains no items regarding physiological over-arousal; it “thus has high face validity, in that it includes the highly relevant cognitive and impairment factors but omits the marginally relevant over-arousal factor” (Driscoll, 2007, p. 2). Overall, the scale is “a reliable and valid measure of test-anxiety impairment” (Driscoll, 2007, p. 4).
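As a minimal illustration (ours, not part of the original study materials), the sketch below scores such a 10-item, five-point scale by simple summation, which matches the 10–50 range implied by the mean score reported below; a real analysis would of course use the actual Westside items and follow Driscoll’s (2007) scoring guidance.

```python
# Minimal sketch: scoring a 10-item, 5-point anxiety scale by summation.
# Responses run from 1 ("not at all or never true") to 5 ("extremely or
# always true"), so totals range from 10 to 50. The data here are
# hypothetical; the study's item-level responses are not public.
import numpy as np

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(237, 10))  # 237 students x 10 items

totals = responses.sum(axis=1)
print(f"mean = {totals.mean():.2f}, sd = {totals.std(ddof=1):.2f}")
```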
Two hundred thirty-seven students completed both the questionnaire and the class, constituting 71.39% of the students who finished the course by taking the final exam. These students performed better on all exams and non-exam assignments, scoring 217.79 vs. 194.38 total points (of 255 possible; t = 6.98, p < .001). That these students had higher scores was not a surprise: in our experience, higher-scoring students volunteer for ways to earn extra points at a higher rate than lower-scoring students. Their mean score on the Westside Test Anxiety Scale was 33.32 (sd = 8.06), which is in the moderate range for anxiety (Driscoll, 2007). The majority of these students also answered “agree” or “strongly agree” to our questionnaire items measuring anxiety (e.g., “proctored exams more stressful,” “I get more anxious” [on online proctored exams]).
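A comparison such as the one above (217.79 vs. 194.38 total points; t = 6.98) is an independent-samples t test. A minimal sketch, with hypothetical score vectors standing in for the real gradebook data (the group sizes follow from the 332 finishers and 237 survey completers):

```python
# Minimal sketch: independent-samples t test comparing total course points
# (of 255 possible) for survey completers vs. non-completers.
# The score vectors are simulated around the reported group means;
# the real analysis would use the actual gradebook totals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
completers = rng.normal(217.79, 25, size=237).clip(0, 255)
non_completers = rng.normal(194.38, 35, size=95).clip(0, 255)

t, p = stats.ttest_ind(completers, non_completers)
print(f"t = {t:.2f}, p = {p:.4g}")
```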
These students’ Westside Scale scores correlated positively with our anxiety items but not with the items measuring whether they worried about being wrongly flagged for cheating. Suspecting that we had tapped two somewhat independent student concerns, we factor-analyzed their questionnaire responses. After Varimax rotation, two factors emerged, both from the Westside Scale. We termed Factor 1 “Anxiety” (e.g., “I feel out of sorts or not really myself when I take important exams”), which accounted for 37.80% of the variance, and Factor 2 “Worry” (e.g., “I worry so much before a major exam that I am too worn out to do my best on the exam”), which accounted for 23.45%. The Westside Scale correlated significantly with Factor 1 (r = .356, p < .001) as well as Factor 2 (r = .139, p = .032), indicating that all of our questionnaire items tapped test anxiety, though the Factor 2 items likely less so. Consistent with this interpretation and our earlier results reported in Woldeab & Brothen (2019), Factor 1 correlated significantly and negatively with all three mid-semester exams and the final exam (r’s ranged from -.148 to -.238), but Factor 2 did not (r’s ranged from -.005 to -.008). Neither factor correlated significantly with points earned on non-exam course elements (exercises and writing assignments; r = .017 and .059). Our data thus indicated that students’ anxiety was negatively related to their online proctored exam scores but their worry about being flagged was not. This is also consistent with our previous findings, which revealed that high trait anxiety was associated with lower exam scores, particularly for students “with high test anxiety taking exams in an online proctored setting” (Woldeab & Brothen, 2019, p. 1).

If, as this suggests, anxiety is related to low exam performance, is online exam proctoring the likely cause of both? A simple analysis suggests not. Assuming that the Westside is more a measure of trait anxiety and our Factor 1 is more related to students’ immediate concern about online proctoring, we computed a partial correlation, removing the Westside’s effects from the correlation between Factor 1 and final exam score. The resulting relationship was not significant (r = -.071). To follow this up, we conducted two additional analyses.
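For readers who want to reproduce this kind of analysis, the sketch below shows a two-factor solution with Varimax rotation and the usual variance-explained calculation, using scikit-learn (whose FactorAnalysis estimator supports Varimax rotation). The item matrix is hypothetical; the study’s actual questionnaire responses are not public.

```python
# Minimal sketch: two-factor analysis with Varimax rotation on Likert-type
# items, plus each rotated factor's share of total item variance (cf. the
# 37.80% and 23.45% figures reported above). Hypothetical data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
items = rng.integers(1, 6, size=(237, 10)).astype(float)

z = (items - items.mean(axis=0)) / items.std(axis=0)  # standardize items

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(z)
loadings = fa.components_.T   # item-by-factor loading matrix
scores = fa.transform(z)      # per-student factor scores

# Proportion of total (standardized) item variance per rotated factor
explained = (loadings ** 2).sum(axis=0) / z.shape[1]
print(np.round(loadings, 2))
print(np.round(explained, 3))
```

Factor scores computed this way can then be correlated with exam scores, as in the analysis above.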
We asked an open-ended question on the anonymous final course evaluation to assess students’ concerns: “Please tell me about your experience with the Final Exam delivered through Proctorio. Did things go OK for you? Did you have any problems?” Of the 332 students finishing the class, 155 completed the evaluation and 22 (6.6%) indicated on this item that they felt anxious or worried about the exam. The rest left it blank, responded with some version of “OK,” or noted that they should have studied more, etc. We classified those 22 responses as directly relating to Proctorio, consistent with our Anxiety factor, or consistent with our Worry factor. Four students mentioned Proctorio but didn’t relate it directly to anxiety or worry about being wrongly flagged (“disrupts my performance; Don’t like using Proctorio much tho, feels invasive; I am uncomfortable with Proctorio; makes me a little nervous because I’ve had lots of technology problems with it before”). Eleven students mentioned anxiety as a problem in their answers (“makes me nervous; I was quite stressed; I have terrible test anxiety; I do not like taking exams online—gives way more stress and test anxiety; I was way too anxious [about doing] something off, Proctorio works fine but it definitely adds to the exam anxiety; Proctorio can be even more difficult than in-person exams for those with persistent anxiety; I have diagnosed GAD and Proctorio really stresses me out; it made my test anxiety really bad because I felt like someone was always looking over my shoulder; I just get a lot of testing anxiety and the Proctorio set up just makes it worse; the recording always increased my test anxiety”).
Finally, seven students mentioned worry as a problem in their answers (“I have heard of many people being falsely accused of cheating; I'm worried about Proctorio flagging me; I had no problems but I was worried that I would; I've heard critiques about the security and standards of Proctorio but I've not run into any issues myself; taking a test with a camera watching me…I feel like looking off to the distance is a bad thing and could be seen as cheating; I'm a little worried my exam will get flagged, but I think it's clear in the video that I just closed the [automatically popped up] email tab as soon as it opened; I have had instances in the past and so have peers where it flagged us when we did nothing wrong”). Although consistent with our earlier findings on anxiety (Woldeab & Brothen, 2019), these 22 responses are not indicative of strong or widespread student concerns about the online exam monitoring they experienced.
The above results suggest that anxiety was related to exam performance, yet relatively few students reported on the course evaluation that it was a problem. We therefore suspected that other factors might mediate this relationship, and with a larger sample than that used in Woldeab & Brothen (2019), we were able to test this. Accordingly, we conducted an analysis to assess whether the anxiety we tapped with our questionnaire (specifically Factor 1) and the Westside Scale had a direct negative effect on student performance. We performed a stepwise linear regression analysis with final exam performance as the dependent variable, using several measures as independent variables.
First, to assess the effects of anxiety, we included both our anxiety and worry factors as well as the Westside Anxiety Scale score. Second, to assess how well prepared students were for the final exam, we included a non-exam point total — the sum of all quiz and writing scores — which helped students master each individual chapter. Third, although mid-semester and final exams are typically positively correlated, we assumed that the three mid-semester exams would help students adapt to online proctoring in this specific class; correlations between this total and each individual mid-semester exam and the other variables above were highly similar. We further assumed that, with our other measures of preparation controlled for, higher scores on these exams would indicate students’ ease in taking them. Accordingly, we entered the total of students’ points earned on them, and we also entered the number of practice final attempts. Fourth, to control for academic ability, we entered the Composite ACT score and students’ cumulative grade point average obtained from the university records office.
The resulting model (R = .800) accounted for 63.3% of the variance. The first variable to enter the equation was the mid-semester total (R Square = .506). The second was the number of practice final attempts (adding R Square = .119), and the third was non-exam points (adding R Square = .012); all were significant with p < .05. All three of our anxiety measures were excluded from the regression equation, suggesting that anxiety or worry about being flagged did not directly affect final exam performance.
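The stepwise procedure reported here can be outlined as a forward-selection loop that, at each step, enters the candidate predictor yielding the largest gain in R² and stops when the newly entered predictor is not significant. A minimal sketch; the column names a caller would pass are illustrative, not the study’s actual variable labels:

```python
# Minimal sketch: forward stepwise OLS regression, printing the gain in
# R-squared at each step (cf. the "adding R Square" values above).
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(df: pd.DataFrame, target: str, alpha: float = 0.05):
    remaining = [c for c in df.columns if c != target]
    selected, last_r2 = [], 0.0
    while remaining:
        # R-squared achieved by adding each remaining candidate
        r2 = {c: sm.OLS(df[target], sm.add_constant(df[selected + [c]]))
                 .fit().rsquared
              for c in remaining}
        best = max(r2, key=r2.get)
        fit = sm.OLS(df[target], sm.add_constant(df[selected + [best]])).fit()
        if fit.pvalues[best] > alpha:   # stop: new predictor not significant
            break
        selected.append(best)
        remaining.remove(best)
        print(f"{best}: R2 = {fit.rsquared:.3f} "
              f"(adding {fit.rsquared - last_r2:.3f})")
        last_r2 = fit.rsquared
    return selected
```

Called as, say, forward_stepwise(data, "final_exam") on a data frame with columns such as midterm_total, practice_attempts, non_exam_points, westside, anxiety_factor, and worry_factor (hypothetical names), it would echo the step-by-step “adding R Square” pattern reported above.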
To address concerns that non-white and non-upper-middle-class students are treated unfairly by online proctoring, we requested two items of routinely collected information from the university records office for students in this study: first, whether they are First Generation (FGen) college students (Pascarella et al., 2004), and second, whether they are Pell Grant eligible (Baum, 2015). These factors are similarly predictive of SES, but we had to infer the connection to our research participants on a group basis for two reasons: first, because university policy considers Pell status to be sensitive information, we were only able to obtain group data; and second, we felt it too intrusive to ask students for this information directly through our questionnaire. This should therefore be seen as a tentative finding that needs confirmation through future studies.
As a baseline, the University classified 19.69% of all degree-seeking undergraduate students on campus during the period of this study as FGen. Of these FGen students, 45.04% were Pell eligible, whereas 11.90% of non-FGen students were eligible. For the Pell status of students participating in this study, the records office provided summary data for each American ethnic group (the data are not relevant for international students) as follows: Asian = 35.71%, Black = 87.88%, Hispanic = 43.75%, Multi-ethnic = 30.77%, and White = 18.52%. Clearly, our non-white students, overall, were lower in SES. Individually, each student in our study was classified as a FGen college student or not, resulting in 29% of students being so identified. We combined the non-white groups into one and coded the two groups (non-white/white) into an ethnic status (Ethnic) variable. Although that variable correlated modestly with our FGen variable (r = .108, p = .049), we treat the two as having independent predictive validity. Given the group percentages above, we felt confident that our FGen variable was a good indicator of SES. We were thus able to analyze the effects of Proctorio on key variables of interest to determine whether these students achieved differently in the course and whether they were treated differently by Proctorio.
First, based on the relevant variables identified in our first regression analysis, we analyzed the association of Ethnic and FGen status with several key variables. Three measured academic ability and course performance: ACT Composite (ACT Comp) score, final exam score, and course grade. Two measured preparation for the final exam: non-exam points and number of practice final attempts. Three measured anxiety or concern about online proctoring: Westside score, and the anxiety and flag-worry measures from our questionnaire. Two further variables were measured by Proctorio as students took their exams. No differences on any of these variables reached statistical significance except four: three related to Ethnic status and one related to FGen status. Ethnic status correlated slightly with course grade (r = .114, p = .036) and final exam score (r = .160, p = .006), and more strongly with ACT Comp score (r = .200, p < .001), with non-white students lower on each. However, these relationships disappeared when we computed partial correlations, controlling for ACT Comp score, between Ethnic status and course grade (r = .074) and non-exam points (r = -.111). FGen status correlated with ACT Comp (r = -.146, p = .014) but correlated neither with course grade (r = -.066, ns) nor non-exam points (r = -.055, ns). Therefore, academic ability as measured by the ACT mediated the effects of Ethnic status on these variables, and FGen status was not related to either of them. We thus found no direct relationships between Ethnic or FGen status and our key variables.
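The partial correlations used here (and earlier, when removing the Westside’s effect from the relationship between Factor 1 and the final exam) follow the standard first-order formula. A minimal sketch with hypothetical arrays; the 0/1 coding of the group variable mirrors the Ethnic and FGen coding described above:

```python
# Minimal sketch: first-order partial correlation between x and y
# controlling for z, via
#   r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz^2) * (1 - r_yz^2))
import numpy as np

def partial_corr(x, y, z):
    r_xy = np.corrcoef(x, y)[0, 1]
    r_xz = np.corrcoef(x, z)[0, 1]
    r_yz = np.corrcoef(y, z)[0, 1]
    return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Hypothetical example: group status vs. course grade, controlling for
# ACT Composite (all arrays simulated; group coded 0/1 as in the study).
rng = np.random.default_rng(3)
act = rng.normal(25, 4, size=237)
grade = 0.5 * act + rng.normal(0, 2, size=237)
group = (rng.random(237) < 0.3).astype(float)
print(round(partial_corr(group, grade, act), 3))
```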
Second, we performed a stepwise multiple regression analysis for the entire class with final exam performance as the dependent variable. As independent variables, we used several measures, as in our first regression analysis. To assess how well prepared students were for the final exam, we included the non-exam point total — the sum of all quiz and writing scores, which helped students master each individual chapter. We then entered the total of students’ points earned on the three mid-semester exams, as well as the number of practice final attempts. To control for academic ability, we entered the Composite ACT score and students’ cumulative grade point average. Finally, we entered the Ethnic and FGen status variables. We did not enter our anxiety measures, as 37% of all students had not completed them. The resulting model (R = .802) was similar to the regression analysis above and accounted for 63.9% of the variance. The first variable to enter the equation was the mid-semester total (R Square = .53). The second was the number of practice final attempts (adding R Square = .103), and the third was cumulative GPA (adding R Square = .011). All were significant with p < .01. Thus, adding Ethnic or FGen status to our earlier regression model did not change it, indicating that these two variables had no effect.

Third, to explore whether Proctorio discriminated between white and non-white or low-SES students, we compared the two variables Proctorio reports upon students finishing their exams (https://proctorio.zendesk.com/hc/en-us/articles/206924657-Exam-Analytics-Overview). The first is an abnormalities score, a summation of the times the student did things differently from the peer test-taker average. The second, the suspicion score, involves “a quick calculation based on the aggregation of frames during the exam which were deemed suspicious in combination with the exam analytics”; if the suspicion level shows a large percentage for a student, Proctorio suggests the exam be considered for further review. Non-white students scored slightly higher and FGen students scored slightly lower on both, but none of these differences reached statistical significance (all p values > .256). Thus, we again found no direct evidence of students being treated differently.
Overall, the chief aim of this study was to assess whether online webcam-based exam proctoring disadvantages students (particularly non-white and low-SES students), and whether worry about being wrongly flagged for cheating may affect students’ exam performance. As the findings show, students’ anxiety about online proctoring was associated with their general level of anxiety; further, the results clearly show that anxiety or worry over being wrongly flagged did not directly impede students’ exam performance. Of course, because the participants in this study all experienced online proctoring, we do not have an experimental test of online proctoring’s effects, although we do have a test of the reality as it has been for students during the pandemic year. As in our previous study, students’ lower exam performance was related to “trait” anxiety. Our study thus supports — as is widely reported by the media — the notion that students experience anxiety and fear of being wrongly flagged during online proctoring. But while online exam proctoring may be a cause of student anxiety, our data indicate that this anxiety had no direct effect on exam performance for the students in this study.
It should also be noted that we collected no data on how much experience our research participants had with online proctoring in past courses. Brothen and Peterson (2012), looking into online exam cheating, suggested that students who did better on their exams could have benefited from practice gained by engaging with the exam environment more than once. Given that students in the current study (as in other studies on this issue) had already experienced three short online proctored mid-semester exams, we believe that giving students a number of low-stakes online proctored exams may help build their confidence and lessen their anxiety with online webcam-based proctored exams. This suggests a good practice that instructors using online proctoring could follow.
Research in this area remains spotty, especially since online proctoring is such a recent phenomenon. We therefore conclude by suggesting directions for future challenges to be met and for further studies needed to develop a deeper understanding of these topics. When it comes to assessing students’ work, two competing schools of thought existed before the COVID-19 pandemic: one advocated strict adherence to academic integrity through proctored exams, ensuring and safeguarding the integrity of exams and the exam-taking process; the other advocated more open but comprehensive assessments of students’ work. This conversation has recently been amplified, especially in the media, following the COVID-19 pandemic and the massive migration of education to the virtual space. For example, Redden (2021) reported that demand for online assistance with student schoolwork grew substantially during the COVID-19 pandemic, pointing out that requests on the website Chegg, which claims to help students with homework questions, grew by 196% from April to August 2020. On the other hand, reports such as that of Kadakia and Bradshaw (2020) advocate more equitable, open-book exams, not just during the COVID-19 pandemic but beyond.
The current challenge that online proctoring presents to higher education institutions, faculty, and students in particular stems mainly from the swift migration to a completely online environment. This may have been less challenging for some, thanks to prior investment in online learning and assessment methodologies; however, online learning was not the first option for a good number of faculty and students, and the same is even more true of online exam proctoring. Students and faculty who had not used online webcam exam proctoring prior to the stay-at-home directives faced these challenges most acutely. In a national survey of 3,623 faculty at four-year universities throughout the US (Fox et al., 2020), 38% reported difficulty adjusting instructional practices to teach online in the midst of COVID-19, and 29% said one of their main challenges was “administering secure tests and exams” (p. 8).
As this study shows, being remotely monitored by webcam appears to be a source of anxiety for some students. For faculty, the challenge is not only adapting to an unfamiliar teaching environment that requires new technologies, but also being expected to use webcam-based online proctoring for high-stakes exams. This includes the additional requirement of relying on recorded video to pass judgment when academic misconduct is reported. Faculty should also be aware that such judgments would typically rest on one-size-fits-all, algorithm-based proctoring software, which might not take into account culturally specific characteristics or behaviours. While this may be business as usual for faculty who used online webcam-based exam proctoring before the COVID-19 pandemic, it can be especially problematic for faculty without prior knowledge of and expertise with this technology.
Given the many features offered by exam proctoring vendors, choosing the appropriate proctoring level can be overwhelming. In addition, “a vendor can offer live proctoring and some sort of automated solution under the same label, but at a different price point” (Morgan, 2020, p. 4). Faculty should therefore be clear about what level of monitoring is appropriate for high- versus low-stakes exams and choose according to those needs. Given the speed of the industry’s growth, educational institutions also need to stay current, ensure institutional readiness, and build capacity to support faculty and students alike. And while building familiarity and confidence by giving students several webcam-proctored exams throughout the semester, faculty should also consider the total points allocated to such exams: this technology can disadvantage students who have exam anxiety, especially if major assessments such as mid-term and final exams all use online webcam-based proctoring.
In her 2020 report prepared for the global research and advisory firm Gartner, Glenda Morgan reviewed 38 online exam proctoring vendors, 25 of which are based in the United States; Morgan notes that this is not a complete list. In fact, in a non-exhaustive search of our own, we found over 60 vendors providing online exam proctoring as of February 2021. The main driver behind this fast growth is capital. An industry valued at little more than $350 million in 2019 (Research & Markets, 2020) grew substantially almost overnight; by 2027, it is expected to become a 10-billion-dollar global market (LearningLight.com, 2019).
If this industry is not regulated, its rapid expansion and the amount of capital involved may create serious challenges. Many of these companies do not share relevant information about their products and services with end users (Morgan, 2020), and much of the capital flowing into the industry is “public money, from thousands of colleges” (Harwell, 2020, para. 5). The academic world in general, and US colleges and universities in particular, should therefore initiate a conversation on how best to regulate this industry so that students and institutions are well served; future research into this aspect is necessary.
Further, this study was conducted with students who took their exams through Proctorio. Given the crowded online proctoring market and the technological know-how required of both students and faculty, end-user experiences with online exam proctoring vendors will vary; no two people will have the same experience. Future studies should therefore investigate the various passive and active video surveillance environments to better understand the services they offer and to better inform end users and policy makers regarding potential regulations.
Lastly, future studies should explore the algorithm-based proctoring software that underpins the online proctoring environment through different cultural lenses. Some online webcam-based proctoring companies outsource their live exam proctoring around the globe, which leaves the door open to misinterpreting cues through a specific cultural lens and thereby disadvantaging students. These are some aspects of online webcam-based proctoring that warrant further examination, and which we are considering. Certainly, more empirical data are needed on these and related topics, beyond soundbites and popular media reports, to gain a broader and deeper understanding of the challenges and opportunities in online exam proctoring.
Alawamleh, M., Al-Twait, L. M., & Al-Saht, G. R. (2020, August 24). The effect of online learning on communication between instructors and students during Covid-19 pandemic. Asian Education and Development Studies. https://doi.org/10.1108/AEDS-06-2020-0131
Anderi, E., Sherman, L., Saymuah, S., Ayers, E., & Kromrei, H. T. (2020, August 6). Learning communities engage medical students: A COVID-19 virtual conversation series. Cureus, 12(8), Article e9593. https://doi.org/10.7759/cureus.9593
Baum, S. (2015). The Federal Pell Grant and reauthorization of the Higher Education Act. Journal of Student Financial Aid, 45(3), Article 4, 22–34. https://ir.library.louisville.edu/jsfa/vol45/iss3/4/
Brothen, T., & Peterson, G. (2012, February). Online exam cheating: A natural experiment. International Journal of Instructional Technology and Distance Learning, 9(2), 15–20. http://www.itdl.org/Journal/Feb_12/Feb_12.pdf
Chin, M. (2020, April 29). Exam anxiety: How remote test-proctoring is creeping students out. The Verge. https://www.theverge.com/2020/4/29/21232777/examity-remote-test-proctoring-online-class-education
Dennon, A. (2021, March 18). How trade schools are weathering COVID-19. BestColleges. https://www.bestcolleges.com/blog/how-trade-schools-are-adapting-to-covid-19/
Driscoll, R. (2007, March 1). Westside Test Anxiety Scale validation [Online submission]. ERIC. https://eric.ed.gov/?id=ED495968
Fox, K., Bryant, G., Lin, N., & Srinivasa, N. (2020, July 7). Time for class–COVID-part 1: A national survey of faculty during COVID-19. Tyton Partners and Every Learner Everywhere. https://tytonpartners.com/library/time-for-class-covid19-edition-part-1/
Ginder, S., Kelly-Reid, J., & Mann, F. (2019). Enrollment and employees in postsecondary institutions, Fall 2017; and financial statistics and academic libraries, Fall 2017. U.S. Department of Education. https://nces.ed.gov/
Harwell, D. (2020, November 12). Cheating-detection companies made millions during the pandemic. Now students are fighting back. The Washington Post. https://www.washingtonpost.com/technology/2020/11/12/test-monitoring-student-revolt/
Hong, E., & Karstensson, L. (2002, April). Antecedents of state test anxiety. Contemporary Educational Psychology, 27(2), 348–367. https://doi.org/10.1006/ceps.2001.1095
Kadakia, C., & Bradshaw, A. A. (2020, May 6). Equitable exams during COVID-19. Inside Higher Ed. https://www.insidehighered.com/advice/2020/05/06/changes-instructors-should-consider-administering-and-grading-exams-during-covid
Kulik, C., Kulik, J., & Bangert-Drowns, R. (1990, Summer). Effectiveness of mastery learning programs: A meta-analysis. Review of Educational Research, 60(2), 265–285. https://doi.org/10.2307/1170612
Lederman, D. (2019, December 11). Online enrollments grow, but pace slows. Inside Higher Ed. https://www.insidehighered.com/digital-learning/article/2019/12/11/more-students-study-online-rate-growth-slowed-2018
Means, B., & Neisler, J. (2020, July). Suddenly online: A national survey of undergraduates during the COVID-19 pandemic. Digital Promise. http://hdl.handle.net/20.500.12265/98
Morgan, G. (2020, April 28). Market guide for remote proctoring services for higher education (Report ID. G00723213-15 2020-04). Gartner. https://www.gartner.com/en/documents/3984283
Online proctoring / remote invigilation – Soon a multibillion dollar market within eLearning & assessment. (2019, February 19). LearningLight.com. https://www.learninglight.com/remote-proctoring-invigilation-market
Pascarella, E. T., Pierson, C. T., Wolniak, G. C., & Terenzini, P. T. (2004). First-generation college students: Additional evidence on college experiences and outcomes. The Journal of Higher Education, 75(3), 249–284. https://doi.org/10.1080/00221546.2004.11772256
Redden, E. (2021, February 5). A spike in cheating since the move to remote? Inside Higher Ed. https://www.insidehighered.com/news/2021/02/05/study-finds-nearly-200-percent-jump-questions-submitted-chegg-after-start-pandemic
The Insight Partners. (2020, December). Online exam proctoring market forecast to 2027 - COVID-19 impact and global analysis by type (advanced automated proctoring, recorded proctoring, and live online proctoring) and end user (schools and universities, enterprises, and government). Market Research Report, ID: 5237901. https://www.theinsightpartners.com/reports/online-exam-proctoring-market
Woldeab, D., & Brothen, T. (2019). 21st century assessment: Online proctoring, test anxiety, and student performance. International Journal of E-Learning & Distance Education, 34(1), 1–10. http://www.ijede.ca/index.php/jde/article/view/1106/0
Woldeab, D., Lindsay, T., & Brothen, T. (2017). Under the watchful eye of online proctoring. In I. E. Alexander & R. K. Poch (Eds.), Innovative Learning and Teaching: Experiments Across the Disciplines (pp. 147–160). University of Minnesota Libraries Publishing.
Woldeab, D., Yawson, R. M., & Osafo, E. (2020, June 1). A systematic meta-analytic review of thinking beyond the comparison of online versus traditional learning. e-Journal of Business Education & Scholarship of Teaching, 14(1), 1–24. https://www.researchgate.net/publication/342661075_A_Systematic_Meta-Analytic_Review_of_Thinking_beyond_the_Comparison_of_Online_Versus_Traditional_Learning
Daniel Woldeab is an Associate Professor in the College of Individualized Studies at Metropolitan State University. He holds a bachelor’s degree in computer information systems, a master’s degree in education, and a doctoral degree in Work and Human Resource Education (adult education). His research interests include online education, adult literacy and adult education, technology and pedagogy, technology and culture, cultural competency, acculturation, acculturative stress and anxiety, and online exam proctoring. E-mail: daniel.woldeab@metrostate.und.edu
Thomas Brothen is Morse-Alumni Distinguished Teaching Professor in the Department of Psychology at the University of Minnesota-Twin Cities and holds bachelor’s and PhD degrees in psychology. His primary research has involved developing and examining online course management systems, and other technology, to improve post-secondary student learning; the teaching of psychology and how technology can be utilized to improve it; and the use of psychological theory to guide large-scale educational interventions. Email: broth001@umn.edu