Mariel Miller

The thing about AI proctoring technology

Updated: Apr 30, 2023


Most of us will agree that academic integrity is central to the university's mission and that students should be "mentored into integrous processes to recognize and attribute the contributions of others as well as ways to assure the integrity of their own intellectual work" (Academic Integrity Week, 2022).


During the early days of the pandemic, academic integrity was a key concern as educators shifted to emergency remote teaching. In my former role as Director of Technology Integrated Learning, I fielded daily inquiries about “what technology can stop online cheating.” Software companies capitalized on these concerns, promising a solution in the form of surveillance proctoring that uses artificial intelligence (AI) to detect cheating. These tools were touted as a cost-effective way to protect the sanctity of high-stakes exams, and universities turned to them in droves (Grajek, 2020).

What I came to learn during this time is that, while AI remains an area for research and innovation, its use for detecting cheating at scale is often deeply problematic. When AI is used to 'invigilate' exams, students, educators, and scholars have reported that algorithms struggle to recognize the faces of people with darker skin tones and people who are transgender or non-binary (Buolamwini & Gebru, 2018; Marshall, 2019). AI proctoring also appears more likely to flag 'suspicious behaviour' in students with disabilities, students with certain medical conditions, and students with caretaking responsibilities. As Swauger (2020) notes, “most proctoring software’s default settings label any bodies or behaviours that don’t conform to the able-bodied, neurotypical ideal as a threat to academic integrity.” The discrimination baked into these systems can cause delays in accessing assessments, distress for students during their exams, and erroneous penalties for academic integrity violations.

As issues with AI proctoring have become more widely known, some institutions have pulled back on its use, and some technology companies have softened their own messaging, highlighting the need for human interpretation. One thing is clear: in pursuing rigour and academic integrity, we need to pay close attention to the methods we use and resist the temptation to treat technology as both the cause of and the solution to complex human, social, and pedagogical problems. Cheating is one such issue; it pre-dates online learning and will continue to evolve as technology progresses, regardless of course modality.


Going forward, it is our collective responsibility to centre inclusivity, diversity, decolonization, equity, and accessibility as we determine what it means to learn and teach in a technology-infused society. This is a message worth repeating as attention turns to managing assessment in the age of ChatGPT.

Recommended Readings

Academic Integrity Week. (2020). University of Victoria. https://www.uvic.ca/learningandteaching/about/home/eventsworkshops/academic-integrity-week/index.php

Buolamwini, J., & Gebru, T. (2018, January). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on Fairness, Accountability and Transparency (pp. 77-91). PMLR.

Grajek, S. (2020). EDUCAUSE COVID-19 QuickPoll Results: Grading and Proctoring. EDUCAUSE. https://er.educause.edu/blogs/2020/4/educause-covid-19-quickpoll-results-grading-and-proctoring

Marshall, L. (2019). Facial recognition software has a gender problem. University of Colorado. https://www.colorado.edu/today/2019/10/08/facial-recognition-software-has-gender-problem

