Photo courtesy of Pixabay

Not your average test anxiety: How universities are using AI at the expense of student mental health and privacy

By Emma Kilburn-Smith, January 13, 2021—

This past semester held new learning challenges and opportunities for University of Calgary students and educators alike. Some students embraced the new attire and classroom setting of pajamas and sweatpants in bed, while professors and teaching assistants (TAs) had no choice but to befriend the new technologies, in all their glory and awkwardness, that made classes possible this year.

Professors and TAs learned the true meaning of “teaching into the void.” Yet there were surprising moments of connection brought about by the shared experience of 2020’s unprecedented events and the strange circumstances in which we all found ourselves: the unexpected intimacy of inviting people into our homes virtually, and of course, the joy of pet cameos that united animal lovers globally. But despite these shared experiences, the online environment is still a poor substitute for physical classrooms that facilitate the development of trust and connection between students and teachers. This crisis in relationships is a major factor contributing to the appeal of surveillance software in the virtual classroom.

Since the start of the pandemic, universities have scrambled to find ways to maintain academic integrity in the online classroom, with special consideration for how to accurately test student learning in unpredictable environments like private living spaces. Companies have emerged to fill this need by offering e-proctoring software on contract, but the technology is still new to the classroom and is already raising ethical and educational concerns.

Dr. Ceceilia Parnther is an assistant professor of administrative and instructional leadership at the School of Education at St. John’s University in New York City and is studying the use of e-proctoring services in the classroom. Her research specifically involves e-proctoring software that requires users to authenticate their identity in order to participate in the online learning environment. According to Parnther, these technologies approach authentication in numerous ways: facial recognition; having students scan their physical space to show that no one else is in the room and that there are no ambient sounds; monitoring eye and mouth movements to determine whether students are looking away from their exams or speaking to someone during the exam; and even locking down the browser and internet connection so that students can’t open other windows to search for answers while writing exams.

After the exam is completed, the software’s algorithm calculates a trustworthiness percentage, which accompanies the student’s exam when it is sent to the professor or TA for grading. Trustworthiness is based on the software’s evaluation of how easily the student was able to authenticate themselves and how consistent they remained during the exam. The grader then decides whether to review the report themselves or to rely solely on the software’s evaluation in determining whether the student committed academic misconduct. Parnther says that graders often don’t review the trustworthiness report and instead simply assign zeros for academic misconduct.
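To make the mechanics concrete, here is a purely hypothetical sketch, in Python, of how a trustworthiness percentage might be assembled from the kinds of pass/fail checks Parnther describes. The check names, weights and example are invented for illustration and do not come from any actual e-proctoring vendor’s algorithm.

```python
# Hypothetical illustration only: the checks, weights and example below are
# invented for this sketch and do not reflect any vendor's real algorithm.

CHECK_WEIGHTS = {
    "face_match": 0.40,         # facial recognition matched the student's ID photo
    "room_scan_clear": 0.20,    # no other people or devices visible in the room scan
    "no_ambient_voices": 0.15,  # microphone picked up no other speakers
    "gaze_on_screen": 0.15,     # eye movements stayed on the exam window
    "browser_locked": 0.10,     # no other tabs or applications were opened
}

def trustworthiness(flags: dict) -> float:
    """Return a 0-100 'trustworthiness' percentage from pass/fail checks."""
    score = sum(weight for check, weight in CHECK_WEIGHTS.items()
                if flags.get(check, False))
    return round(100 * score, 1)

# Example: a student whose face was hard to authenticate and who looked away
session = {
    "face_match": False,
    "room_scan_clear": True,
    "no_ambient_voices": True,
    "gaze_on_screen": False,
    "browser_locked": True,
}
print(trustworthiness(session))  # 45.0 -- a grader might read this as suspect
```

Under a model like this, a student who struggles with poor lighting or who is interrupted mid-exam loses “trustworthiness” for reasons unrelated to what they actually know, which is precisely the concern the researchers below raise.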

There have been many anecdotal reports worldwide about how these new methods of testing are increasing student anxiety and the stress of exams. Students are no longer just worrying about the content of their exams, but also about their testing environments, their bodily movements, the reliability of their technology and their internet connection. Furthermore, these anxieties may be more pronounced for students belonging to a racial minority group or coming from a lower socio-economic background, as they are stigmatized and disadvantaged by the new testing methods.

This is a topic of interest for Parnther, who says that when speaking about equity in this context, she is talking about how “certain students are suppressed further because we’re automatically jumping to surveillance.” Most e-proctoring software relies on artificially intelligent (AI) software to authenticate and monitor students. 

In regard to AI’s influence on minorities, Parnther says that “the science is there. AI is still catching up on being able to discern the facial characteristics of minority groups.” Some examples she gives are of how automatic soap dispensers often struggle to detect darker skin and how Apple’s Face ID has struggled to distinguish between the features of Asian women. In the case of student examinations, there have been instances where students of colour were told to find better lighting because the software did not recognize their faces. Parnther also questions the potential impacts this could have on students who identify as non-binary. Algorithms operate in a binary way, she says, forcing students to identify themselves in normative terms that make sense to the technology.

In addition to the ethical implications of identification, there is the question of data collection. Parnther explains how “part of the participation of using this software is signing off on someone being allowed to capture and store this information.” Companies are not always transparent about who has access to this information, and in the case of minoritized populations, this issue comes with historical baggage. Parnther says that in the United States, marginalized populations have had their data used for ill gain and have been lied to about the circumstances surrounding data collection. For some students, the idea of being surveilled could trigger deep and complex anxieties.

Parnther is careful not to attribute malicious intent to the development of e-proctoring technologies. She says that these oversights aren’t because “some engineer decided that they didn’t want everyone to be able to wash their hands, but when you have that many oversights and it extends into student learning and assessment then it’s something that we really have to pause and ask ourselves about.” She says she doesn’t believe these companies set out to be racist or biased, but “when normative assumptions are being placed to determine trustworthiness, we have a problem. Because my skin tone doesn’t match what the algorithm can easily pick up, so I have to adjust, or my eye movements don’t match what the algorithm says is appropriate so now I have to adjust […] how many adjustments am I making before I even get to the business of demonstrating learning?”

Dr. Sarah Elaine Eaton is an associate professor at the Werklund School of Education at the University of Calgary and is also concerned with the impact this technology could have on the diverse student body at the University of Calgary. She describes a recent case where a student living with family was writing an exam using e-proctoring software, when an older member of the family entered the room and, in the family’s native language, started questioning the student about what they were doing. Despite the student’s desperate attempts to get the family member out of the room so that they could complete their exam, the test was flagged as a case of misconduct.

Eaton uses this as an example to highlight how e-proctoring software could “discriminate against students who might be living in multi-generational homes with other people of the house who might not be familiar with our educational system or possess English language abilities.”

She points out that the agreements between e-proctoring companies and universities are often multi-year contracts, so there is no trial run for these services. The technologies are also very expensive, which leads Eaton to question whether that money might be better spent on staff and professors to support students through this period, rather than on technology whose effectiveness is still being contested. Even now, companies are developing tools to undermine e-proctoring software, for example by manufacturing video loops to defeat its monitoring capabilities. There is also no data on how much the university would pay in increased labour costs if this software were implemented.

“I wouldn’t be surprised if the technology added significantly to academic workload for professors and TAs who will be spending a lot more time reviewing videos for cases of academic misconduct,” said Eaton.

For now, the University of Calgary stands apart from its Calgary post-secondary counterparts, SAIT and Mount Royal University, which do make use of e-proctoring. MRU, however, did not use it in any for-credit programs in the fall semester and has no plans to do so in the winter semester either. The U of C has announced that, at least for the winter semester, it will not be using e-proctoring software for exams either, but Students’ Union president Frank Finley cautions that “the university could change its mind at any point.” He encourages students who may be concerned about the implementation of this technology to make their voices heard. Students can reach out to the Students’ Union, the heads of their departments or the dean’s office of their faculties to make their feelings known. Their feedback can then be saved “to continue the fight in the future” if need be, says Finley.

Finley was involved in the crisis management team at the University of Calgary over the summer months and had been advocating against the implementation of these technologies from the beginning. He observed that a lot of the support for e-proctoring technologies came from seeing how other universities in Calgary and across Canada had chosen to sign on with these companies, but Finley believes that it is “a poor excuse to do something simply because other universities are doing it themselves.” 

Eaton echoes this skepticism, saying that up until now, universities have only been hearing about the success of this software through company sales pitches, but “sales pitches aren’t data.” She says that as a research institution, the University of Calgary should be basing its decisions on research.

It is important that students remember that they have a say in their education and have a right to stand up for themselves, especially if they feel that their learning experience is being compromised or is at risk. Parnther says that she believes “change only really comes when students stand up and say ‘this is uncomfortable… I have anxiety over this. I can’t participate in the learning experience because I have to do all of these things just to be able to take an exam to show what I’ve learned.’ That’s not why you come to college. I think change comes when students realise that they have rights and recognize that we should be assessing learning in ways that don’t devalue them as people.”

Editor’s Note: A previous version of this article stated that MRU had been using e-proctoring since the fall, but we have since learned that it was not used in for-credit courses.

