
Proctor software is more risk than benefit for students

No exam is so crucial that it should require students to be recorded while taking it

Exams are stressful enough without computers spying on students. Illustration: Michelle Chiang/The Peak

By: Nicole Magas, Opinions Editor

Students in MATH 155 were shaken up recently when they received an email informing them that their remote exams would be monitored via proctor software. The news came with alarming suddenness, and although the email stipulated that they could drop the class if they wished to avoid this invasion of privacy, many students pointed out that the email was sent after the deadline to do so had already passed. 

Not only does this unfairly corner students into accepting software designed to monitor their microphones, webcams, and internet usage, but proctor software as a whole prioritizes a gross invasion of student privacy over more creative, and more effective, methods of student assessment. The personal costs of using this software far outweigh its benefits, and it should not be allowed in SFU classes.

Proctor software is essentially meant to emulate the experience of taking a timed exam in the presence of an instructor or an in-person proctor. Whereas in an in-person class the instructor and/or the TAs would watch students for signs of cheating, with proctor software, students are required to let the program record them through their webcams and microphones and monitor their internet activity. The program then assesses student behaviour for signs of cheating and passes its results, rather than the recordings themselves, on to the professor. This should raise red flags for a number of reasons.

Even though the data being recorded isn’t reviewed directly by professors, it still has to sit somewhere while it is being assessed. This leaves it vulnerable to theft because, as we all know by now, no amount of security can completely safeguard our data on the internet. That’s just not how the internet was designed. The question of who would ultimately be responsible if there were a data breach, SFU or the software company, should also worry students, as muddled accountability makes it harder to obtain compensation for harm. 

The issue of security also raises the question of whether it is even possible to completely “shut off” the software outside of an exam. Google and Facebook are notorious for using their technology networks to gather data on their users. What is stopping proctor software from operating quietly in the background on behalf of a third party, collecting data even when it is not actively in use?

And then there’s the issue of relying on AI to accurately assess human behaviour. How sensitive is the algorithm, and what happens if an honest student is mistakenly flagged for “cheating”? Would the recording then go to a human for review, and would that not itself be an invasion of privacy?

But perhaps the biggest problem with courses such as MATH 155 using proctor software is that it isn’t even that effective. In the recent Town Hall, Vice-President Academic and Provost pro tem Jonathan Driver admitted that it does little to prevent cheating, and that the university would prefer that other methods of assessment be used instead. 

And this underscores what education innovators already know: the attempt to completely recreate the in-person learning environment in a digital format rests on the assumption that current methods of assessment are the best we have. That just isn’t the case, and as our world increasingly relies on critical thinking, problem solving, and interpersonal intelligence, testing “low-level skills” is a barebones way of judging whether students are being prepared for life outside of university.

In short, students are being asked to agree to give up their privacy — after the time to opt out has passed — in order to be assessed in an ineffectual, archaic, and anxiety-inducing manner, when other methods of assessment like open-book exams, collaborative work, or applied projects are much better at engaging with student learning. And they have every right to be angry about it.
