The case took place in Florida. A 17-year-old girl taking a distance-learning biology class at Broward College received a warning for cheating. The reason? The start-up Honorlock had allegedly caught her, thanks to its artificial intelligence (AI), behaving suspiciously during an exam in February. The New York Times looked into the case after receiving an email from her in which she described herself as "a Black woman wrongly accused of academic dishonesty by an algorithm."
The student was given a grade of zero as a result of the incident. The AI is not solely responsible for the sanction: the academic bureaucracy, several humans, and a facial recognition tool designed by Amazon (Rekognition) also played a part.
The Covid-19 pandemic was a boon for companies offering remote exam proctoring. These companies developed numerous browser extensions to detect movement or record computer screens and microphones, methods traditionally used by the police.
Honorlock is one of them. Founded by two business-school graduates, the start-up proctored more than 9 million exams in 2021, charging $5 per test or $10 per student. A total of $40 million (approximately €37.4 million) has been invested in the company, the vast majority since the start of the pandemic.
A question of interpretation
The teenager is not the first person to complain about a false accusation of cheating by surveillance software, but she is one of the few to have received "proof": a fifty-second video of herself.
Distance learning was already expanding before the pandemic, notably as a way to save time, but it has its limits. In 2021, for example, an art history student emailed the teacher of a pre-recorded course, only to find out that he had been dead for two years.
During the exam in February, the teenager followed all the rules set by Honorlock: do not use your phone and stay alone in the room, among others. Nevertheless, she later received an email from her teacher informing her that the system had flagged her for cheating. It read: "You were seen looking down and to the side several times before answering questions."
After a meeting with this teacher (whom the teenager had never met before), where she tried to explain that she was simply thinking while looking at her hands, the verdict came down: "responsible for non-compliance with instructions." Cooper Quintin, a member of the NGO Electronic Frontier Foundation, is outraged: "Who stares at their exam for the entire test? It's ridiculous, it's not human. Normal behavior is penalized by this software."
According to him, the problem is that institutions take the results of AI analyses as "the word of God." Honorlock spokeswoman Tess Mitchell responded: "In no case do we definitively identify cheaters; the final decision belongs to the school." It is therefore up to the teacher to decide whether to sanction the teenager for this "suspicious behavior."