When Haley, a sophomore at Indiana University, took a test for an accounting class in September, she—like many college students during this pandemic—was sitting not in a classroom but in her bedroom. And instead of a teacher watching for signs of cheating, there was something new: an AI, studying Haley’s every move through her laptop’s webcam.

The university was conducting remote exams using Respondus, one of a crop of “online proctoring” tools. The software locks down a student’s desktop so they can’t switch tabs to Google an answer, and then it uses visual AI to examine—among other things—their head movements to judge whether they’re looking somewhere other than at the screen.
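
To make the head-tracking concrete, here is a minimal sketch of the kind of face-presence check such a tool might run on webcam frames. This is not Respondus’s actual code, just an illustration of the general technique; it assumes only OpenCV’s stock Haar-cascade face detector, a default webcam, and an arbitrary 30-frame threshold chosen for the example.

    # Illustrative sketch only: not Respondus's code, just a guess at how a
    # webcam "is the face visible?" check might work, using OpenCV's bundled
    # Haar-cascade frontal-face detector (pip install opencv-python).
    import cv2

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    def face_visible(frame) -> bool:
        """Return True if at least one frontal face is found in the frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return len(faces) > 0

    cap = cv2.VideoCapture(0)   # default webcam
    missed = 0                  # consecutive frames with no detected face
    while True:
        ok, frame = cap.read()
        if not ok:              # camera unplugged or stream ended
            break
        missed = 0 if face_visible(frame) else missed + 1
        if missed > 30:         # roughly one second at 30 fps; arbitrary cutoff
            print("Warning: we can't see your face any more")
            missed = 0
    cap.release()

A real product would use a far more elaborate detector and escalation logic, but the basic loop of sampling frames and flagging absences is the same idea that tripped over Haley’s slouch.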

Haley’s head was setting off alarms. “I guess I slouch when I’m sitting,” she tells me, so at one point the software flashed a scary warning at her. “It stopped my test, and it popped an alert on the screen saying we can’t see your face any more.” Unsettled, she began to stare more robotically at her screen.

Haley finished the exam and got a good grade. (I’m using only her first name at her request.) But the stress of that HAL-level surveillance? Yikes.

“Online proctoring” isn’t new. It’s been around for over a decade, used mostly for distance education or corporate accreditation tests. But as colleges that formerly taught face-to-face went remote because of Covid-19, its use spread like ivy. Mike Olsen, the CEO of Proctorio—one firm that makes such wares—says its business has grown by 900 percent since the spring. “April was one of our craziest months,” he adds.

We talk a lot about the rise of surveillance capitalism and ponder the grim future to which that Orwellian path leads. But for students? That future is now, as they try to act dutiful in front of their glowing webcams.

It’s a dreadful experience, they’ll tell you. Some systems use AI to flag possible cheating; in others, a live human employed by the firm stares at you. Oodles of Reddit posts catalog moments of violation: housemates or family unwittingly captured on camera, normal body movements flagged as illicit behavior, and the existential exhaustion of performing obedience. “This legitimately scares the fuck out of me,” one student posted.

It sets a terrible civic precedent. “We are indoctrinating our youth to think that this is normal,” says Lindsay Oliver, head of activism at the Electronic Frontier Foundation. Students trained to accept digital surveillance may well be less likely to rebel against spyware deployed by their bosses at work or by abusive partners. “What are we telling them about what they should expect for the rest of their lives?”

Universities plead that they need some way to prevent academic malfeasance, which is a real thing. A recent survey found that just over 30 percent of students admit to having engaged in some form of cheating. Administrators tell me they try to be as respectful as possible of student privacy: “We work very hard not to be invasive,” notes Brian Marchman, the director of distance and continuing education at the University of Florida. For example, his school encrypts any video or data collected by the proctorware, and it is professors—not the proctoring companies—who make the final decision on whether cheating occurred.

All fair enough. But there’s something bonkers about trying to parse the most ethical way to creep on students. The rise of proctoring software is a symptom of a deeper mistake, one that we keep making in the internet age: using tech to manage a problem that is fundamentally economic.

After all, there are other ways to assess students that minimize the chances of cheating. Rather than give multiple-choice tests, you could ask them to “do more applications-based projects or essays,” Haley says. We could ask students to engage in serious, real-world tasks: “There are a bunch of Wikipedia articles that could be worked on,” says Audrey Watters, author of the blog Hack Education. If you give students complex projects, you don’t need to ban Google, because there’s no simple answer.
