The headline of the lead article in the Feb. 27, 2024, edition of Inside Higher Ed says, “Facial Recognition Heads to Class. Will Students Benefit?”
I have an answer to the question posed at the end: no.
No, facial recognition will not benefit students. It will not benefit instructors. It will not benefit institutions. Instruments of surveillance and social control have no relevance to learning and human development. This is not a concept that needs testing or exploration.
The answer is no. I’m having a hard time believing anyone thinks otherwise, at least if student learning is at the center of the conversation, but since this appears to be a real thing, let’s review the issue.
First, here is the vision for the technology being developed by Chafic Bou-Saba of Guilford College, whose aim is to design “a facial recognition system for classroom management.” As described by Susan D’Agostino in her reporting on the initiative,
“Multiple cameras spread throughout the room will take attendance, monitor whether students are paying attention and detect their emotional states, including whether they are bored, distracted or confused.”
It is hard to know where to begin.
The False God of Attention
Perhaps we should start with the fact that “attention” as it appears via externally available evidence detectable by surveillance technology is not the same thing as learning. As I described in a post back in 2017, when an earlier version of similar technology was proposed, attention is a “false god” around which to orient the classroom atmosphere. Attention can be faked. Thinking, the thing we are actually trying to engender in students, can often look like distraction.
Rodin’s Thinker was distracted from his lecturer because he was lost in thought. What higher calling is there for a moment in a classroom than getting students to lose themselves in thought?
The thing we should be focused on is curiosity, not attention, an internal state not detectable by surveillance cameras. By equating attention with engagement, we are telling ourselves a lie about how learning works.
In fact, research suggests that daydreaming is a route to learning and should actually be viewed as a kind of focus in and of itself, a focus turned inward. Why would we want to engineer this out of the classroom?
Surveillance Is the Enemy of Freedom, and Freedom Is Necessary for Learning
The increasing use of tools of surveillance has been a hallmark of the lives of multiple generations of students. Consider the now-ubiquitous courseware and learning management systems that monitor student work and grades, making them an ever-present concern.
Additionally, many students have been subject to explicit behavior tracking apps like ClassDojo, where they are monitored and “scored” in real time. This behaviorist approach narrows the aperture through which students are allowed to pass if they’d like to be judged successful in school contexts.
A culture where standardized testing is both ubiquitous and of significant value can also be seen as a form of surveillance where students are expected to fit a specific—once again narrow—profile in order to advance in their educations.
The predominant response to these impositions of surveillance has been an understandable combination of fear and anxiety. Ask yourself how student attitudes toward school have fared as surveillance has become increasingly normalized in school spaces.
In addition to curiosity, one of the most important things we can help students do is establish and build a sense of personal agency and self-efficacy. Monitoring their behavior in real time with the presumption that an instructor or algorithm will intervene if they exhibit undesirable responses is the antithesis of achieving this goal.
What’s the endgame—lecture hall seats wired with devices capable of delivering a shock if students are found to be inattentive?
There is no scenario where this ends well once it is allowed a presence in our classrooms.
First Do No Harm
Let’s imagine that this technology can accurately identify different moods based on facial recognition. (This is already a dubious proposition, but let’s imagine.) Even in the best-case scenario, there will be false positives, where students who are actually paying attention are flagged as distracted.
What should students do when they are unjustly flagged by the surveillance system? Should they shrug it off? Should they dispute it, knowing that it will be impossible to prove they were not distracted, given that engagement is an internal state and the technology is reading their outward appearance?
What are the consequences of a system like this, where students can be—at least in theory—punished by an algorithm that cannot be proved wrong, even when it is wrong?
Back in 2019, I posited the necessity of a Hippocratic oath when it comes to the use of algorithmic intervention in education; if students are going to be subject to algorithmic interventions detached from human judgment, any employment of this technology must be predicated on a “first do no harm” basis.
In her reporting, D’Agostino has a rundown of just some of the past misuse and abuse of facial recognition technology and the way it has perpetuated systemic harms. A full accounting of the problems associated with facial recognition would take several volumes of writing.
Surveillance cameras in classrooms cannot possibly pass this test. And if we add some kind of human element to the mix, we lose even that threadbare strand of rationale for its use—its ability to automate something that’s beyond human capacity.
Students Have Rights
Students have the right to be bored. They have the right to get bad grades and not take advantage of their educational opportunities. Thank goodness, because if they didn’t, I would’ve been kicked out of college.
Students are not economic units subject to behaviorist control around maximizing performance. They are people, individuals who should not be subjected to this kind of treatment as a precondition of accessing their own educations—educations they are paying for.
So, what should we do? In this case, the best route is refusal. Students must be given the right to opt out of the monitoring, and they should exercise that right. Should edicts come from administrators, instructors should refuse to implement this technology in their classes. (If instructors don’t see how this technology will also be turned against them, they should think harder.)
This should be a nonstarter in educational spaces. I hope it goes away as quickly as possible.