A stencilled camera on a cement wall.

Every Breath You Take, Every Move You Make: Surveillance and the True “New Normal” of the Moment

Happy Valentine’s, Detoxers. Enjoy the earworm. PS. Anyone who claims to find this song romantic has got to be lying.

When everyone first left the campus and went home to teach and learn remotely, there was a lot of talk about how we would reduce assessment loads and reframe our evaluations to meet the moment. Many institutions offered a Pass/Fail option in the immediate term and instructors were encouraged to find compassion for students in crisis.

And then, you know, universities gotta university. By January 2021, students were reporting a perception of increased workloads and by June 2021, the sector had decided everyone was cheating and somehow the idea that our processes should change in a, you know, global pandemic became a lot less attractive. Instead, what became a lot more attractive was surveillance.

I think there are two things to remember when we think about how and why surveillance technologies were suddenly everywhere as 2020 became 2021.

  1. The pandemic has increased tolerance of a lot of restrictions on civil liberties among everyday citizens, for obvious reasons: we want the pandemic to end, and many of us see some restriction as a reasonable price to pay. Those restrictions have also included surveillance, in the form of contact tracing and digital health records. So surveillance, in general, is up, and so, probably, is our communal tolerance of it.
  2. Not having our students in front of us is weird, and some of us respond to situations that are weird by trying to assert control. Are people doing what we told them they should be doing? How can we know? This is the same impulse that results in remote workers being surveilled by employers, and I think our institutions have leaned into this impulse in a big way.

The surveillance was bad enough during the fully remote period, but what I find even more concerning is that many of these technologies continue to hang around. We got used to having access to certain kinds of information about student behaviours as they work through our courses, and now it seems to be an expected layer of data for us to navigate. As our students increasingly use our digital platforms to move through our courses, we’re playing the role of Sting, watching every click they make. We have access to tremendous amounts of data that tell us when our students are working on what, and how they are moving through the materials we provide. But why do we want it?

In general, I think the best pedagogical position is to trust students. Trust them to do the work, most of the time, mostly when it’s supposed to happen. My caveat is that I am not teaching anyone how to save a life, but trust has always been enough for me. We know that some students cheat sometimes, but that cheating is highly context dependent, and that there are lots of pedagogical safeguards against it. We also know that people don’t always do the thing they are supposed to do when they are supposed to be doing it. I haven’t paid 100% attention in a meeting in approximately 39 years (I am 38). I still mostly get my work done. Sometimes I am doing it very late on a Sunday night before a Monday deadline (no comment), but it gets done.

But that’s a lot of wiggle room for an industry that is predominantly about credentialing people, and I get that. The credential needs to “mean something.” But I question how we define that. Why does it “mean something” to be surveilled while taking a badly designed exam (rigour!), while the ineffective exam itself doesn’t raise questions about whether the credential “means something”? Often we resort to e-proctoring instead of revising exams to assess learning more effectively. What if we assessed the value of a credential not on our perception of cheating or dishonesty, but instead on a true measure of academic integrity: the quality of the instruction and assessment itself?

I’m not going to spend this week’s essay railing against specific tools like e-proctoring and learning analytics. Lord knows I have done that in these pages before. I want to talk about the ethos of surveillance and why I think it’s actually anathema to education. I want to talk about the behaviours and expectations we are really normalizing when we lean into the tools. And I want to propose some opportunities for resistance: individual, yes, but also collective.

I argue that surveillance is anathema to education because none of these tools are capable of measuring learning. Analytics and trackers can tell us a lot of things, like where and when students click on course materials, where and when they work on their assignments, and how they use their bodies while taking exams remotely. But none of those things are measurements of learning. Indeed, learning is notoriously hard to measure. We rarely know how much our students know about a topic at the beginning of a course, so instead of measuring learning we are usually measuring against a benchmark of acquired/retained content. We tend to measure performance, not learning. But the data we’re talking about here doesn’t even do that: it measures compliance.

Sometimes I think that the more of these tools we have access to as faculty, the more likely we are to mistake compliance for learning. It’s easy to get to a point where we evaluate things like participation and engagement based on these data points; I see it every day. We can’t help it: we measure what is possible, not necessarily what matters. And if you’re struggling with an ambiguous evaluation category like “participation” (there’s a reason I gave up on that unwinnable war), these tools can look attractive. But it’s our job to try to measure what matters.

One of the things that concerns me most about all this surveillance-generated data that universities have access to about student behaviour is that most of us have about zero training on how to parse it. You know where in the learning management system your student has clicked, sure, but what does that mean? You know when they opened the Word file, but what does that mean? You know they looked away from the computer during an exam, but what does that mean? In all of these cases and many more, making meaning demands making a series of inferences, and those inferences are loaded with all kinds of assumptions about the behaviours of a good student. Where and when students click is basically meaningless when you think about all the extraneous factors at play: maybe they got the assignment guidelines from a friend! Maybe they read the article already for another class! Where and when a student works on an assignment probably has more to do with their work and class schedule and circadian rhythm than their working habits, and maybe they just don’t use the university-provided word processing app. And the assumptions we make about how students should look while writing exams, the same assumptions we then build proctoring AI on, are deeply ableist: what if my body needs me to move when I focus, and what if I don’t stare at the computer when I think?

What does it say about the community of educators that we think we can somehow make meaning of these data points?

And my final question is this: what world are we preparing students for? Surveillance apathy is a real and troubling concept: as surveillance feels like an increasingly present part of our lives, we feel increasingly powerless to resist it. It’s hard to live in a state of threat, so we begin to shrug it off. But treating privacy cavalierly puts us at risk, and when we normalize surveillance as part of the post-secondary learning experience, we help make it normal in our students’ professional lives, too. We’re complicit in boiling the frog. That’s bad enough, but when we frame surveillance and e-proctoring and analytics as normal or, worse, neutral, we’re eliding the reality for racialized, disabled, and gender non-conforming students, who are disproportionately negatively impacted by data surveillance. We should be arming all of our students to fight against and protect themselves from these technologies, not gifting the surveillance state a cynical and compliant population.

So what can we do to push back against surveillance practices within our institutions?

  1. Don’t look. This sounds like a non-step, and it kind of is. Refuse to engage with these tools. Whatever is built into your learning management system, whatever tools you are offered: just opt out. This can happen at the individual level, but it can also be a collective decision within departments and faculties. Things get stickier in situations where articulation or licensure depends on e-proctored exams, for example, but we can refuse where refusal is possible while we work on the bigger fights.
  2. Tell students. One thing I prioritize when I work with students is being explicit with them about what behaviours are tracked by our learning management system (which, thankfully, is on the less intrusive end of the spectrum). I show students the logs in the LMS so they have a clear understanding of how they are tracked. They need to know this information to make informed decisions about their own course behaviours.
  3. Challenge assumptions and ask why. When these tools are discussed, ask what problem they are trying to solve. Ask what assumptions underpin the problem and the solution. Have the conversation out loud and invite colleagues and decision makers to think differently about students and their relationship to teaching and learning.

I hope you’ll share other ideas for resistance in the comments below. These tools are here now and they won’t go anywhere unless we resist and refuse them. The Sting song creeps us all out for a reason — the idea of a random guy, however tantric, watching our every move is deeply disturbing. May it always upset us. May it never be normal.

Don’t forget! The Detox goes live this week, with our synchronous chat session on Friday, February 18th at 11:30 am Pacific / 2:30 pm Eastern / 7:30 pm GMT. You can register here and no, you don’t have to be part of the TRU community to join in: all are welcome! This is a good place to come and chat with people who care about the same things that you do. Remember: collective refusal, collective resistance. Come and find your people.


2 Comments

  1. In my institution there’s a pretty solid majority who agree with the idea of authentic assessment and reworking assessments instead of using spyware tools. But… there’s also push-back when it comes to doing the actual work of redesigning assessments, often for very valid reasons. And it *is* work. Sometimes instructors don’t come equipped with instructional design (ID) skills; sometimes they lack the time to do the work (being paid only for teaching time, for example). In some instances the college’s desire to implement proctoring has less to do with a perception that students are cheating in online tests and more to do with seeing it as a tool that lets instructors carry on teaching as they always have, without having to alter the tests they’ve been using for years.

    I’d be interested to hear how other postsecondary institutions have managed to square the circle – does your institution provide ID support and protected time to rework assessments, for example?
