The 2024 Preview Post: Welcome to the Toxic Waste Dump

Hello! If you’re reading this, I hope you’re thinking of joining us on this journey (or maybe you’ve already signed up). Welcome. I want to tell you a little more about what you can expect from these posts.

If this is your first Digital Detox with us, you can check out the archive to get a sense of the vibe. Last year, we talked all about AI, and the most popular post was one about AI and writing instruction. The year before that, we talked about systems and burnout and the “return to normal,” and the most popular post was about the critical importance of shared governance. There are four years of archives to peruse, but those posts give you a good idea of what you’re signing up for.

The theme for this year — Welcome to the Toxic Waste Dump That Is Educational Technology in 2024 — is a bit aggressive, to be sure. But it comes from an honest place.

When I started working in educational technologies in 2019, I was extremely optimistic about the opportunities technology offered to our learners. At that stage of my career, I had been a classroom instructor in higher education for nine years, and I knew what technologies offered my students from a teaching and learning perspective. I was excited about all of it, always exploring new toys and new tools and always eager to understand the next big thing. But I very rarely had any reason to look under the hood of a tool, and while I would have characterized myself as having an above-average understanding of our provincial privacy laws as they related to my job, I really didn’t think about those things too much.

Five years on and I’m a very different EdTech! Those who have followed the Detox over several years will see that most of my techno-optimism has fallen away, replaced by a cynicism about how these tools operate, how much they cost, what they take from our students, and how our universities have been content to outsource and offload and devalue in-house expertise in favour of “the cloud.” I know I can come off as a luddite (spoiler: I am in fact super pro-luddite) and I tend to irritate everyone because IT evangelists think I’m an idealist and classroom teachers who just want to get something done find me obstructionist. Good to know thyself.

[Image: An AI-generated image of Brenna as a FunkoPop character, complete with display box, wearing chunky glasses, a cardigan, and skinny jeans. Caption: Portrait of your author as an AI-generated trinket.]

But this perspective on technology didn’t emerge out of nowhere. Over the years, I have become increasingly aware of the appetite technology companies — and maybe especially educational technology companies — have for learner data. But I’ve also learned just how much harm our sector is willing to allow these companies to cause in the name of supposed shared values like rigour and student success. Institutions spend a lot of time weighing risk, but the things I think matter — like the reputational risk of aligning with test proctoring software that is demonstrably inequitable or generative AI tools built on a philosophy that runs counter to the very spirit of the university — never seem to be part of that equation.

See, I’m a buzz kill.

Every technology we adopt is a choice made by someone, somewhere. There is no inevitability. You don’t have to be a Google school or a Microsoft school. The use of plagiarism detectors, or learning management systems, or habit trackers, is a choice. Sometimes it’s a choice made on our behalf — truly, nothing annoys me more — but it wasn’t ordained by the gods. We have this sense, often, that technologies just happen to us. But every university-sanctioned technology is the outcome of a procurement process, one that weighed (or failed to weigh, also a choice) moral and ethical concerns against efficiencies and cost and came to a conclusion. This is true even of the bad tech. This is true even when we pretend we didn’t know, couldn’t know.

This is the fifth anniversary of the TRU Digital Detox, and probably its last iteration in this form as I move on to other projects. So we’re going out with a bang. I’m going to try to show you, over 10 essays this month, how I came to be like this and why I believe the technologies our education system has come to rely on are so very toxic. We’ll talk about equity and access, privacy and security, data hygiene and harm. We’ll talk about how the problems in our classrooms we are trying to solve with technology are real, and we’ll talk about why the solutions that have been sanctioned are so bad.

But I’m not really here to depress you. I’m here to help you understand the scope of the problem and to encourage you to want to fight back. I operate from the position that everyone working in higher education wants it to be an equitable, ethical space; they just might have lost track of how we achieve that and of the important role technologies play. Every essay will also share other choices we could make and harm-reducing approaches to using the tools we can’t escape, as well as strategies for resistance and refusal.

So please, register for this year’s Digital Detox and join me as we explore the technologies that make up our working (and learning) lives and find the language to demand better.
