An abstract image showing numbers and data points devoid of context.

Toxic Tech #1: Learning Analytics and the Dangers of a Little Bit of Knowledge

What are Learning Analytics, Anyway?

As we are all increasingly aware, everything we click on leaves a data trail behind it that someone else can sniff out and use to come to all kinds of conclusions about us. Logging in to a loyalty program to check your points balance likely leaves a breadcrumb that allows that company to track where else you go online and send you targeted ads to bring you back to the fold. And anyone with an Instagram account knows that, as far as the internet is concerned, you are what you randomly click on.

So take that sense of always already being surveilled as you move about the internet, and then extrapolate to think about how many things we ask students to click on in a day, a class, a semester, a learning experience. Think about how much of the software and how many of the services they interact with are owned by for-profit ventures for whom data is the core driver of their valuation. Why wouldn’t someone try to package and market those clickity-clicks to tell us something about “learning”?

And that, in a nutshell, is what learning analytics are and do. At their most innocuous, they tell you when and how often a learner clicked on any component of your course. And they have really useful applications in some contexts: in distance learning, for example, analytics can sort of mimic the cues you get in a face-to-face classroom that a learner is starting to check out. Learning analytics tell you things like whether your students are checking in on the readings, whether they are accessing the course space, and when they hand in assignments. Seems innocuous, right? Don’t worry. I can suck the joy out of it.

So, What Is So Toxic About It?

I think it’s always dangerous to simplify the concept of learning down to what is and is not clicked on and how. I have seen “number of times LMS accessed” used as a metric for assessing participation in a course more than once in my time, and I think that’s an alarming idea. It’s an impoverished notion of what education and learning are to imagine they can be captured and assessed by clicks, or that the number of times you access the LMS space means the same thing as class participation.

But if I’m honest, my issues with learning analytics are not really with the data itself (I have perhaps myself become too resigned to datafication), but with two key issues: consent and judgement.

Yes, in a very surface fashion, your students have consented to using the Learning Management System (LMS). At most institutions, this is a click-through consent that they must accept before gaining access to the technology. Never mind larger philosophical arguments about whether a click-through is informed consent, or what it means to consent to share your data if you don’t understand the larger landscape – though I think those conversations are important to have if we work at universities that claim, in other quarters, to treat informed consent as a meaningful construct. But even this pantomime of consent can’t be imagined to be freely given if you have to click accept to gain access to course materials you have already paid for, with no obvious alternative option.

I have a larger question about consenting to the sharing of data, however: do students really understand what the professor can see when they look at the LMS logs? Probably not. This r/professors thread on Reddit (which, warning, is a pretty toxic place if you think students are people) goes into fairly snarky detail about student confusion regarding tracked data – and a sense of outrage, which I would argue is justified, once they do understand. Every time I talk to students about what actions their professors can track in the LMS, they are surprised (and we have a fairly anodyne, plain vanilla set of logs at TRU).

(It might not surprise you that when we do give students the opportunity to opt out of this kind of data collection, students who already have reasons to distrust the university – often historically marginalized learners – are more likely to choose to opt out.)

Some folks will argue that keeping this information secret from students allows faculty to catch bad actors. Maybe. But I don’t subscribe to that view. Instead, I now make a point of sharing this information whenever I do workshops on the LMS for students. I screen-share and show them exactly what the LMS is tracking of my behaviour. Because here’s the thing: if it changes your behaviour to know someone is watching, that’s not a reason for you to be spied on unknowingly. When I am alone in my house, I eat spoonfuls of peanut butter out of the jar. I don’t want you to watch me do that, so I don’t do it when I have people over. My choice to change my behaviour in different contexts doesn’t make it okay for my neighbour’s Ring camera to capture a picture of me and my peanut butter at the kitchen window.

I think about all the goofy things I used to do as a student to study for exams. I had a superstition about re-reading the first page of every assigned reading. Now, you would be able to see me clicking through and doing this on the LMS, but that doesn’t mean I’ve invited you to have a conversation with me about it. Maybe it’s silly. Maybe I know it’s silly! And maybe it’s none of your damned business.

When learning moved aggressively into the homespace in 2020, it became clear to me that learning analytics – especially those used under the guise of exam security and academic integrity – were being used to police and critique students’ homespaces in ways that perpetuated inequities. We are handed a tremendous amount of data in our professional lives as educators, but we are not often encouraged to stop and ask: should I know this? Do I need this? Is it any of my business when my student writes their paper, if they hand it in on time? Is it any of my business how often they click through my readings, if they show responsibility for them on the exam?

When did we decide we had a right to all this data, anyway?

Speaking of data, I do think there’s a marked difference between learning analytics that simply collect data and those that go on to draw conclusions about learners, and this is where I come to my second concern: judgement. Some of these tools will label a student “at risk” or flag behaviours (like always submitting papers close to the deadline) and encourage you to make an intervention. Some will label students as having “good” or “bad” habits based on this kind of data. Last year, we talked about my concerns about outsourcing teacherly judgement to AI, and my concern there stands. Learning is deeply relational, and I simply don’t believe that relationship-free data can make meaningful conclusions about students. As I said last time around:

Of course, the other thing about the machine is that it has no idea the extent to which a learner has developed before coming to the machine; it doesn’t know what the learner overcame to get to class; it doesn’t know the learner’s intentions, priorities, or goals. I’m pretty bummed out about the idea of education without those factors taken into account. 

We tend to believe that data is neutral and that the decisions machines make are less biased than the ones humans make. But I still believe those relational qualities matter, and I don’t believe data is neutral or that how we read it is separate from our own biases.

Relatedly, faculty who do choose to use learning analytics need to be appropriately trained and supported in doing so. At most institutions, the analytic tools are just… there. Use ‘em, don’t, who cares. But how we make sense of data is shaped both by our understanding of the tool and its limitations and by our own biases. If institutions are going to subscribe to learning analytic tools, they are morally obligated to also provide the support and training to ensure the tools are used appropriately.

If we can’t do that, maybe we should just turn ‘em off.

Strategies to Detoxify the Tool

Lol just kidding no one is going to turn off analytics. So here are some things you can do to detoxify how learning analytics are employed in your own courses: 

  1. Make sure you understand which of your learners’ behaviours are being tracked, and talk about it with your students. Talk about how you use the data, too – think through your use cases and share them with your students.
  2. Seek out professional development opportunities to learn how to read the data properly and to understand the limitations of the tool.
  3. If a particular data point is not relevant to you, don’t look at it. The data is there, but we don’t have to make it part of our assessment – explicitly or implicitly. It’s okay to opt out. Increasingly, I am coming to see that the power we have as faculty is in principled refusal.
  4. Avoid allowing a tool to make a conclusion about your learner for you. Only you know the contexts of a student’s learning. Maybe your student hands in every essay late because they work a time-consuming job and their writing hours are limited. The learning analytic doesn’t know your student. You do.

Comments


  1. Oh no! I just got interested in learning analytics after doing an elective on it as part of the MET. And now you’re telling me it’s toxic. It’s like when I found out Van Morrison was not only a curmudgeon (quite acceptable) but also an anti-vaxxer (less so).

    Seriously though, I think your injunction to use things like LA purposefully and with knowledge of what they can and can’t do (as well as being aware of the structural inequalities that they can help maintain) is wise.

    In your experience with LA, did you come across/read Niall Sclater’s “Learning Analytics Explained”? I think he does a pretty good job of not overselling the possibilities.

    1. I think you’ve got it exactly right here, Andrew: they need to be used mindfully. But they aren’t sold that way, especially when a faculty member gets an alert saying something like “this student is at risk.” Of course they jump to attention! We need better analytics literacy and training. Thanks so much for the additional reading. 🙂
