When students read and collaboratively mark up a text, does that meaningfully affect their engagement with the content and their overall learning in the course? If so, does that impact translate into improvements in their writing?
Hodgson wanted to find out whether students reading and collaboratively marking up a text might have a meaningful impact on their learning in a course. If it did, he planned to investigate whether that engagement translated into improvements in their writing. Hodgson used Hypothesis, a social annotation tool available in Canvas, to power these practices.
Working with the eLearning Lab gave Hodgson access to kinds of data he couldn't have worked with otherwise, at a previously unmanageable scale: data from 1,500 to 2,000 students per semester, across multiple classes and course levels spanning three semesters.
Without the eLearning Lab, we would not have been able to come even close to accessing the data: one, mostly because Canvas data doesn't come down nice and clean like a file set. And two, because we were getting data from Hypothesis that had to be not only de-identified and cleaned, but merged with our Canvas data, and that is a skillset that I do not possess, especially at scale.
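The workflow Hodgson describes (de-identifying and cleaning Hypothesis records, then merging them with Canvas data) might be sketched as follows. This is a minimal illustration only; the field names, salt, and join logic are assumptions for the example, not the eLearning Lab's actual pipeline.

```python
import hashlib

# Hypothetical salt; in a real pipeline this would be a secret kept
# separate from the dataset so hashes can't be reversed by guessing IDs.
SALT = "per-project-secret"

def deidentify(student_id: str) -> str:
    """Replace a student ID with a salted one-way hash."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:12]

def merge_records(canvas_rows, hypothesis_rows):
    """Join Canvas rows with per-student Hypothesis annotation counts
    on the hashed ID (an inner join: only students present in both)."""
    # Aggregate annotation counts per de-identified student.
    annotations = {}
    for row in hypothesis_rows:
        key = deidentify(row["student_id"])
        annotations[key] = annotations.get(key, 0) + row["annotation_count"]
    # Attach counts to the matching Canvas records.
    merged = []
    for row in canvas_rows:
        key = deidentify(row["student_id"])
        if key in annotations:
            merged.append({"id": key,
                           "final_grade": row["final_grade"],
                           "annotations": annotations[key]})
    return merged
```

For instance, two Hypothesis rows for the same student would collapse into one merged record carrying their summed annotation count, keyed by the hash rather than the original ID.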
While social annotation has been used and evaluated in individual courses before, those successes and challenges have not been generalizable. Conducting his research across so many students in so many different contexts gave Hodgson and his team the data they needed to determine whether Hypothesis truly has a measurable, widely applicable impact on student engagement.
The data give us access to something we couldn't do before, and then the technology involved makes it possible to do things that we just can't do at scale otherwise. So that's why it's critically important that we've been able to partner with the eLearning Lab—they've been able to help us set up the inquiry and to learn how to ask questions with the tools, to at least start to identify where we might make a subset of data we can look at with the human element.
What did he discover?
Hodgson's initial findings demonstrate that students' interactions with Hypothesis have increased their engagement with course readings. Instead of completing a comprehension-check quiz or a reading response to measure how they approached the content after the fact, students share their thoughts with their peers as they read. It's not busywork but an integrated practice that teaches them to become better thinkers, better writers, and more critically engaged readers.
The more we can prioritize the meaningful experiences that students have in that process of learning, the better their engagement is with the content.
Hodgson has found not only that the content students generate demonstrates increased engagement with their readings; he has also received positive responses both in end-of-course student evaluations and in feedback from the faculty teaching the courses. He now uses Hypothesis in nearly every course he teaches, and class conversations no longer start with, "What do you think about what you read?" Students have already shared that in the Hypothesis discussions and threads. Instead, in-class conversations simply begin at a deeper level.
For instructors, the data produced (in other words, the output of student writing) help gauge students' understanding of rhetorical concepts such as principles of arrangement and logics of argument. Language and practices keep evolving, so Hodgson's research is a way of assessing whether longtime teaching practices need to evolve as well. For him, it all comes down to students being able to demonstrate learning and engagement rather than simply parroting content back.