Kevin Jones is an experimenter who does not hesitate to dive in and learn alongside his students as new technologies emerge. His courses focus on organizational development and organizational change, and his teaching is attuned to forces that create change within organizations. When ChatGPT emerged in late 2022, he quickly came down on the side of "Let's use it. Let's help students use it. Let's create the expectations for how it's used."
Kevin Jones
Director, Center for Teaching and Learning & Associate Professor of Management, IU Columbus
By the fall of 2023, most students were aware of generative AI, and they had different opinions and perspectives on its use. Jones took those ongoing conversations as an opportunity to help students understand the potential and the drawbacks, and, most importantly, how to use the technology effectively. For him, the tools were part of teaching a job-ready competency.
Students may not use the exact tool or same process in the workplace, but Jones believes that familiarity and comfort with generative AI will help. Jones also created a very clear expectation: "We are not going to use this to replace human thinking. This is your thinking, augmented by artificial intelligence." And he avoided dictating which tools to use, instead pushing students to develop their capabilities and use the best tools for them.
Jones is now fully integrating generative AI into his courses: it will be part of his course projects, assignments, and examinations. For example, he previously had students complete a design project using what he jokingly refers to as "old-fashioned Adobe tools." Now they're using Adobe Firefly.
The results have far exceeded his expectations. The student groups that fully embraced generative AI produced work far better than the examples he had created to explain the project deliverables, so much better that Jones asked how the students did it. Here's their explanation:
- They started off with some prompting, and they didn't get what they wanted.
- They kept experimenting with it.
- They took their prompt to generative AI for advice on how to prompt more effectively.
- They took it back to Firefly.
- They kept working and tweaking it until they got a design that was marvelous.
"What always makes an instructor feel good is when these students exceed your expectations," Jones says. "And that's literally what they did. They started creating their own pathways to learning, which is exactly what I'm trying to teach in terms of critical thinking, problem-solving, and collaboration."
Inspired by the results, Jones is taking things a step further by incorporating generative AI into his examinations. While this might come as a surprise to many instructors, he sees it as realistic: with tools like generative AI available, there is very little that students can't find in a couple of clicks. He would rather have them do the work he believes is necessary when using these types of tools.
His examinations model that necessary work. Students can't just take the answer at face value: he requires them to validate their sources and challenge supposed truths. In other words, he pushes them to build discernment, the ability to judge whether a given piece of information is sound. Jones lays out the process in three steps:
- Use your textbook to come up with an initial response.
- Use generative AI to see how you can supplement your response. Compare and contrast what the book says versus what generative AI says.
- Validate whether the generative AI got it right. Start by tracing where the generative AI got its information, then provide a final answer based on what you find.
Overall, the process underscores that generative AI is not always an authoritative source. Jones stresses that students need to "question it, think through it, look for other information, compare it, and make a decision as to the merits, quality, and accuracy of the information."
Jones' advice for other faculty: Join something like the Digital Gardener Initiative so you can experience generative AI with others. More importantly, go to a training session, hear what people are talking about, and learn about the tools. For him, this is non-negotiable. All instructors need to learn about generative AI; only then can they decide whether it will work well in their classes or whether it creates too much risk of cheating.
"Our work is to design assignments, assessments, and projects that allow people to use these tools in an ethical way," Jones says. "And yes, create enough understanding. We should be learning as much as we can about these tools, and then we can use our discretion as to how best to manage the use of the tools in our classrooms."