In the flipped classroom model, students engage with learning materials outside of class and dedicate class time to active learning and deeper engagement. Now imagine adding conversational AI (artificial intelligence) teaching assistants to the mix. For Antino Kim, this combination has opened exciting possibilities and become an active area of exploration in how education can evolve.

Antino Kim
Associate professor of information systems
Grant Thornton Scholar
Kelley School of Business, IU Bloomington
My biggest resource constraint is time. I typically teach technical courses, and it's a challenge because some students have very limited technical background, and some have extensive programming experience and have built their own systems. I'm just one person and I need to cater to all of them.
Given his research interest in interactions between humans and AI, Kim has been experimenting with how conversational AI agents impact teaching and learning experiences. He has explored the idea of providing students with access to always-on, personalized tutors that allow routine questions (e.g., definitions) to be answered by AI, freeing up classroom time for deeper discussion and applied learning.
In experiments, Kim and colleagues have begun to uncover insights into when and why conversational AI might be useful:
- People actually prefer chatbots over self-serve knowledge bases or search engines because it feels like somebody is paying attention to their issue and helping resolve it (the chatbot is effective in providing perceived human qualities and social presence).
- At the same time, people recognize AI is not human, which removes some of the barriers in human-to-human interaction (such as fear of judgment or the need to exchange pleasantries before and after getting what they need).
By experimenting with AI tutors tailored to course content, Kim aims to create an environment where students feel more comfortable asking questions without embarrassment or hesitation. These AI tutors can respond at varying levels of technical depth, potentially offering more personalized learning experiences, whether students need foundational explanations or more advanced implementation examples.
I can upload my course content into Copilot and configure the AI agent to answer questions specifically based on that content. That way, I'm able to provide a more tailored learning experience, contextualized to fit a student's needs, and draw boundaries around how I want the AI agent to behave.
(This helps ensure the AI agent provides accurate and relevant information, reducing the risk of "hallucinations" or misleading responses.)
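Copilot's agent configuration happens through its own interface, but the underlying idea Kim describes, grounding the assistant in uploaded course content and drawing boundaries around its behavior, can be sketched as a system prompt. The function, wording, and sample content below are illustrative assumptions, not Copilot's actual configuration:

```python
# Hypothetical sketch: grounding an AI tutor in course content with
# explicit behavioral boundaries. Not Copilot's real configuration API.

def build_tutor_prompt(course_content: str) -> str:
    """Compose a system prompt that restricts an AI tutor to supplied material."""
    return (
        "You are a teaching assistant for this course.\n"
        "Answer ONLY from the course content below. If the answer is not "
        "covered there, say so instead of guessing.\n"
        "Match your technical depth to the student's question, from "
        "foundational explanations to implementation detail.\n"
        "--- COURSE CONTENT ---\n"
        f"{course_content}\n"
        "--- END COURSE CONTENT ---"
    )

# Example with placeholder course material
prompt = build_tutor_prompt("Week 1: relational model; SQL SELECT basics.")
```

Constraining answers to the uploaded material is what reduces the risk of hallucinated or off-topic responses.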
These trials have led Kim to rethink the structure of his courses. He encourages students to review his slides and notes before class, using AI to get explanations or clarifications. When students come to class, he answers their questions but mostly provides context and makes connections. The bulk of class time is spent applying concepts to real-life cases and current news.
Ultimately, this helps Kim demonstrate why course concepts are important and what their implications are for students' lives. While this shift is still in the early stages, he finds that the sophistication of students' questions has increased. The nature of class interactions has changed (in his opinion, for the better), bringing in many more "what if" scenarios that lead to fuller conversations.
Instructors often express concerns that students' reliance on AI will lead them to outsource critical thinking to the AI tool, losing the ability to work through complex problems. Kim's approach has been to separate the development of foundational knowledge (which can be supported by AI) from in-class application and critical thinking, which are human-driven. The result, he hopes, is a more engaged and active classroom dynamic for all involved.
Modeling effective, ethical AI use
Having accepted that AI will be a part of their lives, Antino Kim's students still struggle to understand where and how to use it, and where and how not to use it. Recognizing this, Kim decided to add a generative AI-driven assignment to his project management course.
Part 1 of the assignment:
- Acting as consultants, the students conduct a client interview.
- They use the transcript of that conversation to develop formal project requirements.
- They draw up a project plan manually (without AI) to ensure they know how.
Part 2 of the assignment:
- Students craft a generative AI prompt for use with the interview transcript.
- The students run their prompt and evaluate the results: How well did the generative AI tool model the use case, identifying all the actors and writing out the requirements?
- The students continually refine the prompt to see if they can get the AI tool to deliver something as good as or better than what they produced in Part 1.
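The iterative loop in Part 2 can be sketched in code. The prompt wordings, transcript snippet, and helper function below are hypothetical illustrations; the article does not specify which generative AI tool or API the students use, so the actual model call is left out:

```python
# Minimal sketch of Part 2: wrap the interview transcript in a prompt,
# then iterate on the instructions. All prompt text here is hypothetical.

def build_requirements_prompt(instructions: str, transcript: str) -> str:
    """Combine the team's current instructions with the raw interview transcript."""
    return f"{instructions}\n\n--- INTERVIEW TRANSCRIPT ---\n{transcript}"

# Successive refinements a student team might try; version 1 is deliberately vague.
refinements = [
    "Summarize this client interview.",
    "From this client interview, list the actors and write formal "
    "project requirements.",
    "From this client interview, identify every actor, model the use case, "
    "and write numbered, testable project requirements, noting ambiguities "
    "to clarify with the client.",
]

transcript = "Client: We need a portal where vendors can upload invoices ..."
prompts = [build_requirements_prompt(r, transcript) for r in refinements]
# Each version would be sent to the generative AI tool, and its output
# compared against the manually produced plan from Part 1.
```

The point of the exercise survives the abstraction: each refinement makes the success criteria more explicit, and the students judge the tool's output against their own Part 1 baseline.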
By incorporating AI literacy into his courses, Kim hopes to teach his students what AI is good at, where it might fail, and how to make responsible decisions about using AI. In other words, he hopes to model accountability in the use of AI.