Preparing Our Classrooms for AI This Fall

By Lew Ludwig

As we approach August, my thoughts inevitably turn to classes and syllabi. However, this year feels different. I'm determined to be more intentional about incorporating generative AI into my teaching. I'm guided by two key principles: first, more students are likely to use generative AI, and second, graduates who cannot think, write, and work with AI will face significant disadvantages in the workforce.

With this in mind, I want to share and discuss a six-step AI policy template from Teaching with AI by Bowen and Watson.

1. When is AI use permitted or forbidden? Why?

As we've acknowledged, students are likely to use AI. The key question is, how much should we permit? Can a student ask AI why they got a quiz question wrong and use it to create similar practice problems for next time? Can students use AI for homework? Can they discuss homework with classmates? My approach is to assume that any work done outside the classroom will involve access to generative AI. Rather than focusing on catching rule breakers, I aim to level the playing field as much as possible by allowing total access to AI. It's important to note, however, that equity issues remain, particularly the gap between students relying on free versions of AI tools and those who can pay for more capable ones.

2. If AI is allowed, must students share their AI prompts with you as part of assignment submission?

Okay, suppose you embrace AI and allow its use. Will you require documentation of the prompts students use? For me, this is a slippery slope that leads straight into AI detection. If students are using AI "nefariously," they can easily share prompts that seem less nefarious. Moreover, due to the inherent randomness of AI, verifying the validity of shared prompts is challenging: responses can vary significantly even with the same input. As such, I will not be asking for AI prompts—unless, of course, one is particularly brilliant, in which case, I might just ask to learn a new trick myself!

3. How should AI use be credited?

While I will not require students to share their prompts, I will ask them to give credit when they use AI, especially in responses to open-ended questions. For example, around week three or four of the semester I like to ask this question:

What are your goals for your class participation grade? Are you satisfied with earning a ‘C’ with regular attendance and focus, or are you aiming for something higher? If you are aiming for something higher, please share three goals you have in mind that go beyond class attendance and participation.

Even without AI detection software, it is very clear when students just copy and paste this prompt into AI. The responses tend to be bland and lack substance. While these questions may not heavily impact their grade, I do expect students to give them some thought. It's frustrating to read a generic AI-generated response.

4. A warning about the limits of AI.

We all have our limits, and it's crucial to be transparent about when AI use is allowed and when it is not. For instance, this fall I will teach calculus, and as noted above, students are allowed full access to AI outside of class. The structure of this class will involve weekly quizzes based on homework assigned the previous week. Homework is to be completed outside of class. On quiz day, students will spend the first 25 minutes discussing the homework in groups, followed by a 25-minute individual quiz that tests their understanding of the homework material. The final grade comprises one-third homework (based on effort) and two-thirds quiz (based on correctness). This approach highlights a vital lesson: relying solely on AI to tackle homework doesn't effectively prepare students for the quizzes. Additionally, since implementing this process, my grading workload has decreased since I only need to check homework for completion.
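For concreteness, the weighting described above can be written out as a tiny calculation. This is just an illustrative sketch; the function name and example scores are mine, not from the course itself.

```python
def final_grade(homework_pct: float, quiz_pct: float) -> float:
    """Combine scores with the weights described above:
    one-third homework (effort), two-thirds quizzes (correctness)."""
    return homework_pct / 3 + 2 * quiz_pct / 3

# A student who completes all homework (perhaps leaning on AI)
# but scores 70% on quizzes still lands at 80% overall:
print(final_grade(100.0, 70.0))  # → 80.0 (up to float rounding)
```

The asymmetry in the weights is the point: full homework credit alone cannot carry a student past the quizzes, which is what makes uncritical AI use a losing strategy.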

5. Transparency regarding your own planned usage of AI detection tools and how that information will be used.

I make it clear to my students that I use AI extensively. For instance, if I struggle with drafting a paragraph, I use AI to refine my sentence structure and word choice. In class, I employ AI to generate "clicker-type" questions on the spot or to create engaging examples. Last semester, my students particularly enjoyed examples involving the velocity and acceleration of Aang from the animated series "Avatar: The Last Airbender," as opposed to the typical stone-thrown-on-the-moon scenarios.

However, when it comes to communication, I am transparent that I use AI solely as a spell checker in emails and other direct communications. I never allow AI to generate entire emails to students on my behalf. After all, we need to keep humans in the loop.

6. Clear rules about students’ ultimate accountability for work.

At the very least, if students choose to use AI, they need to be responsible for the final product. It's essential that they understand not just how to use AI, but also its limitations and potential biases. Students must critically evaluate the information generated by AI, recognizing that it can perpetuate existing biases or simply make things up. They need to be prepared to stand behind their work, ensuring that they can defend and justify their submissions as their own reasoned conclusions, not just the output of an AI tool. This accountability fosters a deeper level of engagement and understanding, ensuring that the use of AI enhances rather than diminishes their educational experience.

As we head into this semester, it is essential to foster open dialogues with our students about the role and implications of AI in education. By establishing clear guidelines and ensuring students understand both the potential and the limitations of AI, we can better prepare them for a future where technology and human ingenuity intersect. While AI offers significant opportunities to enhance learning, our approach must be thoughtful and deliberate, ensuring it serves as a tool that complements the educational journey. Embracing AI with cautious optimism, we can look forward to a future where technology not only aids learning but also empowers our students to thrive in an increasingly digital world.


Lew Ludwig is a professor of mathematics and the Director of the Center for Learning and Teaching at Denison University. An active member of the MAA, he recently served on the project team for the MAA Instructional Practices Guide and was the creator and senior editor of the MAA’s former Teaching Tidbits blog.