The Temperature of the Room

By Lew Ludwig

So, have you had that conversation with your students yet? You know, the one where you discuss your AI policy and explain the reasons behind it? It's a chat more and more educators are having. Gone are the days when banning AI outright seemed feasible; crafting "AI-proof" assignments now feels like a frustrating game of whack-a-mole. Moreover, our students are destined to navigate an AI-saturated landscape—they need our guidance.

Caveat One: You get what you pay for.

Consider this: a colleague recently boasted that ChatGPT 3.5 stumbled on their abstract algebra assignments. Sure, but 3.5 is the free, entry-level model. It's crucial to recognize that ChatGPT 4o, at $20 a month, is significantly more adept, especially at mathematical reasoning. If you're testing your assignments, make sure you're using the most capable tools. And consider your prompting skills: do they extend beyond simply pasting a question into the AI interface? Skilled prompters carry on a dialogue with the AI, asking leading questions that build it up to tackle harder problems (like my student did in this post).

Caveat Two: Detection software is fraught with problems.

Numerous studies have highlighted the unreliability of AI detectors, which often flag false positives, potentially damaging the trust between educators and students. Even seasoned linguists have a hard time distinguishing AI-generated content from human writing. Moreover, a clever prompter can dumb down AI-assisted writing with a request like “write this at the level of someone new to proof-writing,” and the AI will happily comply.

Getting the conversation started.

So, where do we go from here? Let's circle back to that important conversation with your students. As mathematicians, we may joke about being introverts who'd rather talk to our shoes than to someone else’s face, but this conversation is critical. I suggest starting by assessing the room's temperature with an AI acceptance scale. While some good generic scales exist, I like to use a course-specific model that you can adapt from Ryan Watkins' work.

Suppose you assign a common related rates problem involving a 15-foot ladder leaning against a wall. Here's how you might structure the scale, starting from basic concepts and progressing to a fully solved problem, using AI as a teaching and learning aid. How much AI involvement would you permit your students? (A short worked sketch of the underlying calculus follows the list.)

  1. Concept Clarification: Defines related rates and their real-world applications.

  2. Problem Understanding: Clarifies the setup and movement dynamics (ladder against wall).

  3. Identifying Relevant Equations: Identifies the Pythagorean theorem as key to solving.

  4. Setting up Derivatives: Guides in setting up the necessary derivatives using implicit differentiation.

  5. Symbolic Differentiation: Assists with the required differentiation steps.

  6. Equation Solving Guidance: Helps solve for \(\frac{dy}{dt}\) given \(\frac{dx}{dt} = -\frac{1}{4}\) ft/sec.

  7. Analyzing Results: Interprets the rate at which the ladder’s top moves.

  8. Error Checking: Checks calculations for typical errors.

  9. Drafting Problem Solutions: Aids in drafting a complete solution from start to finish.

  10. Complete Problem Solving: Provides the final rate at which the top of the ladder moves up after 12 seconds.
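
For reference, here is a minimal worked sketch of the calculus behind levels 3 through 6, assuming \(x\) denotes the distance from the base of the ladder to the wall and \(y\) the height of the ladder's top, both in feet:

\[
x^2 + y^2 = 15^2
\quad\Longrightarrow\quad
2x\,\frac{dx}{dt} + 2y\,\frac{dy}{dt} = 0
\quad\Longrightarrow\quad
\frac{dy}{dt} = -\frac{x}{y}\,\frac{dx}{dt}.
\]

With \(\frac{dx}{dt} = -\frac{1}{4}\) ft/sec, the right-hand side is positive, so the top of the ladder moves up the wall; the numerical rate at any given moment depends on where the base sits at that time.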

Once you’ve landed on a range (1-3, 1-4, etc.), have your students do the same with the list, ideally anonymously. This exercise not only provides a baseline for further discussion but also helps students articulate their comfort levels with AI assistance. Don’t like my example? Feed this scale into an AI and ask it to generate a different one based on your own homework exercises.

But what prompts should students use?

If you are newer to AI prompting and wonder what prompts students would need at each level of your scale, ask the AI to generate a list of prompts. Here is an example prompt: “Given this scale, what prompt would a student use at each level?” And here is the AI’s response for item 1, Concept Clarification:

"Can you explain related rates and how they apply to real-world problems like moving objects?"

Have an honest dialogue with your students.

As we probe the evolving landscape of AI in education, it's clear that open dialogues with our students about AI policies are more crucial than ever. By understanding both AI's capabilities and pitfalls, we can better prepare our students for the realities they will face. I encourage you to initiate these important conversations to gauge and guide your students' understanding and use of AI. Engage with them to understand their comfort levels and help them develop the critical thinking skills needed to use AI responsibly. Why not start today? Ask your students where they stand, involve them in creating meaningful, AI-informed solutions, and navigate this new, unexpected educational challenge of generative AI.


Author’s Note

Frequent readers may have noticed a shift in the column since the spring semester. I am sharing less about my experiences in the classroom and more about resources or strategies you can use in your classroom. With this in mind, I will end each column with three important resources I hope you find useful. Please feel free to share your favorite resources via MAA Connect.

Three useful things:

1) Co-Intelligence: Living and Working with AI by Ethan Mollick (April 2024).

If I could recommend only one book on AI, this would be it. It provides a terrific overview of the development, applications, and potential of generative AI. After I recommended it to my college president, he purchased copies for his entire senior staff.

2) A Student Guide to Navigating College in the AI Era from Elon University and the AAC&U (August 2024).

Though written for students, this guide serves as an excellent resource for faculty and administrators, offering practical advice on using AI responsibly while enhancing learning. It comes in an easy-to-read infographic format.

3) What Does It Really Mean to Learn? (August 2024).

Are you concerned that generative AI will “take over”? This insightful piece explores the concept of “educability”—a uniquely human trait—and provides hope for our role in this new AI-shaped terrain. This piece will be of particular interest to those in the tradition of the liberal arts.


Lew Ludwig is a professor of mathematics and the Director of the Center for Learning and Teaching at Denison University. An active member of the MAA, he recently served on the project team for the MAA Instructional Practices Guide and was the creator and senior editor of the MAA’s former Teaching Tidbits blog.