The Eastern Echo Saturday, March 21, 2026 | Print Archive

Eastern Michigan University professors weigh in on AI in classrooms

As generative artificial intelligence becomes more pervasive in academia, professors and faculty at Eastern Michigan University have different ways of navigating the new technology. EMU does not have a universal policy on AI usage in the classroom; it is up to the professors to decide what is best for their class.

Humanities disciplines

Laura McMahon, a professor of history and philosophy at EMU, finds AI use is more common among undergraduates than graduate students. AI tends to show up in essays and discussion posts. McMahon said the key tells were well-written yet generic answers, topics that weren’t brought up in class, and ideas a student in a 200-level class wouldn’t know yet.

“It’s kind of like how someone writes an encyclopedia,” McMahon said.

McMahon’s class policy is that AI should not be used in the class. Since AI usage cannot always be proven, McMahon designs the curriculum so that relying on AI makes the class harder, not easier. Grading requires students to demonstrate that they have been in class by drawing on specific points from lectures and readings. The homework prepares students for the exams.

“(The classroom structure) penalizes students for using AI without having to catch them,” McMahon said.

Sometimes when the cheating is blatantly obvious, McMahon will confront the student; occasionally the student will admit it and apologize. For those students, it was a one-time thing, a last resort because they ran out of time. In those cases, McMahon said, the student self-corrects and doesn’t do it again. However, McMahon said most students who use AI are doing so constantly, just trying to pass the class.

“There’s always students who just want to get through the class with a passing grade,” McMahon said.

Professor Steven Krause has a different approach to AI usage. Krause talks about AI in freshman composition classes, and the basic policy on usage is that simply copying and pasting from a generative AI program is cheating. AI can’t be an author; it has no presence or responsibility in the world, Krause said. Beyond that, Krause said it can be a good tool for brainstorming, researching and proofreading.

Krause doesn’t see cheating as much as his colleagues. This is partly because of the structure of the class, Krause said; there are no one-and-done essays and no quizzes or tests. It is also partly because he’s upfront about where AI works and where it doesn’t.

“The first thing I do to prevent it is talk about it,” Krause said.

However, cheating does still happen at times, more often in the freshman classes than the graduate classes, Krause said. When that does happen, Krause’s process is to have the student redo the assignment; if it happens a second time, the student fails.

One key to spotting AI usage is style; after reading about a thousand pages of student writing every year for 36 years, Krause knows what student writing sounds like, he said. AI also uses words that most people don’t, such as “delve,” Krause said. The essays Krause assigns are done in pieces so that the writing process can be seen. If a big block of text appears out of nowhere, it was probably copied and pasted from somewhere else, Krause said.

With all the advancements to the technology, Krause said STEM-type careers are going to be heavily impacted. AI will be a tool for logistics, shipping, architecture, finance and business fields, but it will not be as big in the humanities fields, Krause said. Humanities are going to become more important because AI doesn’t have the context to do what humans can do, Krause said.

Krause said this is a difficult time for students to navigate all the different rules and policies, as well as for educators. It’s going to take years to figure out what is and isn’t acceptable, Krause said.

Social sciences disciplines

Jeffrey Bernstein, a professor of political science, said it’s important that students proofread any work they generate with AI. Once a student puts their name on a paper, they take responsibility for any mistakes AI might make. Bernstein said students in his classes are encouraged to use AI as a tool. Although some assignments may explicitly prohibit AI, in those cases Bernstein explains why it should not be used. Bernstein said students aren’t looking to cheat; they’re looking for guidance. Students want to deliver their best work, so they’re going to gravitate toward a helping hand. It’s the professor’s job to say when not to use the tool.

“We, as faculty, have the space to learn how to support students,” Bernstein said.

Bernstein is also the director of the Faculty Development Center; the center works with professors to customize their AI policies. The rules and consequences of breaking the rules are up to the professor’s discretion, Bernstein said.

“We give a lot of space to faculty to make their own decisions,” Bernstein said.

Nick Romerhausen, a communications professor, knew as soon as ChatGPT came out in 2022 that it would have a big impact, he said. Romerhausen’s class AI policy is that nothing presented as a student’s original work can be AI-generated. In his classes, there are discussions and open communication about AI’s ethical use. Romerhausen said there have been no issues with students trying to pass off AI as their original work.

“I work it through with my students individually,” Romerhausen said.

Romerhausen is no stranger to the conversation around AI. He teaches a communications and AI class. First taught in winter of 2024, the class aims to help students understand how AI impacts communication, Romerhausen said. The class covers the potential utility and drawbacks of robots, chatbots and generative AI. Romerhausen said learning about AI is a conundrum; in order to teach people about it, they have to use it, and every piece of information AI is fed makes it smarter.

Romerhausen has also done research projects on AI. After feeding AI over 1,000 prompts and asking it to do very specific things, Romerhausen said it’s quite capable; one possible setback is that it’s not great at citing sources, although it is getting better.

STEM disciplines

Ourania Spantidi, a computer science professor, used to have an explicit no-AI policy. About a year ago, Spantidi shifted gears to encourage the use of AI with disclaimers and reflection. It’s not about fighting AI, but watching it unfold and incorporating it into the learning, Spantidi said. Spantidi said the technology is getting more helpful, more efficient and advanced; educators are just trying to keep up.

“We are learning alongside students,” Spantidi said.

Professor Andrew Ross created an AI in STEM fields course. The class aims to prepare students for working in STEM fields and using AI as a tool.

“I’m hoping the students will learn the AI skills they will be expected to have in the workplace,” Ross said.

The students will learn about both predictive and generative AI.

Arts disciplines

Ryan Molloy, a graphic design professor, has an AI policy that describes the technology as another tool for designers to use. Molloy said he encourages students to use AI and talk about how they used it and what they used it for.

Molloy said there are pros and cons to AI usage. The pros are that it is efficient and works much faster than people. The cons are the possible environmental impacts and the loss of the act of creation.

How students and professors can get involved

The Office of Campus and Community Writing works with students and faculty to discuss how to handle AI in classrooms. Ann Blakeslee, the director of the Office of Campus and Community Writing, said AI can be used, but there are a lot of concerns.

“We can’t ignore it; we can’t make headway like that,” Blakeslee said.

The ways to integrate AI into academia depend on the context, Blakeslee said. The university shouldn’t have a single blanket policy, but rather a statement about AI’s role.

The key to understanding the technology, Blakeslee said, is to understand how it’s used in specific professions, how to use it ethically and what its limitations are. One useful technique is the generative AI sandwich: start with a human-created outline or draft, then consult AI, then evaluate, assess and tweak the output, Blakeslee said.

“I think it will be a more transformative technology,” Blakeslee said about whether AI will replace jobs.

Currently, the office is hosting meetings for professors to talk about concerns and how to address AI in classroom policies. To reach more educators, the university is going to host a generative AI summit in the spring. Students will be able to attend for free. There will be presenters speaking about AI concerns and how it’s changing the academic world, Blakeslee said.