Professors Find AI Most Useful for Lesson Plans, Discussions

Now quite acquainted with generative AI tools, educators at several U.S. universities have found them most helpful for guiding class discussions, fleshing out lesson plans and teaching about AI as an emerging technology.

As generative artificial intelligence (GAI) programs have continued to advance in the year since the first public launch of ChatGPT, college professors across the U.S. are finding them particularly useful for lesson planning, guiding classroom discussions, and demonstrating the technology's uses and limitations.

According to a recent news release from Purdue University, Stuart Collins, a humanities and social sciences professor at Purdue Global, has been using generative AI tools since December 2022 to prompt classroom discussions in his American government and civics courses, as well as in his research. In one recent example, Collins, also a member of Purdue Global's AI Task Force, used ChatGPT to highlight historical information about the U.S. debt ceiling amid 2023's debt ceiling negotiations.

“I realized that 80-90 percent of the assignments I had created for my government and civics courses could now be readily answered by AI,” Collins said in a public statement. “That realization presented quite the problem. We are now thinking about how we can make our courses both AI-amplified and AI-resilient.”

According to the news release, a recent Purdue Global survey of about 400 students and 100 faculty members found that less than 25 percent of students planned to use AI for their work, and those students expressed fears of being accused of plagiarism if they did. However, nearly 75 percent of faculty believed students would use AI to complete their schoolwork. The report said educators need to find ways to leverage AI for lesson planning, as well as to teach students how to make the most of generative AI tools, both to avoid academic dishonesty and to build AI literacy.

When using AI tools like ChatGPT in classroom discussions and lessons, Collins said it’s important to remind students to double-check and verify information from generative AI. He said using generative AI in lessons serves two major purposes: prompting discussions on class topics and information literacy, and teaching students how to make use of AI tools responsibly and ethically as part of their studies.

“We can go, ‘Why did ChatGPT give us this answer?' At this point, even though my students are just taking a required course in American government as one of their core requirements, they are now also becoming data analysts,” he said. “AI can help summarize, explain and discern this content. And then, with that data, you can then have a more robust conversation.”

Kyle Korman, assistant computer science professor at Dakota State University, said that while he doesn’t heavily rely on tools like ChatGPT for his lessons, he has used ChatGPT for discussions and to teach students about the importance of fact-checking.

“I think [GAI tools] can be really handy on the educator side of things for multiple reasons, like lesson planning ... It's a good way to expand horizons," he said. “I've also recommended students to ask ChatGPT questions … Then obviously, probably follow up and confirm with other sources, because ChatGPT and other AI systems can make some things up.”

According to Lisa Blue, an instructional specialist at Eastern Kentucky University (EKU), professors there are slowly making more use of tools like ChatGPT and Bard, Google’s conversational generative AI chatbot, for similar functions. Blue said she and other educators have also made limited use of more instructor-specific AI programs such as Gamma for creating presentations and guiding lessons. In addition, she said, EKU is looking to develop its own chatbot for simple student questions, similar to Georgia State University’s, which she noted helped to increase student retention there.

“There’s a lot of interest on our campus in developing an EKU-specific chatbot to answer student questions that is driven by AI,” she said, noting that these chatbots have canned responses and are trained on more limited data sets than tools like ChatGPT.

She said that while some faculty members have embraced AI’s potential in the classroom, more efforts are needed to train students and faculty on how to use it effectively. For the most part, she said, tools like ChatGPT are useful when instructors are brainstorming for lesson ideas.

“My hopes are that, as we educate the faculty on what generative AI is and what the capabilities are, that they can use it to streamline their administrative workload but also use it to refresh their lessons, their curriculum and their planning. During our recent generative AI pre-semester symposium, I demonstrated for instructors how you could [use GAI tools] to write the learning objectives for a General Chemistry I course," she said. "The GAI spits out a whole list of learning objectives and then you can actually look at those objectives. For me, it spit it out for an entire two-semester general chemistry course.”

At Oregon State University, administrators recently established the Higher Education AI Task Force (HEAT) to provide more oversight and guidance on the use of AI in higher education as it applies to research and teaching, according to Regan Gurung, an associate vice provost and executive director of the Center for Teaching and Learning and professor of psychological science who chairs the new task force.

In his own general psychology courses, Gurung said he’s given students instructions on how to use generative AI tools like ChatGPT on some assignments if they choose to do so, adding that students must note when they’ve used it and verify information for accuracy.

“If you choose to use it, it has to be clearly acknowledged in your paper, and you have to make sure that you are editing it,” he said. “It’s basically acknowledging the fact that there’s this tool out there that may help you get started, but remember to consciously work with it to do what you’re doing.”

Like Collins and Korman, Gurung and Blue both stressed the importance of reminding students to fact-check answers from tools like ChatGPT. Blue noted that recent research from Stanford University and the University of California, Berkeley, suggests that answers from GPT-3.5 and GPT-4 are sometimes less than reliable in subjects like physics and math.

“When you really push either one of those GAIs to answer questions in mathematics or physics, in particular, the accuracy of responses has gone down over the last four months,” she said.

Despite the current limits of GAI tools, Collins said he believes they will play an increasingly important role in higher-ed curriculum and lesson planning, classroom instruction and research. He added that as AI advances, its applications across industries will become more ubiquitous, making AI literacy a must for students across disciplines.

“If you're not using AI in your professional workspace, you're behind the curve,” he said.

Editor's Note: A previous version of this story identified Stuart Collins as chair of Purdue Global's AI Task Force. He is a member, but he is not the chair.
Brandon Paykamian is a former staff writer for the Center for Digital Education.