Educators Roundtable: Demystify AI With Transparency and Practice

In a virtual panel discussion Friday, several professors shared their experiences with having students use generative AI for writing assignments and recommended that students be allowed to learn by trial and error.

Although artificial intelligence brings an element of uncertainty to classrooms, some university educators are challenging their peers to “live in that gray area” and bring their students along under the guiding lights of transparency, autonomy and open conversations.

“Transparency is the solution,” said Shelley Rodrigo, an associate professor at the University of Arizona. “Talk with the students. Instead of just mandating yes or no, ask them why they think it should be allowed.”

Rodrigo was one of five higher education professionals who spoke Friday during a virtual panel discussion, “In the Weeds: An Educators’ Roundtable on AI,” sponsored by ed-tech company PowerNotes. PowerNotes makes digital tools that help students and educators use generative AI responsibly when writing papers. Ahead of the discussion, audience members asked for guidance on allowing students to use ChatGPT and other AI tools that assist with assignments.

Jason Gulya, a professor at Berkeley College and chair of the school’s AI council, said that instead of assigning his students to write papers in the traditional manner, he assigns them “mega prompts” on complicated topics for use with ChatGPT. He also requires them to share their ongoing dialogues with the tool.

“They create another person who is going to poke holes in their argument,” Gulya said. “This way, I learn so much more about how my students think.”

Catrina Mitchum, an adjunct professor at the University of Maryland Global Campus, pointed out that generative AI tools are difficult, if not impossible, to incorporate into pre-designed online courses.

“They [learning management systems] just are not set up for it,” she said.

Lance Cummings, an associate professor at the University of North Carolina Wilmington, said a great way to incorporate AI into teaching writing is to assign a large volume of essays in which the use of AI is allowed, creating more opportunities to learn by trial and error. As he sees it, plagiarism applies only when a student’s intent is to deceive, and even those occurrences are teaching moments that should not derail a student’s academic career. He offers this approach as an alternative to limiting course requirements to one or two heavily weighted term papers per semester.

“Moments where they can explore and learn that it’s OK to fail — we need more of those moments in the classroom,” he said.

Laura Dumin, a professor at the University of Central Oklahoma, echoed Cummings’ sentiments, noting that generative AI can be a partner in the quest to inspire a love of lifelong learning. She allows the use of ChatGPT and has witnessed only one incident of cheating or plagiarism in the past nine months. Still, she cautions, in the spirit of “process over product,” educators must be flexible and allow students the autonomy to choose among many different generative AI tools.

And everyone in the higher education community, Dumin added, must understand that, while some college majors may disappear in the years to come, knowing how to partner with AI will be a valuable skill across all academic disciplines and vocations.

“Being able to structure your thoughts for a machine is going to be important for all fields,” she said.

Aaron Gifford is a former staff writer for the Center for Digital Education.