
Purdue Leaves Generative AI Guidelines Up to Professors

The university's policy for spring 2024 is that instructors will use their own discretion and explain to students what constitutes unauthorized use of generative AI tools such as ChatGPT for coursework.

When it comes to defining how students can use generative AI tools like ChatGPT, Purdue University is leaving it up to individual professors — at least for now.

According to recent guidelines from the university, finalized for the spring 2024 semester by the university’s Office of the Vice Provost for Teaching and Learning, faculty will be able to choose whether their courses make use of generative AI tools, as well as the extent to which students are allowed to use generative AI within their coursework. The guidelines say that while instructors are not required to use GenAI tools, their course policies must clearly state what they consider authorized or unauthorized use of AI, how instructional teams may attempt to detect the use of AI, and the consequences students will face for violating these policies.

According to Edward Berger, interim head of the Purdue School of Engineering Education and associate vice provost for learning innovation, these guidelines were the result of input from students, faculty, university senate committees and conversations on campus about AI, and the school will issue updates as the technology continues to change. He said that while concerns about students using AI tools to cheat on assignments still linger among some faculty, more and more are growing comfortable with the technology as it matures and as they use generative AI for generating course content, grading and other functions.

“I think as time goes by, people are becoming more comfortable with the ability to manage that situation,” Berger said. “At first it was a very scary situation … I think over time people have come to some better view of the academic integrity [concerns], but because Purdue is also a research institution, there’s lots of people who, in their research, use AI for various things, and that has helped them become acculturated to what it looks like to use AI effectively, whether it’s generative AI or some other kind of tool.”

According to the guidelines, faculty should use AI plagiarism detection tools with caution: tools such as Turnitin have high false-positive rates, and their results should be treated with “extreme distrust” pending further review of whether students actually violated policies around AI plagiarism.

Purdue is the latest of many universities to let instructors determine whether, and to what extent, the use of GenAI tools is permissible, and to encourage students and faculty to provide input on policies expected to evolve as the technology changes. With AI ed-tech regulation still left to individual institutions amid ongoing national debate about who should regulate AI, schools such as Syracuse University and the University of Texas at Austin have issued similar guidelines for faculty on the use of tools like ChatGPT, allowing some discretion over whether to experiment with such tools in their classrooms and courses. In addition, universities like the University of Kentucky have formed committees to address the evolving question of how to regulate the use of AI for educational purposes.

According to a Purdue news release focused on guidelines for this spring, guidance is expected to change each semester as AI tools continue to advance. It added that the University Senate will also work with Vice Provost Jenna Rickus to assess the need for future updates to university policy.

In addition, the news release noted, university officials collaborated to fund work led by Lindsay Hamm, a teaching assistant professor in Purdue’s Department of Sociology and an AI innovation fellow at Purdue’s Innovation Hub. Through that work, the university hopes to expand a network of staff across disciplines who are integrating AI tools into their teaching, in order to better understand how the technology will change how students learn.

Hamm said the Innovation Hub also devoted funds to AI in teaching and learning grants, which will allow faculty to experiment more with how to use AI. She noted that faculty have been meeting regularly to discuss how to approach the use of GenAI in higher education.

“As soon as OpenAI released ChatGPT in November of 2022, we knew this was going to be huge, and we immediately pivoted to 'How we can help professors understand what's happening with generative AI?' … All of these meetings for the last three semesters have been about generative AI, what it is, how your students perceive it, how they might be using it,” she said. “That’s where we really came together and said that we want to give a comprehensive but pretty simple and easy-to-digest set of guidelines.”

Brandon Paykamian is a former staff writer for the Center for Digital Education.