That was the position of Douglas Kiang, a computer science teacher from Menlo School in California, in a talk Tuesday at the International Society for Technology in Education’s annual conference in Denver.
As an example of how teachers could use AI themselves, Kiang had ChatGPT devise a lesson plan for teaching students about irony using a famous song by Alanis Morissette. ChatGPT came up with objectives, materials and activities, all of which Kiang found useful. But it didn't define the various types of irony, which he had to do himself, and the quiz it generated contained errors. He said this reinforced the point that teachers must closely inspect an AI's work before bringing it to class.
Further, Kiang said "computer science is changing," and the versatility of new tools is putting a greater onus on users, meaning teachers and students in the case of AI in the classroom, to know how to wield them effectively.
“It used to be that we would design an algorithm or a rule set to help the computer differentiate these two things, but now we might configure a neural network that learns on the fly,” he said. “What’s the missing piece in AI? It’s judgment. Because sometimes it makes valid connections, but we don’t necessarily even know if those are the right connections or even the appropriate connections.”
Kiang said that to craft an effective prompt, students must be specific and think carefully about what they want. He likened the process to writing "mini assignments" for AIs the way teachers would craft assignments for students, wording them to avoid misinterpretation. He said students don't have to get it right the first time, as the process can be a dialogue, but specificity is key.
In cases of errors or undesired responses, Kiang recommended having students explain why ChatGPT’s answer was wrong.
“Students still tend to ask questions of ChatGPT the way that they would query Google, and that’s not the most effective way,” he said. “They need to be trained and given guidance on how to ask the right leading questions to get the information that they want.”
Explaining AI's use in terms of content versus skills, Kiang said AI is great at providing the former, but students need the latter in order to apply it. He drew an analogy to the difference between cooks, who know recipes (essentially "content"), and chefs, who have the skills to create recipes of their own.
“I’m trying to train cooks, but for the purpose of creating chefs,” he said. “I think that’s another way that we can think about changing kids’ relationship with AI with assignments, is by giving them a purpose that’s larger than themselves.”
In other words, he said, give students an assignment they care about and they won’t trust ChatGPT to do it for them.
Once students understand their own role as well as AI’s, he said, the technology has the potential to reach its highest purpose.
“What if everybody had [a personal tutor], but it was an AI? Its training data is essentially all the work that you’ve done, and it knows where kids get tripped up, and it potentially can suggest things,” he said. “Even if it never interacts with you as the teacher, it can suggest things to the student, or suggest or explain things in a way that it knows the student will understand. That, I think, makes me hopeful.”