
Opinion: Teach Human Intelligence Before Artificial Intelligence

Before students use AI tools to complete their work, they should first develop their own HI (human intelligence) and understand the purpose of education and the importance of ethical behavior and personal integrity.

[Image: outlines of a robotic head and a human head, back-to-back, white on black. Shutterstock]
Artificial intelligence is exciting and scary at the same time. It is transforming our culture in many ways, predicted and unpredicted. Its impact on educators falls into three categories:

  • Use, both ethical and unethical, by students in our schools and their lives
  • Use by educators and administrators within schools
  • Discussions with students on current and future policies about the use of AI

My focus in this piece is student use of AI as a substitute for their own work and educational progress. My thesis is that before they use many AI tools to complete their work, they must first develop the HI (human intelligence) applicable to the topic and field of study. Along the way, we need to discuss ethical use of AI, because cheating has become easier, faster and less detectable. HI before AI.

I will limit my focus to AI grammar and writing tools as a subject-matter example. There’s no question that students need to learn how to communicate and get their point across in writing. As part of this, they must

  • show they can write clearly with the correct audience in mind,
  • be able to interest their reader,
  • get their point across with proper support and evidence, and
  • show their ability to develop a line of reasoning.

To achieve this, they must first understand and demonstrate use of correct grammar. When they can show mastery of proper syntax, grammar, punctuation, spelling, etc., we can discuss the legitimate and appropriate use of AI writing tools as an aid, similar to asking a friend or co-worker to read their work and check it for errors.

With those skills firmly grasped and demonstrated at the appropriate level, students have the knowledge to evaluate the grammar-checking tool’s advice, which has been shown to be imperfect, as with all forms of advice, human and AI-based. The problem is that grammar tools can now make general writing suggestions, and even do entire school assignments that can pass as original work, in seconds.

Misuse of AI tools deprives students of the fundamental goal of education: the ability to think for themselves! Without the overarching goal of learning to think for themselves and develop their own points of view, schoolwork feels worthless and meaningless to students, and that feeling can lead to various forms of cheating. While cheating is not new, these new AI tools make it quicker to do and harder to detect. When we ask students to write, we want them to do their research and fact-checking before writing. We want them to use a variety of resources, to look for bias, and to detect unreliable sources of information. But getting students to do this is getting harder and harder.

Students could argue that because AI is already creating much of the content we read, see and hear (and it will only get better), they won’t need these skills — that the transformation of business, government, policy discussions and more has made them obsolete. They could also argue that it is easier and faster to have AI do their research and writing, because they have better things to do and won’t need those skills later in life. Most educators disagree.

The lesson before the lesson — what students need to know before learning to use AI — is the purpose of education, and the importance of ethical behavior and personal integrity. AI tools can be enticing to students who don’t understand the true purpose of education and thinking for themselves. Just because AI tools can write an essay or report in seconds doesn’t mean they should be used to do that.

For educators, this is a wonderful opportunity to have discussions about ethics and the purpose of education. This is a hot topic, because many students are often just “doing school,” as discussed in Stanford professor Denise Pope’s book, Doing School: How We Are Creating a Generation of Stressed-Out, Materialistic, and Miseducated Students.

If students don’t understand the proper purpose of education, punishment for misusing AI becomes the only means of getting them to comply with standards for ethical use. That is not a good game to play in schools, and it carries many negative consequences. Readers can consult their own experience to see how it might play out for many students.

Cheating is currently the norm for many students. Years before generative AI, a 2018 article in Edutopia noted that cases and data “confirm that academic dishonesty is rampant”:

“A 2012 Josephson Institute’s Center for Youth Ethics report revealed that more than half of high school students admitted to cheating on a test, while 74 percent reported copying their friends’ homework. And a survey of 70,000 high school students across the United States between 2002 and 2015 found that 58 percent had plagiarized papers, while 95 percent admitted to cheating in some capacity.”

Space does not permit a full discussion of this very important and very broad topic, but we know that students cheat when they see no purpose for the work they are being asked to do.

The existence of AI tools creates many opportunities for teachable moments and discussions of honesty and integrity. Learning to do research leads directly to discussion of media literacy and bias. It is so easy for students to cut and paste and then rewrite the work of others instead of expressing their thoughts. For the first time in history, students can access many primary sources of information but are only likely to do so if they see the purpose of putting in the time and effort, and then thinking for themselves about what they read.

Another important element is that students need meaningful assignments, not time-wasters and busywork. Nothing can push a student into unethical behavior and the use of AI faster than make-work assignments that have no relationship to what they need to know and to learn, or to their world.

My thesis is being challenged daily. In a recent blog post, Bill Gates takes up this topic in discussing Khan Academy founder Sal Khan’s new book, Brave New Words. I have just started reading it, and it is already on my “highly recommended” list. I support Khan’s thesis that AI will soon produce many positive changes in all aspects of education. That doesn’t change how I feel teachers need to work with students today.

In the introduction, Khan expresses his concern that AI could be a crutch for students, “preventing them from developing their own research and writing skills.” He describes how AI could help improve the field of education in ways we can only imagine. Gates writes about using an AI tutor for thought-starters, getting feedback in seconds, and AI providing “detailed suggestions to refine your language and sharpen your points.”

On the matter of whether this is cheating, he says, “It’s a complicated question, and there’s no one-size-fits-all answer.” Khan and Gates agree that when used right, AI doesn’t work “for students but with them to move something forward that they might otherwise get stuck on.” But for many educators like me, using today’s AI tools correctly means using them ethically, and that is a hot discussion topic with students. We will have to take a good hard look at the next wave of tools.

I agree with Gates’ thought: “Mastery of AI won’t just be nice to have in a few years — for many professions, it’ll be necessary.” But just like babies who learn to crawl before they can walk, today’s students need to master basic skills before using AI tools to augment those skills. All of this may change when the new AI tools emerge with these concerns fully addressed. But for now, let’s ensure that students are learning to think for themselves and to communicate these thoughts on their own!
Mark Siegel is assistant head at Delphian School in Sheridan, Ore.