How is ChatGPT Transforming K-12 Education in Colorado?

Teachers are redesigning assignments, administrators are revisiting policies, and students are still finding their footing as they navigate the new frontier of yet another disruptive technology.

English teacher Amber Wilson explains an essay assignment for the final exam of the semester to 11th grade students at Thomas Jefferson High School in Denver on Thursday, May 2, 2024. (Hyoung Chang/The Denver Post/TNS)
(TNS) — Amber Wilson is rethinking how her classroom at Denver’s Thomas Jefferson High School will operate next year, in large part because of the artificial intelligence program ChatGPT.

Her English courses will feature more in-class assignments and less homework. Heeding the College Board’s Advanced Placement AI guidance, Wilson will ask students to defend their thinking orally. And the 24-year veteran teacher is considering using programs to monitor her students’ online whereabouts on school-issued devices.

Wilson admitted to sporting rose-colored glasses about ChatGPT — a chatbot that uses generative AI to create unique text responses to users’ prompts — when it debuted at the end of 2022.

“We can’t stick our heads in the sand about it,” she said in a recent interview. “It’s here. We’re going to have to figure out how we’re going to use it.”

A year and a half into ChatGPT’s life, the AI program’s impacts on education in Colorado and beyond are irrefutable. Conversations about artificial intelligence in the classroom have dominated conferences and training sessions for schools and universities across the state. Local and national technology and education experts are forming a new, statewide steering committee to explore the future of AI in Colorado education.

In this early Wild West era, decisions about ChatGPT’s usage in the classroom largely depend on who’s in front of the whiteboard and who’s in administrative leadership. That’s left educators and students alike still finding their footing as they navigate the new frontier of yet another disruptive technology feared by some and exalted by others.

Last year, Wilson told The Denver Post that she wanted to incorporate ChatGPT into lessons to teach students appropriate ways to use the free, accessible software, such as for brainstorming, researching and outlining purposes — a starting point from which they could add their own knowledge and creativity.

Denver Public Schools, however, has tried to limit access to ChatGPT in the classroom by blocking students from using the program with their district email addresses, said Billy Sayers, the district’s director of STEAM, or science, technology, engineering, arts and mathematics.

The school district does not want teachers using ChatGPT in educational instruction, either, due to student data-privacy concerns, he said. (The Denver Post and seven other newspapers last week sued ChatGPT maker OpenAI, alleging the company illegally harvested copyrighted articles to create the generative AI program.)

Colorado’s largest school district is looking into programs that would establish guardrails and address safety concerns around AI usage, Sayers said, acknowledging that students do need to be prepared to encounter AI in their post-graduation lives.

Despite those constraints, ChatGPT still loomed large in Wilson’s classroom this year.

Her students’ work often didn’t read like the high schoolers she knew, she said. AI plagiarism trackers confirmed her fears: Students were using ChatGPT to write their assignments. Soon, students grew wiser about covering their tracks, she said. The plagiarism was still apparent, but near-impossible to prove.

“I don’t have 20 to 30 minutes to spend on every piece of writing to determine if they’re thinking or not,” Wilson said. “Now I have to come up with new ways to work around this, and I like being innovative and we need to be as teachers. But it does feel exhausting.”

In light of generative AI — which draws from information gathered from the Internet to create text responses, images and sounds based on users’ requests — Wilson said it’s time teachers rethink their assignments altogether.

“If the question is so easily answered by AI, then maybe we allow the students to use AI to learn that,” she said, “and then figure out what product we want from them.”

TEACHING TEACHERS ABOUT GENERATIVE AI


Matthew Farber, an associate professor of educational technology at the University of Northern Colorado in Greeley, taught middle school in the early 2000s. He remembers the moral panic over the specter of Wikipedia and Google eroding education.

“I’ve always been an early adopter to bring these things in and integrate it into best practices in the classroom,” he said.

Now, Farber teaches teachers how to do the same. And he incorporates generative AI into his coursework. His students can use the technology — with guardrails. They shouldn’t be copying and pasting wholesale from ChatGPT, he said, but they can use the program for inspiration.

One of his assignments asks students to engineer ChatGPT prompts to have the program produce a children’s story.

A student might ask ChatGPT to write a bedtime story about a puppy going to Jupiter in the style of Ernest Hemingway, Farber said. Then, students analyze the stories and tweak their prompts to make the stories better — more inclusive, enlightening, immersive. Not only do students learn how to play with generative AI, but they learn what makes a good story, he said.

“It’s this idea of refining prompts so students become critical thinkers of what AI is giving you,” Farber said. “It’s kind of a newer version of how we taught Google search or using physical encyclopedias when I grew up.”

Not teaching these skills, or blocking the programs on school-issued devices, creates a technological equity problem, Farber said. Students who rely on school computers won’t develop the same skills as their peers who can go home and tinker.

“By not teaching how to use these technologies, you’re not preparing children for the jobs of the future,” Farber said. “That’s not to say everybody needs to be a prompt engineer, but it would be like graduating in 2010 and not understanding how to do a proper Google search.”

Farber sees innumerable benefits for students and educators from using generative AI in the classroom. ChatGPT can rework a lesson plan for a different grade level, Farber said, or assist English language learners in their academic pursuits.

Farber, like Wilson, believes generative AI is pushing educators to reconsider how they’ve always done things and lean into innovation.

“There are teachers in the educational technology communities who will say if the assignment is something that ChatGPT can do in 20 seconds, maybe it’s not a good assignment,” Farber said.

Diana Wagener, a language arts teacher at Battle Mountain High School in Eagle County, said when planning a lesson, she thinks about how students might use AI appropriately and inappropriately so she can guide them in what they’re allowed to do.

“We are working toward lessons that allow critical thinking to continue to happen while establishing which tasks can take advantage of the AI tools available — this is the new normal,” Wagener said. “I believe the future of writing will be individuals managing generative AI to help them produce clearer, more meaningful prose and help eliminate students getting stuck in the process of writing it down correctly.”

CHATGPT AND CHEATING


For educators to feel comfortable deploying newer learning methods, Farber said school leadership must offer meaningful professional development.

That’s where Molly Jameson, the interim director of the University of Northern Colorado’s Center for the Enhancement of Teaching and Learning, comes in.

Faculty members come to the center for one-on-one professional development, webinars and workshops to learn to become better educators. The Greeley institution has held six trainings around AI in the classroom this year and plans to make it a focus next year, too, Jameson said.

At UNC, instructors decide the rules around ChatGPT in their classrooms.

Jameson helps them craft instructions in their syllabi depending on whether they want to forbid all generative AI usage in their classrooms, encourage it with parameters or land somewhere in between.

She’s heartened by how UNC educators use generative AI — from having students use image-creating AI to envision a sustainable future to tasking AI to help students narrow down research paper ideas.

“People get worried and stressed out every time a new technology comes around,” Jameson said. “Not that that is invalid. Plagiarism has increased since the pandemic. My philosophy is, instead of fighting against it, let’s figure out how to effectively use it because that’s the direction the world is going.”

It’s difficult to demonstrate the impact ChatGPT has had on plagiarism in schools.

At UNC, 34 cases of academic dishonesty were reported in the 2021-2022 academic year. The next year, 17 cases were reported, and 37 had been reported this academic year as of April 19. Just under half of the academic misconduct incidents reported this year at UNC involved alleged misuse of AI or ChatGPT, said Deanna Herbert, a campus spokesperson.

At the University of Colorado Boulder, cheating and plagiarism infractions skyrocketed during the 2020-2021 school year, when students were sent home to learn remotely during the pandemic. During the 2019-2020 academic year, the university recorded 454 violations, but the first pandemic year brought 817.

“The increase in violations during the pandemic was a national trend among institutions of higher education,” said Nicole Mueksch, a CU Boulder spokesperson. “The numbers have since returned to pre-pandemic levels.”

In 2021-2022, violations at CU Boulder fell to 437, and in the 2022-2023 school year, when ChatGPT surfaced, they dropped even further to 293. CU does not track how many of these instances are AI-related.

“While this may seem like an interesting inverse correlation, CU Boulder has made a concerted effort to communicate and educate on expectations around the use of AI with faculty and students,” Mueksch said. “Faculty are encouraged to inform their students of the standards and expectations that are specific to their course; and all students enrolled in a class or classes are responsible for knowing and adhering to the Honor Code, which includes plagiarism with the use of paper writing services and technology.”

These numbers only reflect reported incidents.

One student at Metropolitan State University of Denver said they and every student they know use ChatGPT in some form.

The student, who spoke to The Post on condition of anonymity out of fear of academic retaliation, said the philosophy of teaching students responsible, resourceful ways to use AI hasn’t yet filtered into any of their courses. Their professors have denounced all ChatGPT usage, threatening repercussions like expulsion, they said.

But for this student, ChatGPT has been a helpful tool, one they said they used, for example, to better understand confusing texts from ancient philosophers. Even though they attend classes, do the readings, ask questions and visit office hours, sometimes a deadline is staring them in the face.

So they turn to generative AI for support when Aristotle becomes incomprehensible and ask ChatGPT to explain things in layperson’s terms.

“Girl, respectfully, these people are speaking in B.C. Klingon,” the MSU Denver student said.

The student said they wished some of their professors were open to exploring how ChatGPT could be used as a tool for learning, just like other once-feared technologies, including calculators, computers and the Internet.

That’s the route Vilja Hulden, a teaching associate professor in CU Boulder’s history department, has taken.

Instead of policing ChatGPT, Hulden said she hopes to drive home a desire to learn for the sake of learning. She has assigned work that allows students to tinker with the AI program and see what it’s capable of — and what its limitations are.

“The longer I’ve been at this job, the more jaded I’ve gotten at policing plagiarism,” Hulden said. “Nobody is going to die because history students cheated on their essay.”

©2024 MediaNews Group, Inc. Distributed by Tribune Content Agency, LLC.