
CITE23: How to Start an AI Task Force at Your School

As artificial intelligence ushers in a sea change that touches all aspects of education, schools can keep up by convening a council of stakeholders to discuss good ideas and get district-level buy-in.

SACRAMENTO — An emerging consensus among school technologists is that generative artificial intelligence (GenAI) is irrepressible, so the process of embracing it has to start somewhere. One approach that has made progress at La Cañada Unified School District (LCUSD) in California: forming a task force of stakeholders to deal with emerging technology.

Explaining her task force to a room of her peers on Thursday at the California IT in Education Conference in Sacramento, LCUSD’s Associate Superintendent of Technology Services Jamie Lewsadder said the idea was to have an open conversation about the district’s position and eventually workshop safe and ethical guidelines for using GenAI, what kids need to know about it and what the respective responsibilities should be for students, teachers and parents. To do this, she wanted to avail herself of other people’s thoughts and expertise — not just faculty but parents, students and community members.

“I put out a note to our families and said, ‘I’m looking for parents who want to be involved in an emerging tech council.’ I purposely did not name it ‘AI task force,’ I named it an emerging tech council, with that concept that it’s going to be changing, so whatever comes next, this group will be in place,” she said. “I’m telling all of my teams that there’s an asterisk by everything we do. Just be ready, nothing’s set in stone. Just due to the nature of the change, we have to just continue to be ready and hold onto the roller coaster … It’s appropriate to be skeptical, terrified and feeling the awe at the same time.”

Lewsadder said the response was massive and varied, including worried moms, a father who earned his Ph.D. in artificial intelligence 30 years ago, and even a ninth grader with autism. In addition to getting the ball rolling on answers to looming questions, she said the council helped get a hesitant district leadership on board with GenAI.

“The emerging tech council helped a lot, because one of the parents said — and I can quote because it’s the bumper sticker for our task force — ‘We’re all in. Let’s get started,’ and that was the takeaway,” she said. “Again, with safety, with guardrails, all those pieces, but we needed somebody on the outside, and that influence really impacted the room.”

Lewsadder cited special education as a department that, in some districts, might be a key entry point for discussion and a way to open minds about new technology.

“I know in our district, our special ed kids were the first ones to have iPads, and they were the kids that looked a little bit different because they were the only ones in the room with tech,” she said. “I worry about teachers removing tech to deal with AI, and now suddenly our special ed kids who don’t have tech written into their IEPs (individualized education programs) anymore are losing an accommodation, so that’s what I’ve been telling teachers, is, ‘You can’t go back to paper.’”

Lewsadder said she also created an AI Slack channel in her tech team’s workspace so anyone interested could post information, continue learning and invite teachers.

“In technology, we’ve learned that you don’t have to know all the answers, you have to know how to get them,” she said. “If we can get that message to our educators, they will feel more confident with technology tools, so that’s something else to think about.”

Several useful pieces of information emerged, including that many people lack general knowledge about data privacy with GenAI. Lewsadder said many students were shocked to learn that by entering their resumes into ChatGPT, they had given away that personal information. This prompted her to send emails to families and encourage regular communication about these issues among staff.

“It’s a quick message. I think you just do PA announcements, you send it to the parents, the teachers can say these things — it doesn’t have to be a revolutionary change, but getting the message out about the risk is really important for the kids,” she said.

Jennette Vanderpool, an education strategist with the ed-tech service provider CDW Education, said another reason to maintain open conversation among faculty is that they use different tools, and some have better content restrictions than others. For example, ChatGPT will give a user the recipe for a Molotov cocktail, while Merlyn Mind will not.

“We’re really waiting for Google and Microsoft to get on board with the educational safety component,” she said.

Vanderpool added that teachers need to know these things so they can set their own personal classroom rules accordingly — defining which tools they want to use, amending their syllabus language with basic guidelines, and teaching students APA and MLA citation rules for generative AI.

“We can’t tell teachers how to teach, we can’t tell them how to grade, but if we create policy around AI usage, then it can be on the teacher’s plate to define how specifically they want it written in their syllabi,” she said.

Paradoxically, for all its risks, another key hurdle in adopting GenAI is the tendency among students to view it through a negative lens, which won't serve them well as they enter a workforce that has integrated the technology.

“The kids don’t really see it as something they’re supposed to be doing, so they have this negative connotation, and I think that’s one thing we all really have to fight to break,” Lewsadder said.

Andrew Westrope is managing editor of the Center for Digital Education. Before that, he was a staff writer for Government Technology, and previously was a reporter and editor at community newspapers. He has a bachelor's degree in physiology from Michigan State University and lives in Northern California.