In education particularly, heavy competition in the ed-tech industry has yielded more new tools and data collection than government agencies have been able to oversee: an average of 2,591 ed-tech tools per district, according to a 2023 report from the software company Instructure. Meanwhile, under-resourced school districts have been scrambling to establish best practices for the massive task of vetting all these tools, a workload that didn't exist even 15 years ago.
Nelson, CITE’s director of resource and service programs, told a full room of school officials Tuesday at the Future of Education Technology conference in Orlando to make sure the burden doesn’t fall on one person.
“Who’s the sheriff in your district? It shouldn’t just be one person. It should be, ‘We created a process. Everybody’s the sheriff over their own little town, over their own little piece of this pie,’” she said. “That’s not how it’s working, and that’s the goal.”
In a two-hour session focused on best practices for evaluating ed-tech apps, Nelson said that before using any vendor's software, a school should have a rigorous vetting process that evaluates each tool for both educational effectiveness and safety.
To guide that process, she offered CITE’s 10-step rubric for creating an app-vetting team:
- Start by learning about data privacy laws, including those specific to one's own state.
- Educate stakeholders, including parents and families, the school board and community members.
- Build a “privacy team” of technical, business and education leaders.
- Create an inventory of tools being used in the district.
- Have the privacy team review privacy practices and find gaps in policies.
- Develop a Local Education Agency plan for privacy.
- Train educators and staff on that plan.
- Look at what other districts are using; the California Student Privacy Alliance database is one resource for finding examples.
- Contact vendors and have them sign privacy agreements. Download the National Data Privacy Agreement for each vendor, and piggyback off other districts' agreements with an Exhibit E addendum, a general offer of terms for using that tool that can be shared among school districts.
- Share privacy practices and agreements with families and other districts.
“[W]hen we sign an originating … contract, the first time we’re having the vendors sign, we say, ‘OK, and there’s this one page called Exhibit E, the general offer of terms, where you also agree [to allow] any district who is procuring your product and wants to sign onto this in our state ... to [do so], freely, without having future conversations if they agree to the very same terms in this contract,’” she said. “We got to a point where we won’t even work with a vendor if they say no. … A lot of time and money is going into this process, and if all of you [in the room] were my districts, each one of you don’t need to spend that time, money and energy.”
Erin Clancy, a CITE senior contract specialist, said the advent of artificial intelligence tools has complicated the negotiation process somewhat, as privacy laws do not specifically mention the technology. However, she said all ed-tech tools have to comply with the Family Educational Rights and Privacy Act (FERPA), and any vendor that tries to shy away from those obligations should not be trusted.
One way to stop vendors from using legal gray areas around AI to dodge data privacy requirements, Nelson said, is to write into the contract a definition of student data that includes anything a student inputs into an AI tool. Under FERPA, she said, parents and students would have ownership of that data.
“We talk to a lot of vendors who say, ‘AI is not covered in your agreement.’ Oh yes it is, because it’s student data,” she said.
Negotiating data privacy agreements with larger companies can be difficult, Nelson added, but CITE has negotiated agreements with Microsoft, Google and Adobe that are publicly available as models for other districts to use.