The need for colleges and universities to make privacy a focus of their data initiatives was a major topic of discussion Tuesday at the annual Educause conference in Chicago, at a session featuring Joe Gridley, chief data privacy officer for the University of Maryland; Phil Reiter, associate director of privacy at the University of Illinois at Urbana-Champaign; and Svetla Sytch, assistant director of privacy and IT policy at the University of Michigan. The panel, titled “From Theory to Practice: Weaving Privacy Considerations in Strategic Decisions,” focused largely on the privacy risks that have come with adopting new digital learning and data management tools, as well as with leveraging data in general.
According to the panelists, U.S. data privacy regulations and legislation generally leave much to be desired compared to those in the European Union, where, they noted, consumers have more protections for their personal data and more transparency about how apps and other tech tools log and use it. The panel also noted that today’s students, who are digital natives, seem to have grown accustomed to having little privacy in the digital sphere, underscoring the need for institutions to teach “digital-privacy literacy” and promote best practices for students as part of privacy-centered initiatives.
“We really need to start getting in the space of thinking about how we engage our students effectively and have those conversations [about staying safe online],” Reiter said, adding that it’s largely up to institutions themselves to formulate their own data privacy policies and protect sensitive student information like medical records, for instance.
Panelists also pointed to resources such as the recently revised Educause CPO primer, a report offering guidance on data privacy considerations, as potential frameworks for universities to use in crafting privacy policies. Among the report’s key points is a recommendation that institutions be transparent about how data might be used, so that students can make informed decisions about protecting their sensitive information. The report also recommends higher privacy standards and prioritizing students’ rights to privacy in general.
In addition, panelists discussed how the increased use of AI tools, both for instruction and by students, could present new data privacy concerns, given how those tools aggregate data for content generation. Sytch said that when it comes to data privacy and data use in general, the potential risks of mass adoption of generative AI tools like ChatGPT remain an elephant in the room at many schools.
“Early on, in the beginning of the year, we released guidelines around how to safely and appropriately use open-source artificial intelligence solutions like ChatGPT,” she said.