Privacy and Security Concerns Need Attention as IoT, Machine Learning Take Hold

Agencies need to consider when and how they keep public data secure and private, as vulnerable machine learning and Internet of Things devices come into wide use, state and local officials warn.

LOS ANGELES — Privacy concerns may make the mantle of big data weigh heavily on the shoulders of the public sector, but those underlying issues are ones agencies must confront, city, county and state officials told an audience of hundreds at the Los Angeles Digital Government Summit, which drew roughly 450 registrants.

In opening remarks, a lunchtime address and a panel discussion of positive case studies on Tuesday, Aug. 29, Los Angeles city and county executives and a privacy leader from Washington state all reminded their audiences that the future of data privacy starts now — but keeping all that data private and secure will likely be a long-running process.

Washington state Chief Privacy Officer Alex Alben compared today's data revolution to the emergence of last century's disruptive technology, the automobile, but told those attending his lunchtime examination of global privacy concerns that people are typically eager to protect privacy.

“And the next question is what’s privacy,” he said.

“Technology is always going to be ahead of the law, so does that mean we shouldn’t do anything? People say if we have laws about this, it’s going to create a lot of conflict and confusion … and you’re eventually going to hamstring innovation,” Alben said, noting he believes that given the magnitude of the threat already posed to residents’ and agencies’ data, it is “long since time” to do something about protecting people’s data.

And this, said Los Angeles CIO Ted Ross, is the context of the summit.

"No pressure. Because we do carry a very heavy burden, a very heavy responsibility on ourselves being information technology (IT) professionals," said Ross, one of Government Technology's Top 25 Doers, Dreamers & Drivers of 2017, during his opening remarks.

In Alben's presentation, which elicited more than a few chuckles from those present, he took viewers through a slideshow that was sobering and humorous — but precisely cataloged the state of privacy today.

Miniaturization, cloud storage, the use of cameras, machine learning, artificial intelligence (AI) and data analytics are all developments that raise privacy concerns agencies will face going forward, Alben said, pointing out it’s estimated that more than 5 million new devices are being connected to the Internet every day, contributing to a global online swell that could exceed 21 billion connected devices by 2020.

City, county and state officials alike must be mindful of the threat posed by hackers, hacktivists and criminal organizations, Alben said, recounting the devastation wrought by the Mirai botnet attack.

Yet even as some officials have steeled themselves against incursions from the outside, he said they have in many cases failed to extend privacy protections, long absent as an explicit right in American law, to data.

Challenges from Internet of Things (IoT) devices are among the biggest threats, Alben told the room, pointing out that while American law did eventually incorporate privacy in the late 19th century, it hasn’t kept pace with Samsung “smart” refrigerators and even the “flashlight” apps on many smartphones, which can have access to our contact lists.

Washington state has enacted a data privacy law that covers how the state will and won’t use its citizens’ data — but similar protections often don’t extend to the private sector’s labyrinthine privacy policies, which consumers understandably are loath to read.

Tracking through Google Maps, collecting biometric data, and questions of who can reasonably have the expectation of privacy will continue to resonate, Alben said.

“It’s leading to a lot of mistrust that carries out and carries over not only in the online corporate context, but in governance as well. If you don’t trust a lot of websites, why should you trust a government website?” Alben asked the room.

He advised those present to revisit their own agencies’ public records and privacy laws, to take special care with biometric data because of its literal human representation, and to loop in third parties that work for the public sector.

“Talk to your contractors about how they’re going to design some of these things, especially if they have to do with personal data,” Alben said.

Afternoon panelists in a discussion of “With Great Data Comes Great Responsibility” generally agreed.

Dave Wesolik, IT branch manager for the Los Angeles County Department of Internal Services, told audience members the county is moving into IoT — but is mindful of its potential to be subverted.

“They can now attack and they can hack a car-manufacturing robot in a factory. We need to make sure we’re providing secure communication and our data is secure for that,” Wesolik said.

Mohammed Al Rawi, CIO for the Los Angeles County Department of Parks and Recreation, said his agency is deploying IoT partially in hopes it may change the perception of IT personnel from mere service providers.

Among its deployments, county parks and recreation is using soil and weather sensors in parks to more efficiently send out arborists — and even to spot forest fires.

“All this data gets gathered and put into the system. This can serve for early rescue and early control for the fire department. But also managers, arborists, maintenance workers will have the ability to look at the system and see what’s happening real time,” Al Rawi said.

But there’s also a dark side to data, said Hunter Owens, data scientist for the city of Los Angeles, recalling the ignoble fate of the Microsoft chatbot Tay, which was designed to “chat” with Twitter users but was quickly manipulated by users last year into posting inappropriate content.

Owens demystified machine learning, translating its specialized vocabulary, but challenged listeners to think about the problems they want to solve with artificial intelligence and machine learning — and to think for themselves.

“Machine learning does not mean computers are learning like people learn. At the end of the day, most of machine learning is still human learning,” Owens said, calling for an end to “black boxes” that obscure so-called proprietary developments from even the public agencies that purchase them.

“We have a responsibility to our constituents and our residents to be able to justify all the decisions we make,” Owens said.

After the panel, Owens told Government Technology that agencies starting to consider big data, machine learning and AI “need to be keenly aware of both the challenges and opportunities presented.”

“Yes, we can determine who is at risk of homelessness before they become homeless, but we also run the chance of predicting longer prison sentences for those who may not be deserving of them,” Owens said. “You have to balance the desire to do good with the opportunity first, and be very clear in the trade-offs you’re making with your algorithm.”

Ethics and responsibilities, he said, may well be “up to us” in the data science community.

Theo Douglas is news editor for Government Technology. He was previously assistant managing editor for Industry Insider — California. His reporting experience includes covering municipal, county and state governments, business and breaking news.