“The bigger impact, in a lot of instances, is every time there is a data breach, there’s a continued erosion of trust by our constituents, the business community, those people that we serve as government agencies,” he said.
The problem presents a moving target, but efforts to improve data awareness and inventorying, incorporate encryption efforts into larger cybersecurity strategies and employ strong authorization and access controls can all make a difference, speakers said.
“The technology has improved, the tool sets available to local governments have improved … but the scope and scale of the problem is increasing,” Godsey said.
THE ATTACK SURFACE BROADENS
The public-sector push to modernize has been a mixed blessing, speakers said. The COVID-19 pandemic spurred agencies to embrace the cloud, but this rush into hybrid environments also complicated data management and oversight, adding more locations where data may be stored.
“With the adoption of new technologies, getting our arms around — from an asset management perspective — where that data is stored, what you have, and what the sensitivity of that data is, is a challenge,” Godsey said. “It’s kind of like, we’ve now taken our internal file storage, and we basically use the entirety of the Internet as a potential repository. So, you can imagine the size and the complexity of that problem.”
This hampers cybersecurity efforts, because governments cannot accurately evaluate their risks until they have a clear inventory of what they’re trying to protect.
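That inventory work can start simply. A minimal sketch of a data-asset catalog (the asset names, locations and sensitivity tiers below are illustrative, not from the panel) pairs each data store with where it lives and how sensitive it is, so the riskiest combinations surface first:

```python
from dataclasses import dataclass

# Illustrative sensitivity tiers; real classification schemes are richer.
TIERS = ("public", "internal", "confidential", "restricted")

@dataclass
class DataAsset:
    name: str
    location: str     # e.g., "on-prem" or "saas:vendor-x" (hypothetical labels)
    sensitivity: str  # one of TIERS

    def __post_init__(self):
        if self.sensitivity not in TIERS:
            raise ValueError(f"unknown tier: {self.sensitivity}")

inventory = [
    DataAsset("permit-records", "on-prem", "internal"),
    DataAsset("resident-pii", "saas:vendor-x", "restricted"),
]

# Risk review starts with the most sensitive data held off premises.
offsite_sensitive = [a.name for a in inventory
                     if a.location != "on-prem" and a.sensitivity == "restricted"]
print(offsite_sensitive)  # -> ['resident-pii']
```

Even a flat list like this makes the question "what do we hold, where, and how sensitive is it?" answerable, which is the precondition for the risk evaluation Godsey describes.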
Governments selecting cloud providers are also bringing more players into their operations, meaning that a cyber incident impacting one of those companies — or one of the companies’ own vendors — could impact the public-sector client, Godsey said. While government has always worked with vendors, moving to cloud makes it especially important to vet third-party and fourth-party risk.
He recommended agencies regard their enterprise assets as including their data, cloud providers and the firms on which those cloud providers depend.
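Treating suppliers-of-suppliers as part of the enterprise can be made concrete with a dependency walk. The sketch below (vendor names and the dependency map are hypothetical) enumerates every third- and fourth-party firm reachable from an agency:

```python
from collections import deque

# Hypothetical vendor dependency map: each provider lists its own suppliers.
DEPENDS_ON = {
    "agency": ["cloud-a", "saas-b"],
    "cloud-a": ["cdn-x"],
    "saas-b": ["cloud-a", "analytics-y"],
}

def all_suppliers(root: str) -> set:
    """Every third- and fourth-party firm reachable from `root`."""
    seen, queue = set(), deque(DEPENDS_ON.get(root, []))
    while queue:
        vendor = queue.popleft()
        if vendor not in seen:
            seen.add(vendor)
            queue.extend(DEPENDS_ON.get(vendor, []))
    return seen

print(sorted(all_suppliers("agency")))
# -> ['analytics-y', 'cdn-x', 'cloud-a', 'saas-b']
```

An incident at "cdn-x" never touches the agency's contracts directly, yet the traversal shows it still sits inside the agency's risk surface, which is the point of vetting fourth-party risk.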
Agencies also need to look internally and ensure staff stay trained on best practices for handling both sensitive and non-sensitive data, said panelist Matthew Lamb, a Prisma Cloud solutions architect manager focused on state, local and education entities at cybersecurity firm Palo Alto Networks.
WORKING WITH ENCRYPTION
As governments adopt more cyber tools and techniques to protect their data, they also need to be attuned to the trade-offs involved and adjust their strategies accordingly.
Encrypting data — both in transit and at rest — means that any hackers who do manage to intercept it will have to find a way to undo the encryption before they can use it, noted Carmen Taglienti, principal cloud and AI architect for IT solutions provider Insight.
That’s a useful defensive measure, but one that agencies’ existing security approaches may not have accounted for.
Godsey said agencies’ cyber strategies often assume they can monitor the data flowing in and out of the enterprise, which would help tip them off to improper data exfiltration indicative of a breach. But this visibility is harder to achieve when agencies encrypt the data or hand off data storage to third-party clouds. That forces entities to update how they think about data security.
“The other thing that we are doing that’s problematic is, a lot of the tools that we employ from a cybersecurity perspective are such that it makes the assumption that you have insight and can see the data that’s coming into your environment and leaving the organization, regardless of whether it’s on-prem or is in the cloud,” Godsey said. “Our inspection and our ability to use tools to see if data is leaving inappropriately out of our environment may be inhibited by that encryption... and then with that data being stored in third-party cloud repositories, we may not have the same ability to apply the same tools and security controls in place.”
The process of encrypting data for protection and then decrypting it to use it takes time — one reason entities don’t just encrypt everything, Taglienti said. Agencies need to decide what level of encryption best suits their needs.
They’ll also need to ensure they can keep their encryption keys safe, and Taglienti recommended regularly cycling these keys to reduce the risk that any hacker who obtains a key can make use of it.
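The round trip Taglienti describes — encrypt at rest, decrypt to use, periodically re-encrypt under a fresh key — can be sketched in a few lines. The cipher below is a deliberate toy (SHA-256 in counter mode as a keystream) used only to keep the example self-contained; production systems should rely on an audited library and an authenticated cipher such as AES-GCM:

```python
import hashlib
import itertools
import os

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream: SHA-256 over key + counter. NOT a vetted cipher.
    out = b""
    for ctr in itertools.count():
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        if len(out) >= n:
            return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

plaintext = b"sensitive constituent record"
old_key, new_key = os.urandom(32), os.urandom(32)

stored = encrypt(old_key, plaintext)                  # encrypt at rest
stored = encrypt(new_key, decrypt(old_key, stored))   # rotate: re-encrypt under new key
# old_key can now be retired; only new_key decrypts the stored data.
assert decrypt(new_key, stored) == plaintext
```

The rotation step also illustrates the trade-off from the preceding paragraphs: every cycle costs a decrypt and a re-encrypt over the whole data set, which is one reason agencies weigh what to encrypt and how often to rotate.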
ACCESS AND AUDITS
To further reduce the risk of breaches, agencies can carefully control who gets permission to access data in the first place. That means ensuring any person — or device — seeking access is authenticated first, and that only those users with legitimate needs are approved.
Entities might grant permissions based on users’ roles, or they might authorize users based on whether they hold certain attributes. The latter is a more “refined” approach, and an increasingly popular one, Taglienti said.
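The distinction between the two models can be shown side by side. In this sketch (roles, attributes and policy rules are all illustrative), the role-based check asks only "what is this user's role?", while the attribute-based check weighs properties of the user and the resource together:

```python
# Role-based check: permission follows entirely from a user's role.
ROLE_PERMS = {"clerk": {"read"}, "auditor": {"read", "export"}}

def rbac_allows(role: str, action: str) -> bool:
    return action in ROLE_PERMS.get(role, set())

# Attribute-based check: permission follows from attributes of the user
# and the resource, allowing a finer-grained policy than roles alone.
def abac_allows(user: dict, resource: dict, action: str) -> bool:
    return (user.get("clearance", 0) >= resource.get("sensitivity", 0)
            and user.get("department") == resource.get("owner_dept")
            and action in user.get("allowed_actions", set()))

print(rbac_allows("clerk", "export"))  # -> False
print(abac_allows({"clearance": 3, "department": "finance",
                   "allowed_actions": {"read"}},
                  {"sensitivity": 2, "owner_dept": "finance"}, "read"))  # -> True
```

The attribute-based policy is "refined" in exactly the sense Taglienti suggests: two users with the same job title can receive different answers depending on clearance, department or other context.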
Putting all this advice into practice can be tricky, and Godsey said agencies should start by assessing their current security postures to see where they could improve. Lamb also urged agencies to regularly audit to check how well they’re continuing to follow their strategies and policies.
Maricopa, for example, runs a self-assessment at least twice a year, and every few years brings in a third party to analyze it. The latter is helpful on a practical level and an “organizational” one, by bringing an independent outsider’s perspective to the situation, Godsey said.
Regular checks also help agencies ensure they’re keeping their approaches updated as technologies evolve.
Trying to ramp up defenses may seem daunting, and organizations can only hope to reduce the likelihood and damage of breaches, not fully prevent them, Godsey said. Still, he urged agencies to recognize that any improvements they’re able to make, even small ones, put them on better footing.
“Tackle what you can,” he said.