The electoral landscape has fundamentally changed, and a newly published white paper outlines the resulting dangers while offering state and local governments suggestions for taking preventative action against potential hackers and bad actors.
Securing America’s Elections, published by Stanford University’s Cyber Policy Center, suggests that more needs to be done to protect electoral infrastructure at all levels of government, while also asking the question: how likely is it that foreign powers — like Russia — will attempt another large-scale intervention in United States elections?
Andrew J. Grotto, one of the co-authors of the paper, is in a position to know, having spent a year and a half working as the senior director for cybersecurity at the U.S. National Security Council. Grotto, who also served for several years on the Senate Select Committee on Intelligence, said that the kind of disinformation and hacking perpetrated by Russia is one of the biggest threats to the integrity of elections and the democratic process.
“Every government on earth either tries or wishes it could influence politics here, because we’re a superpower. That’s just the nature of things,” Grotto said, in an interview with Government Technology. “What made the Russian interference malicious, where they crossed a line, is that they, in many cases, deployed false or misleading information. They lied. They hid their identities, so that they weren’t straight with their audiences about what the author of a message was,” he said.
Russia’s attack was two-pronged: deploying disinformation to sway voters toward a Donald Trump victory, while simultaneously hacking into various Democratic institutions and releasing damaging documents to smear Hillary Clinton's image.
In the face of the potential for both disinformation and hacking, the report offers a number of potential solutions to bolster state and local defenses — including requiring that all vote counting systems provide a voter-verified paper trail, and establishing basic digital security norms for campaign officials.
Even more fundamental than most of these, Grotto said, is the basic defense of adequate funding and resources.
“State and local budgets, going back to 2008 when the financial crisis hit, have really been under significant pressures and just haven’t had the money, on average, to invest in IT refresh,” he offered. “In addition to resources, there has to be a risk management plan that guides decisions about where to invest scarce resources, time and energy, given the threat landscape.”
For smaller, less secure communities, this can often be a challenge.
“In many jurisdictions around the country, it’s not like many small towns have a CIO or CISO,” Grotto said. “That person, if they exist, may have three or four other jobs. So their ability to manage risk effectively may be constrained by a lack of experience, expertise, or focus,” he said. “Part of the answer has to be providing more guidance on appropriate risk management strategies.”
Still, Grotto noted, the likelihood of actual infrastructure penetrations is typically slim, and a bigger concern is the type of disinformation that Russia spread. These kinds of manipulations — which often consist of online deception, fake profiles and micro-targeted ads — can be deployed in any election, local, state or federal, because of the global nature of the Internet.
The worry is that these kinds of influence operations — which sowed confusion and helped delegitimize the democratic process — will become normalized and domesticated in the U.S., Grotto said.
“I think the Russians demonstrated a playbook that other actors — foreign and domestic — may find attractive to use,” he said. “That’s really worrying because if domestic actors are able to do this successfully it just makes it that much harder to identify the foreign aspect, and plus in general it’s just not healthy for democracy.”
The report makes a number of prescriptions for cutting down on disinformation, including strengthening tech companies’ ability to self-regulate by developing internal solutions to fake ads and profiles, and supporting the passage of the Honest Ads Act, which would further regulate advertising in political campaigns.
Still, the kinds of technology that support disinformation are growing in prevalence, Grotto said.
“New technologies ... deepfakes, machine learning, artificial videos, means there will be new avenues for adversaries to mislead [voters] and create and spread false narratives,” he said.
Going forward, the goal should be to ensure that this type of interference can be handled adequately, that intrusions do not create a crisis of confidence in the nation's political system, and that attacks can always be met with a course correction, he said.