In most presidential elections, ballots in a majority of precincts are tallied by midnight on Election Day, and news outlets can project a winner before you go to bed. This November's election is likely to be different. Because of a surge in mail-in ballots, driven by people's reluctance to go to the polls in person with COVID-19 still coursing through the population, results are likely to be delayed: mailed ballots simply take longer to collect, count and report.
That 11-week interregnum between Election Day and Inauguration Day could also be rife with disinformation from all directions, as criminal hackers, hostile states and even domestic political forces try to shape people's perceptions of what actually happened. Lawsuits are also likely to proliferate if the outcome is not clear.
Lawmakers and disinformation experts say that during that period, social media companies must be prepared to confront the likely onslaught of lies and misleading information.
States that held presidential primaries in the past few months have already seen delays in reporting results because of an increase in voting by mail. Most states stipulate that the counting of mailed ballots cannot begin until Election Day, even if the ballots are received days or weeks earlier.
Such delays in November and potential lawsuits challenging results in some states could leave an information vacuum that could be filled with disinformation, said Graham Brookie, director and managing editor of the Atlantic Council’s Digital Forensic Research Lab, which tracks online disinformation.
“It’s inevitable that we’ll have convoluted election results and it’s inevitable there’ll be a period of time when results are confusing, and it’s not clear what’s going on,” Brookie said. From social media executives to federal, state and local officials, everyone needs to be prepared for “what could happen so they can avoid the information vacuum that’s extremely vulnerable to disinformation.”
Brookie previously served as an adviser on strategic communications in the Obama White House.
Rep. Eric Swalwell, D-Calif., who serves on the House Intelligence Committee and was a Democratic presidential contender, also is worried about a potential information vacuum during the 78 days of transition. That period could be filled by disinformation from not only Russia, China and Iran, but from President Donald Trump himself.
“My concern is that … we don’t really have to use our imagination, but look at what has happened in the last few years on Russian disinformation campaigns, with the president welcoming it and expanding it,” Swalwell said in an interview. “I’m seeing a situation where the president does not concede defeat, challenges the results in courts, and Russian actors amplifying that on social media with fake news stories about mail-in ballots, and creating confusion about the legitimacy of results. That’s my fear.”
If he loses, it is likely that Trump would seize on any disinformation from abroad to cast doubts on the results, Swalwell said. “We need social media companies to be prepared for that and counter that.”
In October 2016, then-candidate Trump told The New York Times that he might not accept the results if his Democratic opponent Hillary Clinton won, because the election may have been rigged.
Brookie said that Trump has been sowing that doubt for years, stoking fears about election outcomes, alleging nonexistent voter fraud and criticizing voting by mail, even though Trump, Vice President Mike Pence and several Republican lawmakers have themselves repeatedly voted by mail.
Going back to 2016, Trump has been engaging in “a form of disinformation that’s hedging bets,” Brookie said. It’s a “disinformation insurance policy that he can point to when he doesn’t get the results he wants, and this could be that election.”
Republicans in Congress, reluctant to dispute Trump, have been largely silent on what could happen after Election Day. But Tom Ridge, the first secretary of the Department of Homeland Security and a former Republican governor of Pennsylvania, has criticized Trump's comments on mail-in voting.
“I think it’s very sad and very disappointing that with almost five months to go, the president seems to (want to) try to delegitimize the November 3 election,” Ridge told National Public Radio last month. “It just seems to me that this may be an indication he’s more worried about the outcome than he’s worried about fraud.”
At a recent hearing held by the House Intelligence Committee, Swalwell asked top executives from Facebook, Twitter and Google about how they were preparing for the “most perilous times” between Election Day, which is Nov. 3, and Inauguration Day, Jan. 20, 2021.
Nathaniel Gleicher, head of cybersecurity for Facebook, told the panel that the company was “running red-team exercises and threat ideations within the company and with colleagues outside the company to ask when are the periods of greatest risk, what are the most likely threats.”
Red-team exercises refer to war-gaming drills used by military forces to prepare for a variety of battlefield scenarios. Red teams are made up of friendly forces who pretend to be the enemy to simulate realistic combat.
Gleicher said that Facebook is aware that “the period after the election is a critical one” because of “some particular characteristics with this election, given that we expect an increase, for example, in vote by mail ballots.”
Spokespersons for Facebook did not respond to questions seeking more details on how the company was going about its red-team exercises.
Facebook has been under intense pressure to crack down on Trump's violent rhetoric, such as when he said during the Black Lives Matter demonstrations that "when the looting starts the shooting starts."
Although Facebook CEO Mark Zuckerberg has declined to silence Trump's racist and violent language, the company recently banned a Trump campaign ad that used a Nazi symbol to refer to the antifa movement.
At Twitter, the company will enforce its rules about fake accounts and disinformation uniformly before and after Election Day, Nick Pickles, the company’s director of public policy and strategy, told the House Intelligence Committee. “We’ll take action on foreign actors and we’ll take action on domestic actors,” he said.
In May, Twitter attached warning labels and a fact-check to Trump’s tweets falsely claiming that mail-in ballots would lead to a “rigged” election. The company has labeled other Trump tweets as manipulated media.
Twitter also has entered into partnerships with government agencies, civil society groups and others to identify disinformation threats and to plan how to mitigate them, Jessica Herrera-Flanigan, the company’s vice president for public policy, said in an email.
Google spokeswoman Riva Sciuto pointed to the company's Threat Analysis Group, which tracks disinformation from 270 sources across 50 countries. The group removes content from Google's platforms, including YouTube, when users are found to be engaged in "coordinated influence operations," according to Google.
The National Association of Secretaries of State, which represents state officials responsible for conducting elections across 8,800 jurisdictions in the country, said it has been working with Facebook, Twitter and Google to address election-related misinformation and disinformation on their platforms. The group has launched a website called TrustedInfo 2020 to educate voters about where to get accurate information on polling places, voting times and results.
Individual states also are "conducting outreach efforts before, during and after the general election to educate their voters on accurate, state-specific information," said Maria Benson, a spokeswoman for the association.
©2020 CQ-Roll Call, Inc. Distributed by Tribune Content Agency, LLC.