“I am very confident that we have done everything we can to make election infrastructure as secure and as resilient as possible. And we've been very clear that there is no information, credible or specific, about efforts to disrupt or compromise that election infrastructure,” said Cybersecurity and Infrastructure Security Agency (CISA) Director Jen Easterly during an Oct. 26 Center for Strategic and International Studies (CSIS) interview.
The 2020 election aftermath laid bare how misinformation and disinformation online can foment violent insurrection, and falsehoods have continued to spur threats of violence against election workers. Government officials at all levels and nonprofits have been working to dispel false claims and spread reliable information about tomorrow’s midterms.
Disinformation and misinformation activity is likely to spike after polls close, according to Bret Schafer, senior fellow and head of the information manipulation team for the German Marshall Fund’s Alliance for Securing Democracy.
“Election Day itself tends to be pretty quiet,” at least in terms of messaging from foreign sources, Schafer told GovTech. “The night of, and the following days, are when we start seeing things get a little bit wild.”
Since July, Schafer’s Midterm Monitor project has been tracking candidates’ social media messaging around voting and elections. For several years, he’s also monitored the narratives promoted by diplomats, state-owned news and other sources officially connected to the Chinese, Iranian and Russian governments (this would not capture covert foreign influence campaigns).
POST-ELECTION RISKS
False information comes from a variety of sources, and Easterly said that Chinese, Iranian and Russian attempts to influence elections are “a significant concern.”
Candidates themselves can also be a notable source of inaccuracies. Midterm Monitor examined secretary of state candidates’ Facebook behaviors and found that 63 of the 201 election-related posts they made between Sept. 1 and Sept. 27, 2022 (about 31 percent), advanced untrue narratives.
Post-election, Schafer sees two key risks. Tomorrow’s races include candidates who discount the 2020 election results, and these individuals are prone to push “problematic messaging” if they seem to be losing. Second, rumors are particularly likely to swirl as voters wait to hear final results.
“Absence of information is a very bad thing around elections, because it's very quickly filled by others, often with a very specific agenda,” Schafer said.
PRE-BUNKING THE COUNT
Differences among states’ election processes can lead to confusion, as voters wonder why they’re hearing results announced from one jurisdiction but not another. Proactively pushing out information about the differences and what voters can expect from their states can help dispel doubts, however, Schafer said.
Mail-in ballots must be processed before they can be counted, so results for these votes often take longer to report than results for ballots cast in person. State policies shape those timelines and differ widely.
Election workers in 38 states can begin processing mail-in ballots before Election Day, for example, while those in Maryland must wait until 10 a.m. on the Thursday after the election, according to the National Conference of State Legislatures (NCSL). And some states start counting mail-in ballots before Election Day, others on the morning of Election Day, and still others only after polls close.
Plus, Schafer said, Democrats have a higher tendency to vote by mail, which can cause significant differences between the unofficial election night results and the final tallies that include all votes. That can look suspicious to residents who are unaware of the reason for the shift.
The more election officials can push out clarifications and explanations about procedures and expectations, the better, he said. And, as with any operation, mishaps happen. When they do, jurisdictions need to be ready to provide explanations, fast.
“I think it does help … if there is a problem — and there's always some problem in every election — to have a really good communication plan in place to explain what's happened, why it happened, how it's being mitigated,” Schafer said.
Easterly spoke similarly during the CSIS event, saying the public needs to remember it can take days or weeks after Election Day to get final results. Election officials also asked her to spread the message that lack of a completely smooth Election Day is normal, not cause for alarm.
“There are going to be errors, there are going to be glitches — that happens in every election. But that's why there are multiple layers of security controls and resilience built into the system,” Easterly said. “Somebody will forget their key to the polling place; a water pipe will burst. These are normal things, they're not nefarious.”
NEW TRICKS VS. DIGITAL LITERACY
New techniques are also emerging in the disinformation landscape. Deepfake technology isn’t yet mature enough to cause significant impact, but “cheap fakes,” or lightly altered, misleading videos, are proliferating, said Alex Mahadevan, director of the Poynter Institute’s MediaWise digital media literacy training program. These include videos clipped to remove context, or videos with altered sound or speed.
“Like a video of [John] Fetterman that has maybe been slowed down to make it sound like he's talking slower than he is,” Mahadevan told GovTech. Concerns over Fetterman’s health following his stroke have reportedly impacted his campaign prospects. “Or maybe it's a video of a politician where somebody removed the applause, and so the video sounds very, very awkward.”
Fact-checkers can help call out tricks like these, but they cannot catch everything on all platforms, he noted. Digital media literacy training reaches further, however, preparing people to assess for themselves the trustworthiness of the content they encounter, and MediaWise has been expanding those efforts this year and next.
Helping residents become more misinformation savvy includes alerting them to the ways online information can be misleading and giving them a three-question approach to evaluating content.
“When you encounter something online that makes you feel emotional, or you hear something on the radio that makes you feel emotional, you need to ask who's behind the information? What's the evidence? And what are other sources saying?” Mahadevan explained.
His organization aims to help answer such questions by teaching skills like Google search tips and how to run a reverse image search to check whether a photo has been manipulated or presented out of context.
HOW MISINFORMATION TARGETS DIFFERENT COMMUNITIES
Efforts to combat misinformation can be tailored around how it is playing out in different communities. One demographic group may be more likely than another to see certain false narratives, and may encounter them on different media channels.
Disinformation has been reaching South Florida’s Latino voters through radio channels with unreliable hosts, for example, Mahadevan said. Meanwhile, many Indian Americans have been encountering targeted misinformation on WhatsApp, including unfounded claims about particular U.S. candidates supporting or opposing Hindu nationalism.
Different channels come with different challenges: For instance, untrue content on direct messaging apps like WhatsApp isn’t visible to fact-checkers the way that Facebook and Twitter’s public posts are. Meanwhile, Facebook and Twitter have been called out for failing to put warning labels on Spanish-language falsehoods.
“Communities that are the targets of disinformation are reached in different ways with those falsehoods,” Mahadevan said. “They process it in different ways. And it affects how they vote, how they interact with the world, differently.”
In recognition of this, MediaWise recently started connecting with community figures like civic organizations, religious leaders and elected officials to learn about the specific falsehoods impacting their communities. MediaWise then provides these leaders with one-hour workshops tailored to address those concerns.
The workshops are an opportunity to introduce attendees to MediaWise’s tools for spotting and verifying potential misinformation, so they can then spread those resources to the communities they serve. Some of these offerings include a texting-based misinformation course in English and a WhatsApp-based one in Spanish.
Individuals are better prepared if they know the kinds of falsehoods to expect and where they might encounter them.
Thus far, MediaWise has connected with groups serving Black, Latino and Asian American and Pacific Islander communities, as well as with AARP, and looks to provide misinformation toolkits to librarians next year.
Older adults, too, have their own needs. They tend to encounter falsehoods in the form of fraudulent Facebook posts and email scams. Preparing them against misinformation can also involve updating them on the latest ways technology can mislead; a recent MediaWise session, for example, demonstrated how AI can generate fake images.