US election prompts cities to get a grip on fake news
What’s the context?
On elections, climate, migration and more, local officials are increasingly compelled to fight disinformation
- Issue is new focus for local authorities
- Pandemic was turning point
- U.S. election compelling local action
Carey L. Biron
WASHINGTON – Running U.S. elections has always been a complicated job for local officials, who must corral hundreds of volunteers, stay on top of ever-changing legal requirements and, increasingly, combat misinformation and disinformation.
Running elections “has become one of the most high-profile responsibilities that county government does,” said Jennifer Liewer, deputy elections director for communications for Maricopa County in Arizona.
Maricopa, one of the country’s most populous voting jurisdictions, has been a hotbed of electoral scrutiny in recent years, weathering 50 lawsuits since the contested 2020 election, Liewer said.
“There have been a lot of allegations, false information and narratives that aren’t factually accurate that we have had to combat,” she said.
For Maricopa County, this has meant new staff, extensive fact-checking efforts, online cameras in tabulation centres – even an “official ballot” mascot that attends local professional basketball games.
“Our office has been working toward improving and communicating in a manner that we haven’t done before,” Liewer said. “That’s probably being seen around the country.”
Such work is part of a deeper trend that has emerged since the pandemic, with local officials increasingly forced to address false information about public health, migration, and urban planning strategies.
False information has changed every aspect of election administration, and local officials often bear the brunt of it given their visibility in their communities, said Amy Cohen, executive director of the National Association of State Election Directors.
“False information is one of the greatest challenges going into November,” she said, referring to the U.S. national elections on Nov. 5.
“That is the thing that keeps a lot of us up at night, because you can’t predict the narrative that will take off.”
Trusted messengers
Across the globe, local officials are a key, untapped resource in addressing rising false information, said Paul Costello, a senior manager with the German Marshall Fund’s cities programme.
Until recently, cities did not tend to treat false information as an issue in its own right, said Costello, who in recent months has been speaking with local officials for a new disinformation response “playbook”, released last week.
“I had an increasing number of city officials coming to me and saying, ‘What can we do?'” said Ika Trijsburg, a researcher with the Melbourne Centre for Cities in Australia, which led on the playbook’s development. Disinformation, she said, was blocking policymaking and even prompting threats.
In the British capital London, for instance, policy discussion over an ultra-low emission traffic zone last year sparked a wave of disinformation over migration, diversity and other issues.
In the Australian city of Onkaparinga, a proposed climate emergency declaration fanned social media outrage, protests and eventually the evacuation of local officials.
At the local level, false information tends to deal with public health, sustainability, migration or sexual diversity, Trijsburg said.
But elections can affect all of these areas, Costello warned. “It’s an opportunity for disinformation actors to supercharge what’s happening and to get a lot more traction.”
Cities do not have the intelligence operations or other tools that national governments can use to fight back. But local officials see the impacts up close and have unique access to organisations such as schools and sports clubs that can help counter false information.
Increasingly, that work also takes place online.
In 2021, San Jose in California partnered with local online influencers to address false information regarding vaccinations, masks and other urgent pandemic concerns – particularly among groups that had long distrusted local government.
“We had to rebuild trust simultaneously as we asked these groups to transact on very serious action,” recalled Andy Lutzky, former communications chief for the city.
For months, nearly 50 local “trusted messengers” created hundreds of social media posts in multiple languages, work that Lutzky said helped drive higher vaccination rates, particularly among marginalised communities.
Lutzky now works with Xomad, the company that organised that work.
The firm started focusing on such public campaigns in 2019 and has since worked with numerous cities, as well as more than 16 state governments in the past year alone, said the company’s founder and CEO, Rob Perry.
“My vision for cities around the world in the future is that each will have their trusted army of local messengers,” he said.
Sarina Alavi, a 25-year-old psychology PhD student in New York, worked with Xomad on a state campaign around substance use disorder, an issue she said was rife with false information online.
“It can be truly infuriating, especially when individuals are posing as experts without the proper licensure or certifications,” she said.
Alavi’s posts targeted false information on, for instance, how easy it is to ascertain the presence of a dangerous drug – “you can never really tell when a substance is laced and deadly #CanYouTell?” – and on the number of teens and adults with substance use disorder.
Her posts received more than 20,000 views, said Alavi, who is continuing to engage on similar projects.
This is “the future of government communications,” said Xomad’s Perry.
“We’re three to four years away from pretty much every state and many cities having a line item” for such work.
‘Buck stops with county clerks’
Some local officials are using the current election season to test new tools to combat false information.
In June, a fake video purported to show Utah Governor Spencer Cox admitting to fraudulently collecting ballot signatures, leading Utah County Commissioner Amelia Powers Gardner to release a public warning.
She also brought together academics and a local company to test out a “digital identity” programme aimed at helping candidates combat AI-created “deepfake” videos or audio recordings.
“For a long time the deepfake-generation platforms weren’t convincing enough to dupe anyone. We’re getting to the point now where they’re starting to cross that threshold, particularly for audio deepfakes,” said Brandon Amacher, director of the Emerging Tech Policy Lab at Utah Valley University, which is involved in the project.
Several campaigns are currently in talks with project organisers to use the verification programme, which will run through January.
“We’re starting to really see the potential dangers of this,” he said.
“The buck stops with county clerks and commissioners on election security – that’s their primary concern.”
(Reporting by Carey L. Biron; edited by Jon Hemming)