Censorship accusations loom over Big Tech hearing on election threats
Top policy executives from Meta, Microsoft, and Google testified before the Senate Intelligence Committee on Wednesday about what they’re doing to protect US voters from foreign election threats in 2024. But the elephant in the room was the pressure campaign social media companies have faced from the right to take a more hands-off approach when it comes to labeling or demoting misinformation.
With 48 days until the US presidential election, Committee Chair Mark Warner (D-VA) called in tech companies to discuss the threats they’re seeing so far and how they’re responding. Warner took pains to emphasize that his chief concern was foreign malicious activity, not domestic — seemingly in an effort to find common ground with his Republican colleagues. He stressed the bipartisan interest in preserving election integrity, pointing to bipartisan funding for election upgrades and to election-related AI deepfake laws that have passed in both red and blue states.
But Vice Chair Marco Rubio (R-FL) said the issue of foreign influence online is “complicated” by the fact that foreign agents often seek to amplify preexisting views that Americans hold. He worried that taking down those who amplify legitimate American viewpoints casts a stigma on people who truly hold those beliefs. Rubio pointed, for example, to the lab leak theory, a minority hypothesis about the origin of the virus that causes covid-19. The theory was widely panned by the scientific community in the early days of the pandemic but came to be taken more seriously with more time and information, even if it wasn’t wholly adopted.
Foreign actors are often amplifying views real Americans hold — just very fringe ones
Warner said he agreed with Rubio that Americans can say whatever they want “no matter how crazy.” But, he said, “there’s a difference when foreign intelligence services cherry-pick and amplify it.”
The line between unconstitutional government coercion and permissible pressure was at issue in a recent Supreme Court case, Murthy v. Missouri. Republican attorneys general had accused the Biden administration of coercing tech platforms to remove or demote speech like covid-19 disinformation, leading to temporary restrictions on the White House’s communication with tech platforms.
The Supreme Court decided the AGs didn’t have standing and questioned whether companies were really responding to government pressure. The decision cleared the path for the government to communicate with tech companies about misinformation and other election threats. Warner told reporters after the hearing that communication between the government and tech companies is already “much better.” But he lamented during the hearing that “we are less safe today because many of those independent academic reviewers have been litigated, bullied or chased out of the marketplace,” referring to institutions like the Stanford Internet Observatory.
In their responses to lawmakers, the tech executives were mindful of the political minefield they faced. Microsoft president Brad Smith, for example, said that two key principles in the approach to election threats should be preserving the right to free expression and defending the public from the deceptive tactics of foreign nation-states. Meta president of global affairs Nick Clegg told Rubio that when it came to the company’s handling of covid content, “I think we learned our lesson” that when governments exert pressure, the company needs to act “independently.” Google president of global affairs Kent Walker told Sen. Tom Cotton (R-AR) that the company left up the controversial New York Post story about Hunter Biden’s laptop after an independent investigation.
Meanwhile, the executives fear the true challenges of this election cycle are still to come. Several executives and lawmakers acknowledged there hasn’t yet been the major AI bombshell many anticipated, but they predicted the days right before and after the election will prove to be the biggest test of foreign influence protections. Speaking to reporters after the hearing, Warner said he has doubts about how much “visibility” the public has into the robustness of corporate trust and safety teams, amid earlier reports about waves of layoffs that hit those teams. And looking to the days and hours right after polls close, Warner said, “presuming it’s a close election, what a vulnerable time.”