Social media platforms such as Facebook, TikTok and Twitter say they are taking steps to prevent the spread of disinformation about voting and elections ahead of next month’s midterms.
A Facebook search for the words “voting fraud” first brought up an article alleging that staff at the Children’s Museum of Pennsylvania were brainwashing children into accepting stolen elections.
Facebook’s second suggestion? A link to an article from a site called MAGA Underground that says Democrats plan to rig next month’s midterms. “You should still be upset about the fraud that happened in 2020,” the article claims.
Less than three weeks before the polls close, misinformation about voting and the election is rife on social media, despite pledges from tech companies to tackle a problem blamed for growing polarization and mistrust.
While platforms such as Twitter, TikTok, Facebook and YouTube say they have expanded their work to detect and stop harmful claims that could suppress votes or even lead to violent confrontations, a review of some sites shows they are still playing catch-up from 2020, when then-President Donald Trump’s lies about the election he lost to Joe Biden helped fuel the insurrection at the US Capitol.
“You’d think they would have learned by now,” said Heidi Beirich, founder of the Global Project Against Hate and Extremism and a member of a group called the Real Facebook Oversight Board, which has criticized the platform’s efforts. “This is not their first election. This should have been addressed before Trump lost in 2020. The damage is pretty deep at this point.”
If these US-based tech giants can’t properly prepare for US elections, how can anyone expect them to handle overseas elections? Beirich asked.
Mentions of “stolen elections” and “voter fraud” have soared in recent months and are now two of the three most popular terms included in discussions of this year’s election, according to an analysis of social media, online and broadcast content conducted by the media intelligence company Zignal Labs on behalf of The Associated Press.
On Twitter, Zignal’s analysis found that tweets fueling conspiracy theories about the upcoming election were reposted numerous times, along with posts repeating debunked claims about the 2020 election.
Most major platforms have announced steps to curb disinformation about voting and elections, including labels, warnings and changes to systems that automatically recommend certain content. Users who consistently violate the rules may be suspended. The platforms have also partnered with fact-checking organizations and news outlets, such as AP, which is part of Meta’s fact-checking program.
“Our teams continue to closely monitor the midterms and work to quickly remove content that violates our policies,” YouTube said in a statement. “We will remain vigilant before, during and after Election Day.”
Meta, the owner of Facebook and Instagram, announced this week that it has reopened its Election Command Center, which oversees efforts to combat election disinformation in real time. The company rejected criticism that it was not doing enough and denied reports that it had cut election-focused staff.
“We are investing a significant amount of resources, with work involving more than 40 teams and hundreds of people,” Meta said in an emailed statement to the AP.
The platform also said that starting this week, anyone who searches Facebook using election-related keywords, including “election fraud,” will automatically see a pop-up window with links to trusted voting resources.
TikTok created a voting center earlier this year to help US voters learn how to register to vote and who is on their ballots. Information is offered in English, Spanish and more than 45 other languages. The platform, now a leading source of information for young voters, also labels misleading content.
“Providing access to authoritative information is an important part of our overall strategy against election disinformation,” the company said of its efforts to prepare for the midterms.
But policies meant to stop harmful election misinformation aren’t always consistently enforced. False claims can often be buried deep in the comments section, for example, where they can leave an impression on other users.
A report released last month by New York University faulted Meta, Twitter, TikTok and YouTube for amplifying Trump’s false statements about the 2020 election. The study cited inconsistent rules regarding misinformation as well as poor enforcement.
A number of groups, alarmed by the amount of misinformation about voting and elections, have called on tech companies to do more.
“Americans deserve more from platforms than gimmicks and half-measures,” said Yosef Getachew, director of Common Cause’s media and democracy program. “These platforms have been weaponized by the enemies of democracy, foreign and domestic.”
Election disinformation is even more widespread on smaller platforms popular with some conservatives and far-right groups, such as Gab, Gettr and Truth Social, Trump’s own platform. But these sites have a small audience compared with Facebook, YouTube or TikTok.
Beirich’s group, the Real Facebook Oversight Board, created a list of seven recommendations for Meta aimed at curbing the spread of misinformation ahead of the election. They included changes to the platform that would promote content from legitimate news outlets over party pages that often spread disinformation, as well as more attention to voter-targeted disinformation in Spanish and other languages.
Meta told the AP that it has expanded its fact-checking network since 2020 and now has twice as many Spanish-language fact-checkers. The company also launched a fact-checking tip line in Spanish on WhatsApp, another platform it owns.
Much of the disinformation aimed at non-English speakers appears designed to suppress their vote, said Brenda Victoria Castillo, CEO of the National Hispanic Media Coalition, who said the efforts of Facebook and other platforms are no match for the scale of the misinformation problem.
“We were lied to and discouraged from exercising our right to vote,” Castillo said. “And people in power, people like (Meta CEO) Mark Zuckerberg, do very little while they profit from misinformation.”