WASHINGTON — On Thursday, Free Press released Big Tech Backslide: How Social-Media Rollbacks Endanger Democracy Ahead of the 2024 Elections, a report documenting Meta, Twitter and YouTube’s retreat from accountability in the run-up to the 2024 elections. Big Tech Backslide analyzes a year of platform-policy rollbacks, layoffs and reinstatements of dangerous and extremist accounts to show how the major platforms are failing to protect democracy and public safety.
The report lays out specific ways that platforms are backsliding: Meta, Twitter and YouTube have removed 17 policies that had guarded against hate and disinformation, and collectively the companies have laid off more than 40,000 employees and contractors, with many of those job losses occurring on trust and safety teams. In the report’s conclusion, Free Press calls on the platforms to implement a detailed set of policies at the outset of the U.S. presidential primary season in February 2024.
Big Tech Backslide finds that Meta, Twitter and YouTube have removed policies that once protected users, while laying off the content moderators tasked with flagging disinformation before it spreads too widely. The platforms covered in the report have reinstated and monetized previously banned accounts, restricted what fact-checkers can do, and stopped moderating COVID-19 misinformation and “Big Lie” election-denial content. Under Elon Musk’s erratic leadership, Twitter has been the worst offender.
Nora Benavidez, the report’s author and Free Press’ Senior Counsel and Director of Digital Justice and Civil Rights, said:
“The deluge of fake, hateful and violent content on social-media platforms is about to go from bad to worse in 2024 — in ways that will throw our democracy and lives into further chaos. Big Tech executives like Elon Musk and Mark Zuckerberg have made reckless decisions to maximize their profits and minimize their accountability.
“Big Tech Backslide sounds several alarm bells as the platforms fail to address major threats to platform integrity ahead of next year’s elections. Those threats include: supercharged disinformation and fake news stories spreading faster than ever before; the proliferation of deepfakes and generative AI technology that mislead voters about candidates and their positions; a permissive approach to discriminatory and exploitative political advertising; the spread of content discouraging voting and other civic participation; and incorrect or false imagery about polling locations.
“The year of backsliding adds up: Meta, Twitter and YouTube have eliminated 17 policies that once protected users and dismissed more than 40,000 people, many of whom once moderated content and served in other key roles. The likes of Musk and Zuckerberg must fix these ill-considered policy and personnel reversals immediately. The only way they can save their platforms and safeguard democracy is to reinvest in user health and safety, reinstate disinformation policies, and develop and launch new practices and policies to safeguard platform integrity. Big Tech Backslide offers a way forward.”
Big Tech Backslide urges social-media companies to:
- reinvest in the election-integrity, trust-and-safety and content-moderation teams needed to safeguard users;
- reinstate disinformation policies and bolster moderation to limit exposure to violence and lies;
- launch 2024 election-specific platform interventions in time for the February U.S. primaries;
- hold “VIP” and politician accounts to the same standards as those of other users;
- develop more efficient review and enforcement of political-ad content across languages; and
- develop better transparency and disclosure policies and regularly share core metrics data with researchers, journalists, lawmakers and the public.
The report also calls on U.S. lawmakers to codify reforms that:
- minimize data that companies collect and retain;
- ban algorithmic discrimination and establish guardrails against the abuses of AI technology;
- require regular platform transparency and disclosure;
- support a private civil right of action; and
- leverage agency and White House authority to pursue accountability.