
After Buffalo, Media and Tech Can’t Look Away Any Longer

This tragedy should be a catalyst to a fundamental reckoning.

This piece originally appeared in Tech Policy Press.


There’s a troubling rush to do something — seemingly anything — to strike a blow against the tech companies. But not every solution being cooked up on Capitol Hill right now is a good one. Democrats and Republicans alike have major problems with the tech companies, and there’s bipartisan consensus that legislation is needed — even if the different outcomes lawmakers seek are often diametrically opposed.

One thing is certain: It’s past time for decision-makers in Washington, D.C., to confront the power of companies like Facebook. But exactly how to do it is a challenge, as near-daily revelations show how tech companies continue to profit even as they harm the public in profound ways.

Too many bills on the move have major issues or massive loopholes that need to be closed. If these bills proceed unchanged, they risk opening the door to legal challenges that will unravel existing protections and potentially break the internet and encryption.

We need legislative and regulatory action to stop the widespread abuses of our personal data, which companies exploit to discriminate and to weaponize narratives that disproportionately harm communities of color. There are toxic business models at work in Big Tech, and in Big Media too, that cause extensive real-world harms Congress and regulators must address. These concerns prompted the creation of the Disinfo Defense League's policy platform, developed by advocates for those most affected: a set of legislative, regulatory and international actions to stop the spread of disinformation and build a media system that serves the public interest.

Some of these recommendations are already moving. New leadership at the Federal Trade Commission is poised to initiate a rulemaking on abusive data practices. Lawmakers have introduced promising legislation centering civil rights: For example, the Algorithmic Justice and Online Platform Transparency Act would prohibit the discriminatory use of personal information and provide much-needed transparency into the use of algorithms and content moderation. And the Fourth Amendment Is Not for Sale Act would stop companies and shady data brokers from selling personal data to law-enforcement agencies without a warrant — prompting dozens of advocacy and rights organizations to call on Congress to hold hearings on the bill.

But despite these promising signs, several problematic bills are seeing markups and possibly heading to floor votes, with consequences that should give pause to anyone dedicated to protecting the open internet and stopping the spread of hate and disinformation. That's why we've recently expressed concern about, and opposed, many of these bills.

The reform instinct behind them may be worthy, and we believe a surgical approach to amending many of these bills could remedy their flaws; but Congress needs to actually make those amendments.

Here are some of our biggest concerns on bills that address competition and harmful content:

The American Innovation and Choice Online Act

Promoting competition is one way to hold Big Tech more accountable to the people, but it wouldn’t solve all of the problems we’re facing. For instance, when it comes to competing for attention, disinformation often beats out thoughtful and accurate information — thanks in large part to business models that capitalize on rage and call it engagement. And the American Innovation and Choice Online Act, which recently moved out of committee in the House and the Senate, could cause more problems than it solves unless a significant fix is made.

The bill — which applies only to the largest tech companies like Amazon, Apple, Facebook and Google — could harm ordinary people by undermining platforms’ ability to remove hate and disinformation.

That glaring loophole comes in the form of a provision that says these companies must treat all “similarly situated business users” the same — which opens the door for businesses that peddle disinformation and hate to claim that platforms are “discriminating” against them when their content is removed or they are deplatformed. But creating a requirement for platforms to host or amplify Nazis, white supremacists and the rest of the worst of the worst helps no one.

Open App Markets Act

A provision in this bill could mandate that app stores keep hate speech and misinformation on their platforms. The bill attempts to promote competition and reduce gatekeeping in the app economy, but it would open the door to dangerous and chilling litigation claiming that any move by Apple or Google to remove an app from their stores is really just a way to favor the platforms' own products and business partners. To avoid the cost and trouble of these lawsuits, companies would be more likely to abandon content-moderation practices altogether.

This approach has drawn criticism from organizations and scholars who can find themselves at odds on other issues. Free Press Action joined a letter detailing these concerns that was sent to the Judiciary Committee chairs ahead of the bill’s markup. The letter proposes amendments to make the bill stick to the antitrust issues senators say they want to tackle.

Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (EARN IT)

Section 230 was once a little-known provision in the Communications Decency Act. But it was thrust into the national spotlight when it became a target of both former President Trump and President Biden. It provides a liability shield for platforms that host third-party content — ranging from websites like YouTube to comments on newspaper articles — and says the hosting platform isn’t liable (usually) for what those third parties say.

There is widespread interest in reforming Section 230, but solutions range from full overhauls to small tweaks. The last two years alone saw nearly 40 different legislative proposals put on the table.

Free Press Action hasn't endorsed any of the bills that would change the much-discussed but unjustifiably maligned liability shield in Section 230, and we've opposed many bills that would amend it. We've testified about the importance of that law, which simultaneously preserves outlets for free expression and lets platforms take down hate. But we also explained that the shield has been misinterpreted and wrongly applied in too many cases, and noted that changes to Section 230 could help clarify that platforms are indeed accountable for their own harmful conduct and content.

That means there are some good ideas out there on this law as well as some harmful ones. In fact, the EARN IT Act (first introduced in 2020 and newly returning) is one of the worst. The bill blows holes in Section 230’s liability shield — something that would impact any website that hosts third-party content, not just big platforms like Facebook, TikTok, Twitter or YouTube. And it also poses a huge threat to end-to-end encryption.

There’s an obvious appeal in holding companies accountable for the horrific content they actively promote or fail to take down once they have knowledge of it. But that doesn’t mean it’s a good idea for the tech companies themselves to pre-clear and pass judgment on everything their users post. And it’s an even worse idea to let an attorney general do that instead, which is what EARN IT essentially proposes.

The root of the issue for companies profiting from hate, violence and disinformation is in their data practices themselves. Instead of dismantling too much of Section 230 and risking the nature of the internet as we know it, a better approach would be for Congress to pass bills that prohibit extractive and abusive data practices. They’re what makes it so easy for bad actors to target our communities in ways that endanger our health, our safety and our democracy.

Journalism Competition and Preservation Act

The JCPA would give broadcasters, publishers and other news producers an “antitrust exemption,” in theory to collectively negotiate (i.e., collude) against powerful online platforms like Facebook and Google. But using antitrust immunity to address the long-running local-news crisis isn’t the right way to support a competitive, thriving and independent press in the United States. Indeed, this would lead to a “news media cartel by statute” — and harm smaller publishers that feature diverse and dissenting viewpoints but lack the leverage to benefit from any such negotiations. (And that assumes such negotiations would produce any payments to journalistic outlets at all, which is a stretch.)

The list of JCPA supporters alone should raise eyebrows: It includes lobbyists working for Rupert Murdoch's News Corp., the National Association of Broadcasters, the News Media Alliance, and large media conglomerates and hedge funds. Raising serious concerns about the legislation are media-rights organizations and pro-democracy groups that regularly work with journalists and have a proven record of fighting for a free press.

The bill doesn't address the fundamental problem plaguing the production and distribution of news and information today: a failed commercial marketplace. Thousands of communities across the country lack local and responsive sources for news, and these are the same communities that have suffered the most under the companies lobbying hardest for this legislation.

If Congress wants to address the news-and-information crisis in America, it should look to models like the Civic Information Consortium in New Jersey, which invests public dollars to fill in gaps left by the commercial media market. This actually supports the information needs of BIPOC, rural and poor communities, who have long been misrepresented, maligned or outright ignored by local news. To replicate this model on a national scale, Congress could institute a tax on the massively profitable online-advertising sector to create a public fund to support local journalism — and it should reject any schemes to let Rupert Murdoch negotiate with platforms for a bigger piece of the pie.

Careful action is needed

Each of the problems that big platforms pose requires different solutions. But the debates have flattened the details. And when we’re trying to craft good public policy, details matter.

We know the awesome power of media and tech firms to set agendas, social norms — even reality — for the public. At Free Press Action, we believe there is a great danger in the consolidation of power and perspective, and we celebrate the open internet for its possibilities and potential. We have battled mergers between big media companies and fought to protect the open architecture that the internet was built on to ensure that broadband is treated like the essential utility it has become.

We know that consolidation leads to fewer perspectives and believe that it is negligent for the Federal Communications Commission to continue rubber-stamping mergers and license renewals for broadcasters and media outlets that are failing to serve the public interest. We’re fighting to win back permanent Net Neutrality protections for the open internet and increase affordability and access for all — especially communities impacted by systemic racism. And we’re also battling with tech platforms every day to stop the spread of hate and disinformation online.

The good news is that there are regulatory and legislative actions that can address these problems. But we need careful action to solve the important problems our society and democracy are facing from Big Tech and Big Media alike.
