Congress is punishing Big Tech. Big Tech wrote the bill.
In January 2024, Mark Zuckerberg walked into a Senate hearing room, sat down across from a row of senators whose children use his products, and apologized. “I’m sorry for everything you have all been through,” he said, turning to face the parents of children harmed online who had been seated behind him. Cameras caught it. The clip went everywhere.
What the clip didn’t show: Zuckerberg had already spent nearly five years publicly calling for the government to regulate platforms like his — including a 2019 Washington Post op-ed arguing the internet “needs new rules.”
That detail is easy to miss in a room full of righteous anger. But it’s the most important thing that happened that day — and it explains more about where internet regulation is headed than anything the senators actually said.
The core immunity provision of Section 230 of the Communications Decency Act is 26 words long. Written in 1996 to protect the nascent web from the liability exposure that would have strangled it in infancy, it says that online platforms cannot be held legally responsible for content posted by their users. It is, depending on whom you ask, either the legal foundation of the free internet or a get-out-of-jail-free card that lets trillion-dollar companies profit from harm while facing none of the consequences.
Both parties have spent years building toward the same conclusion: it needs to go. Democrats argue platforms have enabled the exploitation of children, algorithmic radicalization, and the unchecked spread of dangerous content. Republicans argue platforms use their moderation powers to silence conservative voices. The specific grievances are different. The legislative destination is the same.
In December 2025, a bipartisan group of ten senators — Graham, Durbin, Grassley, Hawley, Klobuchar, Blackburn, and others — introduced the Sunset Section 230 Act, a bill that would repeal the law outright within two years. There are at least ten other reform proposals currently in circulation in the 119th Congress. The momentum feels real.
Here is the working hypothesis: it isn’t accountability. It’s consolidation. And the companies Congress is trying to punish are among the quietest advocates for what Congress is about to do.
We’ve seen this movie before. Not as a full feature, just a short: in 2018, Congress passed SESTA/FOSTA — the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act. The bills created the first-ever carve-out to Section 230, making platforms liable for content that facilitated prostitution or sex trafficking. The stated goal was unimpeachable: protect children, hold platforms accountable, give victims their day in court.
What happened instead is now a matter of public record.
Platforms didn’t surgically remove trafficking content. They nuked entire categories. Craigslist shut down its personals section the day after the Senate passed the bill. Reddit banned multiple subreddits related to sex work and escort services, part of a broader wave of content-policy purges. Instagram deleted accounts. Niche websites serving LGBTQ communities shut down entirely. Screening tools sex workers used to vet potentially dangerous clients — the digital equivalent of a neighborhood watch — disappeared overnight. The content that had made trafficking prosecutable, because law enforcement could monitor it, migrated to offshore sites and encrypted platforms that cooperated with no one.
Three years later, the Government Accountability Office issued its verdict: federal prosecutors had used FOSTA exactly once. No prosecutor had sought criminal restitution. No plaintiff had been awarded civil damages. And law enforcement officials told the GAO the law had actually made trafficking harder to investigate, because the platforms that used to cooperate with police were gone, replaced by offshore sites that didn’t.
The law solved nothing it promised to solve. It did plenty it never promised.
This is the mechanism the current repeal push refuses to reckon with. When you make a platform liable for content it can’t perfectly police, the platform doesn’t carefully excise the bad content. It eliminates the category. Platforms don’t do this because they’re cowardly or malicious. They do it because a legal system that holds you liable for what you don’t catch creates an overwhelming incentive to stop looking. The only way to not know is to not check.
The FOSTA story is not primarily a story about sex workers or LGBTQ communities, though the harm to those communities was real and documented. It’s a story about what liability does to platform behavior at scale. That story is about to be told again, with the entire internet as the subject.
Back to Zuckerberg.
In March 2021, Zuckerberg submitted written testimony to the House Energy and Commerce Committee. In it, he proposed his preferred version of Section 230 reform: make immunity conditional on whether platforms can demonstrate they have “systems in place for identifying unlawful content and removing it.” Platforms that build adequate content moderation infrastructure keep their protections. Platforms that don’t, lose them.
It sounded reasonable. The Electronic Frontier Foundation read it immediately for what it was: “an explicit plea to create a legal regime that only Facebook, and perhaps a few other dominant online services, could meet.” The senator who wrote Section 230 agreed. Ron Wyden, Democrat of Oregon, said plainly: “Mark Zuckerberg knows that rolling back Section 230 will cement Facebook’s position as the dominant social media company and make it vastly harder for new startups to challenge his cash cow.”
Zuckerberg’s proposal required platforms to demonstrate “adequate systems” for content moderation — systems “proportionate to platform size.” Facebook, at the time, employed roughly 15,000 content moderators and was developing AI tools to automate the work further. A compliance standard built around Facebook’s existing infrastructure is a compliance standard Facebook already meets. It is also a compliance standard that Substack, Bluesky, Wikipedia, your local newspaper’s comment section, and every independent forum on the internet cannot afford to meet.
This is not a new strategy. It has a name in economics: regulatory capture. George Stigler described it in 1971 — the tendency of regulatory frameworks to be shaped, over time, by the industries they’re meant to regulate, until the regulation serves the regulated rather than the public. Large corporations lobby for regulation of their own industries more often than you’d expect. When they do, it usually looks like public-spirited cooperation. And the compliance cost that is manageable for a trillion-dollar company is existential for its competitors.
Senator Blumenthal, co-sponsor of the Sunset Section 230 Act, said in December 2025 that Zuckerberg and his company had spent “a decade claiming to support Section 230 reforms in public, while their lobbyists and lawyers fight tooth-and-nail behind the scenes.” That framing is probably right about the lawyers. It may miss what the public position is actually doing. You don’t need your lobbyists to kill a bill if the bill, as written, does what you need it to do.
The bipartisan coalition behind repeal is united by anger and divided by everything else. Republicans want less content moderation — they believe platforms have suppressed conservative speech and want liability to deter that behavior. Democrats want more content moderation — they believe platforms have allowed harmful content to flourish and want liability to compel removal. These are not compatible goals. A liability regime that punishes too little moderation and a liability regime that punishes too much moderation cannot be the same liability regime.
Ted Cruz understood this. In a March 2026 Senate Commerce Committee hearing, he said directly that he was “concerned that a full repeal or sunset would lead platforms to engage in more censorship to protect themselves from litigation” — the exact outcome his own constituents are trying to prevent. He is co-sponsoring a bipartisan repeal bill while acknowledging it would produce the opposite of what he wants.
This is the coalition’s fatal incoherence. And it is precisely the incoherence that makes what comes next dangerous.
When you repeal a foundational legal structure without a coherent replacement vision — when the only thing your coalition agrees on is that the current law is bad — you don’t produce a better internet. You produce a vacuum. Vacuums don’t stay empty. They get filled by whoever has the most organized presence in the regulatory space when the dust settles. In Washington, that means the companies with the biggest compliance teams and the deepest lobbying infrastructure.
Senator Marsha Blackburn’s TRUMP AMERICA AI Act, a 291-page discussion draft released in March 2026, illustrates what fills the vacuum. The bill would repeal Section 230 entirely and replace it with a federal content governance framework spanning both platforms and AI systems. Platforms would face liability not just for user content but for “defective design” and “failure to warn” — with enforcement available to federal regulators, state attorneys general, and private plaintiffs alike. Once liability protections vanish, platforms are no longer free to host content neutrally. The question stops being whether reporting is accurate or sourced. It becomes whether hosting it could trigger legal risk.
This is not a hypothetical. It is FOSTA, at scale, with AI layered on top. And Congress is building it in a legislative environment where members cannot agree on what they actually want — only on the fact that the current arrangement has to change.
The children in this debate are real. The harms are real. Nobody serious disputes that platforms have enabled things they should not have enabled, and that the liability shield has at times made accountability harder to achieve. The argument here is not that Section 230 is perfect or that Big Tech deserves its current protections. The argument is that the repeal push, as currently constructed, will not produce what it promises.
Here is what it will produce: a legal environment where the cost of hosting user-generated content is high enough that only the largest platforms can absorb it. Where compliance with whatever replaces Section 230 requires infrastructure that startups, independent publishers, nonprofits, and local news outlets cannot afford to build. Where the platforms Congress is angriest at — the ones with legal teams, lobbying infrastructure, and AI moderation tools already deployed — are the ones best positioned to survive.
And where Mark Zuckerberg, who has been calling for this outcome for nearly a decade, gets exactly what he asked for.
The working hypothesis: Section 230 repeal is being framed as platform accountability and structured as platform entrenchment. The incoherence of the bipartisan coalition — Republicans wanting less moderation, Democrats wanting more — creates the legislative vacuum that a sweeping federal content governance framework is already being built to fill. The only entity with a consistent, legible interest in this outcome across the entire decade of debate is the one that wrote its own preferred reform and ran ads in the New York Times to support it.
What would change my mind
- If post-repeal legislation included durable carve-outs explicitly exempting small and independent platforms from new liability exposure, and those provisions survived conference — which nothing in the current bill landscape suggests they would.
- If the post-repeal legal landscape produced measurably more competitive alternatives to Meta and Google rather than fewer, contradicting the structural prediction that compliance costs consolidate markets.
- If the Blackburn AI bill’s Section 230 provisions were stripped before floor consideration, suggesting there was no organized effort to use the repeal moment as a vehicle for centralized content governance — and that the bill’s architecture was coincidental rather than designed.
Until then, watch what Zuckerberg does, not what Congress says.
Related: The Plaintiff’s Map — A parallel piece on how the absence of anti-SLAPP law in specific states functions as a legal structure favoring institutional plaintiffs — the same dynamic of legal cost as a weapon, one level down from platform liability.
Working Hypothesis tracks its published theses on a public scorecard. This piece makes three falsifiable claims: that Section 230 repeal will increase rather than decrease content censorship on large platforms; that the compliance cost of whatever replaces it will accelerate consolidation toward incumbent platforms; and that no durable small-platform carve-out will survive the legislative process. Check back in 24 months.
If you found this useful, the best thing you can do is forward it to one person who would push back on it. I’d rather be wrong in public than right in private.