Repealing Section 230
The Internet Remains Binary
Practically since the Web was spun, web site owners have found protection in Section 230 of the 1996 Communications Decency Act.
Under the act, platforms aren’t publishers and can’t be held responsible for content the way, say, I am. Substack can no more be sued over what I write here than the phone company can be sued over what I say on a call.
The problem is that Web content isn’t transient like a phone call. It’s published, like a newspaper. Practically since the act was passed, lawyers have been trying to break through the shield, using the act’s aim of protecting kids as their sword. This week the artists formerly known as Facebook were told to hand New Mexico $375 million over this issue.
The state created a fake profile of a teenager, like a “honeypot” meant to draw spam. It worked. The fake profile drew lots of child predators. The state then said that Meta “put profits over kids’ safety.” That’s always the censor’s case. A jury is also deliberating in a California case against Google, the plaintiff claiming to have become addicted to social media at a young age, with YouTube’s algorithm, not the videos it served, as the cause.
You can write what you want. Government can’t stop you. But what if you’re writing software? Section 230 protects software as though it’s speech. But is it?
Code is Still Law
Code is still law. But outside questions about speech, code is already governed by law.
If you’re writing code for Visa or Charles Schwab that can move money, you must follow a host of local, state, federal, and international laws. There are also private laws, like those from market makers, that must be obeyed. Failure to obey can put you out of business. If you’re told of a fault, refuse to fix it, and it gets exploited, you can even go to jail. Somehow, these companies manage to stay in business.
The argument of the plaintiff’s bar is that Google and Meta should be held to the same standards as Visa and Schwab. Once they’re notified that their software is abusing people, they should have an obligation to fix the bug or be held liable for the harm it caused.
The authors of Section 230 say they always faced a “crummy choice” between policing everything and policing nothing. That’s still the choice. Advocates for keeping the law say repeal would entrench Big Tech, raising compliance costs smaller competitors could not meet and turning every platform into a censor. Opponents say the law was meant to create neutrality, but today’s algorithms are anything but neutral.
Maybe the Meta programmers being sent home to prepare for layoffs should be told to buy suits and come back as lawyers.
Accountability
Both sides are right. As I’ve been writing since the law was passed, code is binary. The bit is on or it’s off. Law is analog. Calculus can only provide an estimate of a wave’s size. But calculus and saturation arithmetic do this well enough that listeners can’t tell the difference.
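To make the metaphor concrete, here is a minimal sketch of saturation arithmetic at work, assuming Python, a 48 kHz sample rate, and 16-bit samples; none of those details come from the article, they are just the usual audio conventions. A smooth analog wave gets clamped into a fixed binary range, and the approximation is close enough that a listener would not notice.

```python
# Sketch: approximating an "analog" wave with discrete, saturated 16-bit samples.
# Sample rate, bit depth, frequency, and gain are illustrative assumptions.
import math

SAMPLE_RATE = 48_000                  # samples per second
BIT_DEPTH = 16                        # signed 16-bit, CD-style audio
MAX_VAL = 2 ** (BIT_DEPTH - 1) - 1    # 32767
MIN_VAL = -(2 ** (BIT_DEPTH - 1))     # -32768

def saturate(x: int) -> int:
    """Clamp a value into the representable range instead of letting it wrap."""
    return max(MIN_VAL, min(MAX_VAL, x))

def quantize_wave(freq_hz: float, seconds: float, gain: float = 1.2) -> list[int]:
    """Sample a sine wave and quantize it; gain > 1 forces some saturation."""
    n = int(SAMPLE_RATE * seconds)
    return [
        saturate(round(gain * MAX_VAL * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)))
        for i in range(n)
    ]

samples = quantize_wave(440.0, 0.01)   # 10 ms of a 440 Hz tone
print(min(samples), max(samples))      # clipped, but stays within -32768..32767
```

The point of the sketch is the clamping: the binary representation can never hold the whole wave, yet the estimate is good enough for the ear, which is the author's analogy for how analog law and binary code might coexist.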
Accountability requires notice of harm and a process. If someone tells me I lied in an article, or violated copyright, I can argue the point or remove the content. I’m a little guy. I remove the content. I don’t often face this problem, because I’m careful and because I’m usually not worth bothering about. In either case, libel and copyright law are clear.
Applying these laws to code that works on speech is the problem. But regardless of what America does, there’s a whole world out there that will make other choices. The real danger is not that Section 230 will be repealed, but that the Internet will become balkanized, with each nation rigorously censoring software across borders, and going after the authors. In that world, maybe nothing is said.
But we’ve been dealing with that fear since Gutenberg. The Internet is now about as old as printing was when Luther nailed his 95 theses to the door of a church in Wittenberg. This debate over code is going to spiral, and no one, not even Mark Zuckerberg, is prepared for it.



