> For example, YouTube shared videos containing COVID-19 misinformation 20 million times, generating 71 million reactions in eight months (Gallo, 2021).

How would reworking Section 230 fix this? Most of that information isn't punishable by the federal government; it falls under First Amendment protections.

A lot of the most dangerous speech online is protected speech; the exclusions here, like libel or incitement to violence, are very narrow. For better or worse, the government can't punish people for saying that vaccines are dangerous. The only entities that can legally crack down on that information are the private entities that control their own platforms.

I often find in critiques of Section 230 that people have (for lack of a better word) an optimistic view of what the government can and can't do with regard to speech. Remember that a lot of the TV content from stations like Fox News is not covered by Section 230, and it's still legal. If the government had the ability to shut that misinformation down, why would those networks still be operating today? Even regulations on how sorting algorithms work for social media are not certain to survive a Supreme Court challenge.

----

You link to the FCC rules on broadcasting; here's what they state:

> FCC rules specifically say that the "public harm must begin immediately, and cause direct and actual damage to property or to the health or safety of the general public, or diversion of law enforcement or other public health and safety authorities from their duties."

Covid misinformation is obviously bad and harmful; it has obviously made the pandemic worse, and people have died because of it. But it doesn't rise to the standard the FCC sets above: the Supreme Court has ruled multiple times that "immediate" harm is a narrow category, and that causing "direct and actual damage" is a high bar to clear. The reality is that even if the government got rid of Section 230, it couldn't ban vaccine misinformation from Facebook. At best, it could impose large liabilities that made Facebook very nervous about hosting unvetted speech of any kind, while making it dangerous for competitors or smaller companies to compete without a large legal team backing them up -- in other words, exactly the chilling effects and market consequences that people warn about whenever these bills come up.


