Law & Public Policy Blog

The Growing Case for Revisiting Section 230

Alexander Fried, Law & Public Policy Scholar, JD Anticipated May 2022

What do the following claims have in common?

“Nancy Pelosi diverting Social Security money for the impeachment inquiry”

“Trump’s grandfather was a pimp and a tax evader; his father was a member of the KKK.”

“AOC (Alexandria Ocasio-Cortez) proposed a motorcycle ban.”

First, all of these claims are patently false. But perhaps more importantly, they were widely believed to be true. Each was among the most-viewed fake news stories on Facebook in 2019. The growing influence of disinformation, and its disastrous effects on our society, has become a commonplace theme in recent American discourse. From our elections to sports, disinformation touches everything, and it is now discussed regularly around dinner tables.

However, we have just reached a turning point in how disinformation affects society. In the past month, false claims regarding the coronavirus, ranging from supposed cures to accusations and outright lies, have flooded our households. These assertions have run from bleach being a cure for the virus to the virus being bioengineered by China as a weapon. Disinformation has plagued important subjects before, such as climate change and foreign interference in elections, but never before have our health and lives been so directly and quickly impacted by both the lack of trust in truthful information and the overwhelming presence of falsehoods on our televisions and computers. The way Americans consume information is broken, and civil society's coronavirus response reflects that.

The United States now has to deal with two different epidemics: the virus itself and rampant disinformation, which inhibits civil society's ability to function. This is neither the first nor the last problem that will require a collective response from Americans. In the coming years, climate change, an increasingly automated workforce, and deepening social polarization will all demand collective action and solutions. But collective solutions are impossible when there is no consensus on what the problem is or how it can be fixed. The United States needs to renew public trust in information. Blatantly false information that reasonably induces action must therefore be checked.

Currently, this is impossible thanks to Section 230 of the Communications Decency Act of 1996. Section 230 is a short provision, roughly 1,000 words, that reads in part: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In 1996, this made sense. Websites were only beginning to exist, and revenue generation was a distant fantasy. Back then, websites had relatively low traffic and lacked the capability to police their users. If a user “created” content, the website would not be held liable for it.

But now, Facebook, Twitter, and other websites all operate platforms that allow, and arguably encourage, disinformation. Although none of these companies create content, their platforms' enormous reach, not the content creators alone, is the reason disinformation has become so harmful. And we have seen the effects of this harm over the past few weeks. The effect is twofold. First, falsehoods about the virus continue to spread on social media. From racist accusations against the Chinese to more serious allegations of governments deliberately sowing chaos, disinformation has infected our population with ugly and dishonest beliefs. Second, the presence of this disinformation breeds general distrust of real, honest information. It not only propagates false views; it also drives civil society further apart. And when there is a real threat to the United States that requires collective action, this distrust causes real damage.

Although the internet has changed, the law has remained essentially the same. Websites like Facebook or Backpage are immune from liability for claims arising from content posted on them. This is an overly broad protection. Companies are shielded from “claims of negligent misrepresentation, interference with business expectancy, breach of contract, intentional nuisance, violations of federal civil rights, and emotional distress.” This law allows big tech to shrug at human trafficking, ignore foreign interference in our elections, and even permit nonconsensual nude images to be posted on their platforms. Again, Section 230 says these companies are neither speakers nor publishers of content. In short: if you did not create it, you cannot be sued for it, even if your platform spreads it to 2.5 billion users. This needs to change.

Luckily, there is broad bipartisan support for reform. Former Vice President Joe Biden told the New York Times editorial board that Section 230 should be revoked in its entirety. Attorney General Barr has stated that he “is concerned that Section 230 immunity has been extended far beyond what Congress originally intended.” He added that “Section 230 has enabled platforms to absolve themselves completely of responsibility for policing their platforms, while blocking or removing third-party speech—including political speech—selectively, and with impunity.” Missouri Senator Josh Hawley (R) has gone so far as to introduce legislation calling for unprecedented regulation of the internet. The New York Times has likewise published an op-ed examining both sides of Section 230 reform. Nearly everyone in Washington agrees the law needs to do more to hold internet companies liable. In a divided and now socially distant world, here is something that brings us together.

There are serious concerns with revoking Section 230 outright, however. Free speech purists have declared Section 230 akin to the First Amendment, claiming that revoking it in its entirety would push internet companies to censor anything that could prompt a lawsuit, regardless of whether it is true. The lobbying arm of big tech, the Internet Association, has claimed that Section 230 is “the one line of federal code that has created more economic value in this country than any other.” While the benefits are exaggerated and the warnings hyperbolic, Section 230 is undeniably responsible for the growth of the internet, especially social networking.

But times have changed since 1996. Backpage has made millions of dollars off of sex trafficking. Facebook is a Fortune 100 company, and its platform was used to influence a nation. Countless instances of revenge porn, death threats, and other gruesome content appear on forum sites every day, and the victims have no legal recourse against the forum that either passively tolerates such abuse or, as in the case of Backpage, actively encourages it.

Opponents of Section 230 reform also point out that Facebook and other websites have taken steps to self-police. The problem is that until Section 230 is revoked or changed, we can only take them at their word. Until individuals have legal recourse, their immunity means we have no way to assess how much effort these companies actually put into policing their platforms.

All of these issues have been considered and discussed before. But now we have hit a milestone in this saga. Uncertainty over what needed to be done about the coronavirus, fueled by everything from claims that the virus is a hoax to suggestions that we all drink bleach, is going to have a devastating effect on our lives. Changing Section 230 to reflect a modern internet, where some responsibility, even if just a tiny sliver, lies with Big Tech, would force online platforms to reasonably regulate what news is spread and help restore America's trust in information. The alternatives are to do nothing and let disinformation go unchecked, or to pass sweeping legislation that polices the internet. If I were in Big Tech's shoes, I would start talking about reasonable adjustments to Section 230 now. It is always better to regulate yourself before you are regulated.

This is just the first of many steps the United States can take to restore civil society's trust in information. Although we missed the boat on rebuilding that trust before the coronavirus, we can prepare for the next disaster.