Section 230: The Clash of Content Moderation and Free Speech

Should social media be held responsible for spreading harmful content?

Ellie S
3 min read · Dec 30, 2020

The Internet sector accounts for about 10 percent of U.S. GDP, contributes $2.1 trillion to the nation’s economy, and supports more than 18 million jobs. This vibrant sector boomed in the U.S. — far more than in European nations — primarily because of the legal protections granted by Section 230 of the Communications Decency Act (CDA).

Put simply, Section 230 states that online service providers and intermediaries — such as YouTube, Facebook, Twitter, or any online publisher — cannot be held legally responsible for what users say or do on their platforms. Without Section 230, platforms could be sued over content users publish on their websites and would therefore be pushed to aggressively moderate or censor content to avoid legal trouble. Section 230 immunity is not absolute, however: the law does not shield platforms from prosecution under federal criminal statutes, nor does it protect them from liability for content they create themselves.

By removing the burden of legal responsibility for users’ content, the CDA helped online service providers innovate and expand rapidly. In recent years, however, with misinformation and harmful content spreading rapidly via the Internet…

Written by Ellie S

Artificial Intelligence (AI) Engineer | Computer Scientist interested in tech for social good and technology policy