
How to square content moderation with the First Amendment

Legal Developments

These days, social media platforms constitute a large portion of the public sphere. By some measures, Facebook is the single largest news source for Americans. Twitter is a central platform for communication between political leaders and their constituents. Unlike the physical town square of old, however, today’s digital public sphere is controlled by large corporations. Unsurprisingly, First Amendment questions have emerged in this new public sphere. Most online tech giants have shielded themselves from civil liability under Section 230 of the 1996 Communications Decency Act, which essentially separates the platform from the poster of content. Nevertheless, major platforms now engage in extensive “content moderation” to stay within the bounds of the law, removing a wide array of content such as hate speech and mis- and disinformation (i.e., “fake news”).

Actors across the political spectrum worry that tech giants now possess vast and largely unchecked discretionary power over speech, out of step with democratic norms and ideals. Some have called for online platforms to be classified as state actors performing a public function so that they may be subjected to First Amendment jurisprudence, although such an outcome seems unlikely. Others say they should be regulated like public broadcasters. Finally, some have suggested that a user-controlled “toggle” feature might best reconcile content moderation with First Amendment concerns. Much like the “adult” and “family” settings already in wide use, a toggle would let users choose at will between moderated and unmoderated content from web platforms.

Meanwhile, the European Union has shown the most willingness to tax, fine, and regulate the (almost exclusively US) tech companies over the last few years, and it appears likely that its regulatory bodies will continue to take the lead in shaping many of the rules that govern the conduct of the digital public sphere’s masters in Silicon Valley.
