Must Carry Reforms Won’t Fix the Internet, But They Could Destroy It

Last year, the twenty-six words that created the Internet were threatened by twenty-six proposals to destroy it. This year will be no different. Congress has made it clear it's willing to do something about Section 230. That "something" could look like Rep. DesJarlais' (R-TN) Protecting Constitutional Rights from Online Platform Censorship Act, another "must-carry" bill that aims to punish websites for their content moderation efforts.

In its current form, Section 230 says websites (and users) are not liable for third-party content, including their decisions about that content. While such decisions inarguably fall squarely within the editorial protections afforded by the First Amendment, Section 230 serves as a reliable fast lane to that inevitable result, letting websites resolve meritless suits quickly and cheaply. Since Section 230's enactment, market entrants and small online communities have come to rely on that simple yet vital guarantee.

Rep. DesJarlais' bill would rescind that guarantee, creating a private right of action against any website that restricts access to "protected material." The bill defines protected material as "material that is protected under the Constitution or otherwise protected under Federal, State, or local law." This definition is confusing. What does it mean for content to be protected under state law but not under the Constitution (or vice versa)? Or, as Prof. Eric Goldman noted, if a state law "protecting" material has been ruled unconstitutional, does removing that material still trigger liability?

Regardless, the overall point of these "must-carry" reforms remains the same: websites must carry any and all First Amendment-protected speech. It sounds great in theory, especially for zealous speech advocates. But in practice, it's a boon for online trolls. To illustrate, consider the following content moderation examples.

Content Case Study 1: Zeran v. AOL

Six days after the infamous and tragic Oklahoma City bombing, an Internet troll using the AOL screen name "Ken ZZ03" posted an appalling message to an AOL message board, offering T-shirts and merchandise emblazoned with slogans such as "Visit Oklahoma…It's a BLAST!!!," "Putting the kids to bed…Oklahoma 1995," and "McVeigh for President 1996." All AOL users had to do was "just call Ken." The post listed a phone number belonging to Ken Zeran. The post went viral after an Oklahoma radio station broadcast the advertisement to its audience. But Ken Zeran wasn't selling horrific Oklahoma bombing T-shirts. He wasn't even an AOL user. Ken Zeran was a local Seattle artist, in for the absolute worst day of his life. Unable to track down "Ken ZZ03," Zeran went after AOL instead.

Zeran v. AOL was the first case to interpret Section 230. Though technological issues significantly delayed AOL's removal of the offensive posts, the court ultimately held that under Section 230, AOL wasn't liable for the third-party content posted on its service. The outcome was disappointing for plaintiff Ken Zeran, but essential for the modern web.

Section 230 opponents hold Zeran out as a cautionary tale, suggesting that websites have no incentive to moderate awful content. Yet today we see countless examples of websites stepping up their content moderation efforts. Robust Trust and Safety teams are integral to retaining users and appeasing advertisers. As a result, the technology for quickly and effectively responding to abuse is steadily advancing.

But must-carry amendments to Section 230 would, in many ways, roll back the innovative strides websites have made since the AOL era. Today, AOL would likely have the technical capability to respond much more quickly to abusive posts. But these days, the constraints on content moderation are sometimes not so much technological as they are regulatory.

In Zeran's case, T-shirts mocking a national tragedy are protected expressive speech. Though the posts were offensive and unacceptable to the majority of AOL's customers, must-carry laws would discourage their removal. When it comes to doxing, the constitutional questions aren't so clear. The First Amendment carves out an exception for true threats. But is posting someone's phone number alongside terrible T-shirts a true threat? Probably not.

Under Section 230 (and the First Amendment), none of these considerations matter. Websites can and do remove awful but lawful posts without the burden of potential liability. But under must-carry laws, websites might feel compelled to preserve such content, an antisocial and disastrous outcome for online communities.

Content Case Study 2: Niche Websites

Despite the many websites and small communities that make up the World Wide Web, must-carry drafters typically have their sights set on "big tech." In reality, any website that invites user-generated content, like Wikipedia, relies on content moderation to keep abusive content at bay.

For example, consider a niche website like AllTrails. AllTrails provides users with crowd-sourced suggestions for local hiking, biking, and running trails. In addition, AllTrails users can provide reviews and insights about the trails. Importantly, these insights might include whether the trail is snowed in, dangerous for kids, heavily trafficked (a COVID-19 concern), or easy to get lost on. For many trekkers, these reviews can make or break an outing.

Imagine, then, how dangerous false and malicious reviews could be for anyone relying upon them. An absurd example is the notorious Mount Everest. The AllTrails Everest page is filled with preposterous joke reviews describing the trail as easy and kid-friendly. While these fake reviews are probably obvious to anyone visiting the page, similar reviews might not be so glaring for more obscure trails and peaks.

Though constitutionally permissible, reviews that purposely misguide hikers could very well mean the difference between life and death. It's in AllTrails' best interest to restrict access to such reviews, but under a "must-carry" regime, doing so could expose the site to liability.

Content Case Study 3: The Capitol Riots

Lastly, consider the atrocities that took place at our Nation's Capitol last month. The riots were quickly followed by a social media "Red Wedding," as prominent services took a stand against QAnon, insurrectionists, and even the former President of the United States, Donald Trump. Though a recent Harvard study suggests mass media outlets like Fox News are primarily to blame for the rampant spread of mis- and disinformation that brought us to this point, many point to social media as the ultimate catalyst. If that's the case, imagine if social media sites like Twitter had chosen not to act on January 6th, allowing Trump to incite even more violence and insurrection than we saw on the Capitol steps. What then could have occurred on Inauguration Day?

Worse, what if social media websites like Twitter couldn’t act because of must-carry laws like Rep. DesJarlais’ bill? As Berin Szóka noted, thanks to the First Amendment, it’s almost impossible to prosecute anyone for inciting violence online. And though misinformation and disinformation are indeed awful, such content is probably legal.

In Conclusion: Must Carry Is Not the Solution

In the coming months, we're sure to see more calls and proposals to amend Section 230. Many of those proposals will likely require websites to carry constitutionally protected content, just as Rep. DesJarlais' bill intends. Such proposals, and their drafters, seem to gravely misunderstand the First Amendment, the Internet, and the critical yet precarious balance that Section 230, in its current form, strikes between protecting the public and preserving free expression online.

While must-carry proposals might sound reasonable, in practice they're impossible to implement sensibly as long as trolls exist. Of course, there is one way a must-carry regime could work: websites could shut down user-generated content altogether. Perhaps that's what Congress wants. Is that what you want?

 

Jess Miers is an Internet Law & Policy Scholar, Research Assistant, and third-year law student at Santa Clara University School of Law.

 

Suggested citation: Jess Miers, Must Carry Reforms Won’t Fix the Internet, But They Could Destroy It, JURIST – Student Commentary, February 1, 2021, https://www.jurist.org/commentary/2021/02/jess-miers-section-230-must-carry/.


This article was prepared for publication by Tim Zubizarreta, JURIST’s Managing Editor. Please direct any questions or comments to him at commentary@jurist.org.


Opinions expressed in JURIST Commentary are the sole responsibility of the author and do not necessarily reflect the views of JURIST's editors, staff, donors or the University of Pittsburgh.