As policymakers grapple with how to address hate speech and disinformation on the internet, they’re eyeing the legal structure underpinning collaborative software development: legal safe harbors. These safe harbors protect online platforms from liability for the user content they host; without them, countless user-generated content platforms like GitHub wouldn’t be here today.
GitHub and developers everywhere can once again help policymakers understand that collaborative software development relies on legal safe harbors, and that tinkering with those safe harbors threatens the reliability and stability of software and puts innovation at risk.
While the internet is a different place than it was 20 years ago, policymakers should not lose sight of the drivers and contours of the problems they are trying to address. Often there is an underlying problem that goes beyond what platforms can or should do to avoid liability. Where the online dimension is relevant, policymakers should recognize that platform liability is not the only variable, and instead focus on other levers that directly address the underlying problems. Where platform liability is directly relevant, policymakers should ensure that efforts to cut into legal safe harbors do not undermine open source software collaboration and innovation, which developers and businesses rely on and which is foundational to the world’s infrastructure today.
Internet in the 90s
Intermediary liability protection exists in the laws of many countries. Some of the earliest examples come from the U.S., particularly Section 230 of the Communications Decency Act (CDA) and Section 512 of the Digital Millennium Copyright Act (DMCA).
Section 230 and the DMCA both date back to the late 1990s. Back then, the internet was relatively new, and policymakers’ concerns about it centered on online pornography, defamation, and piracy. These laws are important to understand because they set a framework that has since spread around the world and has enabled internet innovation. To understand how and why, let’s get into some background.
CDA Section 230
Under the First Amendment to the U.S. Constitution, you’re not liable for distributing other people’s content unless you know or should have known it contains illegal content. But if you publish other people’s content, you’re subject to the same potential liability as the author. Then-Representatives Chris Cox and Ron Wyden introduced Section 230 as an amendment to the CDA after a court ruled that a company is liable for allegedly defamatory content it hosts (as a “publisher” of speech) if it moderates some content, but not if it avoids moderation altogether. This is known as the moderator’s dilemma: it creates an incentive not to moderate content at all, because you could be held legally responsible once you start trying. The U.S. Supreme Court later ruled that most of the CDA, which criminalized online speech that was “indecent” or “patently offensive” if it could be viewed by a minor, violated the First Amendment because it was overly broad and vague. Section 230 is the only part of the CDA that survived that ruling.
Section 230 added a safe harbor, which shields platforms from liability for user content that they host:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
even if they moderate that content:
No provider or user of an interactive computer service shall be held liable on account of—(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
As a result, platforms can moderate content without being liable as a “publisher or speaker” under U.S. law. Section 230 does include some exceptions, for example, for violations of federal criminal law or intellectual property law.
DMCA
Intellectual property does come into play, however, thanks to another law passed around the same time: the DMCA. The motivation for that law was online piracy, and its approach also includes a safe harbor that protects platforms from liability for copyright-infringing content that they host. That safe harbor, Section 512, sets out conditions that, if met, protect a platform from liability for copyright infringement. At the center of this approach is a notice and takedown process: a platform isn’t liable for copyright infringement in content it hosts if it “expeditiously” takes down that content when it receives a notice of copyright infringement that meets the requirements specified in the law. The law also outlines a counter notice process, where the person accused of copyright infringement in a DMCA takedown notice can appeal by responding with a counter notice and, if certain conditions are met, the platform will reinstate the content.
GitHub’s DMCA Takedown Policy explains the importance of this process to availability of content on the internet:
The DMCA provides a safe harbor for service providers that host user-generated content. Since even a single claim of copyright infringement can carry statutory damages of up to $150,000, the possibility of being held liable for user-generated content could be very harmful for service providers. With potential damages multiplied across millions of users, cloud-computing and user-generated content sites like YouTube, Facebook, or GitHub probably never would have existed without the DMCA (or at least not without passing some of that cost downstream to their users). The DMCA addresses this issue by creating a copyright liability safe harbor for internet service providers hosting allegedly infringing user-generated content. Essentially, so long as a service provider follows the DMCA’s notice-and-takedown rules, it won’t be liable for copyright infringement based on user-generated content.
Some very smart lawyers answered common questions about the DMCA’s safe harbor, if you want to learn more.
The DMCA and CDA Section 230 have stood the test of time (in internet years), remaining virtually intact—with rare and contested exceptions.
What’s happening now
As the internet has become more widespread, it’s not surprising that objectionable activity has increasingly moved online. A lot of attention has turned to whether tech companies can or should do more to address, if not solve, these problems. Because in many cases the objectionable content or activity originates with users, the legal safe harbors have done their job of (generally speaking) putting the liability for that content or activity on the person who posts it.
Yet, as bad things continue to happen online, the conversation is increasingly turning to whether policymakers should revisit the scope of these legal safe harbors so that platforms would take on more responsibility for the content that their users post.
How you can contribute
Developers have helped policymakers understand, in tangible terms, the risk of removing the legal safe harbor for software developers. For example, during negotiations of the EU Copyright Directive, which contemplated requiring platforms to proactively filter content in order to preserve their safe harbor, developers explained:
Software projects are often made up of hundreds of dependencies too, which can be licensed in different ways. When a filter catches a false positive and dependencies disappear, this not only breaks projects—it cuts into software developers’ rights as copyright holders too.
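To make the dependency point concrete, here is a minimal sketch (using hypothetical project and package names) of how a single false-positive takedown can ripple through everything that depends on the removed package:

```python
# A minimal sketch (hypothetical project and package names) of why a single
# false-positive takedown ripples outward: any project that depends, directly
# or transitively, on the removed package no longer resolves.

DEPENDENCIES = {
    "web-app": ["http-client", "templating"],
    "http-client": ["url-parser"],
    "templating": [],
    "url-parser": [],
}

def resolves(project: str, available: set[str]) -> bool:
    """Return True only if a project and all of its transitive
    dependencies are still available on the hosting platform."""
    if project not in available:
        return False
    return all(resolves(dep, available) for dep in DEPENDENCIES.get(project, []))

available = set(DEPENDENCIES)
print(resolves("web-app", available))   # True: everything is in place

# An automated filter wrongly flags "url-parser" and it is taken down.
available.discard("url-parser")
print(resolves("web-app", available))   # False: every dependent project breaks
```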
That input led to a more nuanced outcome than the original legislative proposal, one that protects software development platforms from losing their safe harbor under that directive.
As we continue to see proposals to cut into platforms’ safe harbors, developers have another opportunity to contribute to these debates by urging a nuanced approach that does not interfere unnecessarily with developers’ freedoms to build, create, and innovate.
Look out for our next post, where we’ll dive into some current examples from around the world, including ones related to CDA Section 230 and DMCA Section 512, and how you can contribute to those discussions.
Written by
I'm GitHub's Director of Platform Policy and Counsel, building and guiding implementation of GitHub’s approach to content moderation. My work focuses on developing GitHub’s policy positions, providing legal support on content policy development and enforcement, and engaging with policymakers to support policy outcomes that empower developers and shape the future of software.