How GitHub protects developers from copyright enforcement overreach
Why the U.S. Supreme Court case Cox v. Sony matters for developers, plus updates to our Transparency Center and Acceptable Use Policies.

Some platforms handle copyright takedowns with a “three strikes and you’re out” policy, automatically blocking user accounts after repeated notices. While that might sound fair, it can lead to unjust outcomes, especially for open source developers who routinely fork popular projects that could become subject to a copyright takedown. If your account gets suspended, that can have immediate negative consequences not just for you, but for all the open source projects you maintain. Code is collaborative, complex, and often reused, which makes enforcement challenging, and amplifies the downstream effects of takedowns. At GitHub, we take a developer-first approach, reviewing each case individually before blocking accounts, and making sure copyright enforcement doesn’t derail legitimate work.
The U.S. Supreme Court is about to decide a case that could change how platforms like GitHub handle copyright claims — and by extension, it could also impact how you build and share code. In Cox Communications v. Sony Music Entertainment, the question is: When can an internet service provider or platform be held liable for copyright infringement committed by its users? Google, Microsoft (with GitHub), Amazon, Mozilla, and Pinterest have urged the Court to adopt a clear rule: Liability should only apply when there’s “conscious, culpable conduct that substantially assists the infringement,” not merely awareness or inaction.
This matters to developers because the platforms you depend on to host, share, and deploy code rely on legal protections called safe harbors to avoid constant liability for user-generated content. One of the most important safe harbors is Section 512 of the Digital Millennium Copyright Act (DMCA), which shields services from copyright infringement liability as long as they follow a formal notice-and-takedown process. For GitHub, this protection is especially critical given the collaborative nature of open source, the functional role of code, and the ripple effects of removing code that may be widely used.
With over 150 million developers and 518 million projects on GitHub, we process hundreds of DMCA takedowns each month, but also receive thousands of automated, incomplete, or inaccurate notices. If “awareness” alone were enough for liability, platforms could be forced to over-remove content based on flawed notices — chilling innovation and collaboration across the software ecosystem. GitHub’s DMCA Takedown Policy supports copyright protection while limiting disruption for legitimate projects, offering a clear path for appeal and reinstatement, and providing transparency by publishing valid DMCA takedown notices to a public DMCA repository.
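As a concrete illustration of that transparency commitment, here is a minimal sketch of how a researcher might list the notices GitHub publishes to its public DMCA repository (github/dmca) using the GitHub REST API. The year-based directory layout is an assumption about how the repository organizes its files; check the repo itself for the actual structure.

```python
# Minimal sketch: list notices published in GitHub's public DMCA repository
# (github/dmca) via the GitHub REST API. Assumes notices are grouped into
# top-level directories named by year.
import requests

API = "https://api.github.com/repos/github/dmca/contents"

def list_notices(year: int) -> list[str]:
    """Return file names of takedown notices published for a given year."""
    resp = requests.get(
        f"{API}/{year}",
        headers={"Accept": "application/vnd.github+json"},
    )
    resp.raise_for_status()
    # The contents API returns a list of entries; keep only files.
    return [item["name"] for item in resp.json() if item["type"] == "file"]

if __name__ == "__main__":
    for name in list_notices(2025)[:10]:
        print(name)
```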
This case matters to GitHub as a platform and to all developers who use internet service providers to create and collaborate. We’re in good company: the Supreme Court docket for the case includes amicus briefs from a wide range of civil society stakeholders, including Engine Advocacy, the Electronic Frontier Foundation, and Public Knowledge, advocating for free expression, the open internet, and the common-sense limitations on liability that make the modern internet possible. We will continue to monitor the case as it moves forward and remain committed to advocating on behalf of software developers everywhere.
Updates to our Transparency Center
An important aspect of our commitment to developers is our approach to developer-first content moderation. We try to restrict content in the narrowest way possible to address violations, give users the chance to appeal, and provide transparency around our actions.
We’ve updated the GitHub Transparency Center with the first half of 2025 data, which includes a repo of structured data files. In this latest update, we wanted to clarify how we report and visualize government takedowns.
Here’s what we changed:
- We have combined government takedowns received based on local law and those based on Terms of Service into a single chart and reporting category, Government takedowns received. We made this change to be more accurate in our reporting: the government takedown requests we receive may cite a local law or a Terms of Service violation, but more typically they are simply official requests for the removal of content.
- We have retained the separate categories of “Government takedowns processed based on local law” and “Government takedowns processed based on Terms of Service.” This is an important distinction because it reflects that some content governments ask us to take down is in violation of our Terms and is processed like any other report, whereas some content is not in violation of our terms, but is in violation of local law. In the latter case, we limit the impact on developers by only restricting access to the content in the jurisdiction in which we are legally required to do so, and we publish the request in our gov-takedowns repository to ensure transparency.
- We have also clarified the README of our gov-takedowns repository to note that the repository solely contains official government requests that led to content removal based on local law.
These are small clarifications, but being clear and accurate with the data we share matters so that researchers studying platform moderation and government suppression of information can rely on it. If that applies to you, and you have feedback on our reporting, research to share, or reporting categories you would find useful, open an issue in our transparency center repo.
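For researchers who want to work with the structured data files mentioned above, here is a minimal sketch of loading one of them for analysis. The repository path and file name below are assumptions for illustration; consult the transparency center repo's README for the actual layout of its data files.

```python
# Minimal sketch: load one of the Transparency Center's structured data files.
# The raw URL, directory, and file name are hypothetical placeholders; check
# the transparency center repo for the real file layout before running.
import pandas as pd

RAW = "https://raw.githubusercontent.com/github/transparency-center/main"

# Hypothetical file illustrating one reporting category,
# e.g. government takedowns received.
df = pd.read_csv(f"{RAW}/data/government-takedowns-received.csv")

# Inspect the columns and first few rows before building an analysis on top.
print(df.columns.tolist())
print(df.head())
```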
Updates to our Acceptable Use Policies
We have opened a pull request and 30-day notice-and-comment period for a proposed update to our Acceptable Use Policies (AUP), which would reorganize several existing AUP provisions into separate policies with additional guidance. The new policies include:
- Child Sexual Abuse Material (CSAM)
- Terrorist & Violent Extremist Content (TVEC)
- Non-Consensual Intimate Imagery
- Synthetic Media and AI Tools
The Synthetic Media and AI Tools policy will be extended to explicitly disallow CSAM and TVEC in accordance with international laws. Read more about our approach to deepfake tools.
We invite all stakeholders to review and comment on the proposed Acceptable Use Policy additions for the 30-day period until October 16.