GitHub puts the needs of developers at the core of our content moderation policies. Learn more about our approach and how you can contribute.
At GitHub, developers and their interests are at the center of everything we do, including content moderation. To maintain a safe, healthy, and inclusive place for software collaboration, we need to ensure our approach to addressing illegal or harmful content suits our developer community. We're committed to making our content moderation approach transparent to our users.
We’ve shared our approach with policymakers, at conferences with others in the tech community, and in the update to the Santa Clara Principles on Transparency and Accountability, which set out minimum standards for platforms moderating users’ content. As we await the launch of the updated Santa Clara Principles this December, we wanted to share more about our distinctive approach to content moderation.
Developers primarily collaborate on software projects in repositories on GitHub. Owners and maintainers of a repository play an important role in moderating content and activity there. Community members are often best positioned to know what is in scope for their projects. This is especially true when a violation involves community norms specific to the project.
To help support community moderation, we encourage owners and maintainers to communicate expectations, make use of permissions, and actively moderate their projects.
While community moderation is an integral element of our platform, GitHub staff may also need to moderate content and activity when it violates our Terms of Service.
GitHub’s content moderation approach is grounded in international human rights law, including the rights to free expression, association, and assembly. Those rights carry limitations for things like hate speech, which correspond to restrictions in our Acceptable Use Policies under our Terms of Service. Our restrictions on content and conduct cover discriminatory content, doxxing, harassment, sexually obscene content, incitement to violence, disinformation, and impersonation.
To protect and promote those rights, we take a human rights-based approach to content moderation.
In our transparency reporting about user information disclosure and content removal, we explain how we restrict content in the narrowest way possible to address the violation. For example, we take action at the project (repository) level, rather than across an entire account, where that is enough to address the violation.
We also describe our approach to appeals and reinstatements as a “key component of fairness” in our content moderation process. We give users the opportunity to refute or address violations and get their accounts or content reinstated.
At its core, GitHub is a code collaboration platform where much of the text and many of the files are software code. Even images and other files often relate to a software project. This makes context particularly important. When people collaborate, they depend on the ongoing availability of others’ work. The nature of software collaboration, and of open source in particular, is that one project will often link to and incorporate other projects. As a result, when GitHub restricts content, we need to think about how that will affect a potentially complicated web of interdependencies across the platform. This means we need to be judicious when we remove or otherwise moderate content, which is why we restrict as little content as possible, allow users to appeal, and are transparent about our reasons.
Across the globe, developers have a key opportunity to contribute to policy proposals about platform responsibility. Online collaboration platforms for software development differ in many ways from the platforms that policymakers are targeting. To be effective, content moderation rules will need to recognize that there are many kinds of online platforms and to reflect their differences.
While GitHub’s focus is on code collaboration with community moderation, much of the content on our platform is user-generated. This means we are often swept into regulations aimed at other kinds of companies, such as those focused on social media, which can have unintended consequences for developers. Developers can help policymakers understand those impacts and the need to take developers into account. Let us know if you’d like help looking at a particular proposal and thinking through discussion points to help policymakers better understand the implications for you as a developer. You can get in touch with us on Twitter and via our repository.