GitHub’s developer-first approach to content moderation
GitHub puts the needs of developers at the core of our content moderation policies. Learn more about our approach and how you can contribute.
At GitHub, developers and their interests are at the center of everything we do—including when it comes to moderating content. To maintain a safe, healthy, and inclusive place for software collaboration, we need to ensure our approach to addressing illegal or harmful content suits our developer community. We’re committed to making our content-moderation approach transparent to our users. At the core of our approach, we:
- Put developers at the center of moderating their own projects: We encourage maintainers to create a code of conduct for their projects, and we build tools that enable them to moderate their community’s content and conduct in their projects.
- Are fair, empathetic, and transparent in our enforcement actions: We protect developers’ rights. If we need to take action on someone’s content, we want them to understand why.
- Optimize for code collaboration: Because developers rely on each other’s code, when we see a need to act, we apply the least restrictive fix. That way, we minimize the disruptions to collaboration on legitimate content while addressing abuse or other violations on our platform.
We’ve shared our approach with policymakers, at conferences with others in the tech community, and in the update to the Santa Clara Principles on Transparency and Accountability, which set out minimum standards for platforms when moderating users’ content. As we await the launch of the updated Santa Clara Principles this December, we wanted to share more about our distinctive approach to content moderation.
Developers primarily collaborate on software projects in repositories on GitHub. Owners and maintainers of a repository play an important role in moderating content and activity there. Community members are often best positioned to know what is in scope for their projects, especially since a violation may involve community norms specific to the project.
To help support community moderation, we encourage owners and maintainers to communicate expectations, moderate content and conduct, and set permissions that let others help moderate:
- Communicate expectations: Owners and maintainers should use the README file to describe their project and communicate expectations for collaborating on the project. A CONTRIBUTING file can help explain how people can engage. We also recommend they post a CODE_OF_CONDUCT file in their repo to specify what kinds of contributions or participation are—and are not—allowed there. When someone fails to comply with the expectations set in those docs, an owner or maintainer can point to them as a clear basis for moderating the content or the user’s access.
- Moderate content and conduct: Content moderation can be hard work and takes time away from developing software. We’ve built tools to help make it quicker and easier, including the ability to block a user, lock or limit a conversation, and moderate comments.
- Set permissions to allow others to help moderate: Repository owners can also share the moderation with other trusted collaborators by granting them specific moderation privileges.
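Several of the moderation actions above are also exposed through GitHub’s REST API. As one illustration, locking a conversation on an issue corresponds to the `PUT /repos/{owner}/{repo}/issues/{issue_number}/lock` endpoint. The sketch below builds such a request; the repository, issue number, and token are placeholders, not real values.

```python
# Sketch: lock a heated issue conversation via GitHub's REST API.
# Endpoint: PUT /repos/{owner}/{repo}/issues/{issue_number}/lock
# The owner/repo/issue values and token here are placeholders.
import json
import urllib.request

API_ROOT = "https://api.github.com"

# Lock reasons the API accepts for the lock_reason field.
LOCK_REASONS = {"off-topic", "too heated", "resolved", "spam"}

def build_lock_request(owner: str, repo: str, issue_number: int,
                       reason: str, token: str) -> urllib.request.Request:
    """Build (but do not send) the PUT request that locks an issue."""
    if reason not in LOCK_REASONS:
        raise ValueError(f"invalid lock reason: {reason!r}")
    url = f"{API_ROOT}/repos/{owner}/{repo}/issues/{issue_number}/lock"
    body = json.dumps({"lock_reason": reason}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="PUT",
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {token}",
        },
    )

# Example (request is constructed, not sent):
req = build_lock_request("octocat", "hello-world", 42,
                         "too heated", "<personal-access-token>")
# urllib.request.urlopen(req) would perform the call; a 204 response
# indicates the conversation was locked.
```

Separating request construction from sending keeps the sketch runnable without credentials; in practice a maintainer would more likely use the web UI or the `gh` CLI.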
While community moderation is an integral element of our platform, GitHub staff may also need to moderate content and activity on our platform when it violates our Terms of Service.
GitHub’s content moderation approach is grounded in international human rights law, including the rights to free expression, association, and assembly. Those rights come with limitations, such as for hate speech, which correspond to restrictions in our Acceptable Use Policies under our Terms of Service. Our content and conduct restrictions include discriminatory content, doxxing, harassment, sexually obscene content, inciting violence, disinformation, and impersonation.
To protect and promote those rights, we take a human rights-based approach to content moderation. This means that we:
- take the least restrictive approach to moderating content
- give users the chance to appeal
- provide transparency around our actions
In our transparency reporting about user information disclosure and content removal, we explain how we restrict content in the narrowest way possible to address the violation. For example, we take action at the project (repository) level, rather than across an entire account, where that is enough to address the violation.
We also describe our approach to appeals and reinstatements as a “key component of fairness” in our content moderation process. We allow users the opportunity to refute and/or address violations to get their accounts or content reinstated.
At its core, GitHub is a code collaboration platform where a lot of text and files are software code. Even images and other files often relate to a software project. This makes context particularly important. When people collaborate, they depend on the ongoing availability of others’ work. The nature of software collaboration and open source, in particular, is that one project will often link to and incorporate other projects. As a result, when GitHub restricts content, we need to think about how that will affect a potentially complicated web of interdependencies across the platform. This means we need to be judicious when we remove or otherwise moderate content, which is why we restrict as little content as possible, allow users to appeal, and stay transparent about our reasons.
Across the globe, developers have a key opportunity to contribute to policy proposals about platform responsibility. Online collaboration platforms for software development differ in many ways from the platforms that policymakers are targeting. To be effective, content moderation rules will need to recognize that there are many kinds of online platforms and to reflect their differences.
While GitHub’s focus is on code collaboration with community moderation, much of the content on our platform is user generated. As a result, we are often swept into regulations written for other kinds of companies, such as those focused on social media, which can have unintended consequences for developers. Developers can help policymakers understand those impacts and the need to take developers into account. Let us know if you’d like help analyzing a particular proposal and thinking through discussion points that help policymakers better understand the implications for you as a developer. You can get in touch with us on Twitter and via our repository.