GitHub scores high in content moderation report

See how GitHub protects users against online censorship.


Developers rely on GitHub to maintain a healthy, safe, and inclusive platform that promotes collaborative development. Sometimes this means we need to remove or restrict access to content that violates the law, our Terms of Service, or our Community Guidelines. It’s crucial for GitHub to be clear about how we moderate content and have an appropriate process in place to inform and protect users. Last week, the Electronic Frontier Foundation (EFF) released its annual “Who Has Your Back Report”, focusing this year on online censorship.

This year’s report assesses tech companies’ policies on certain aspects of content moderation and awards a star in a number of categories that change each year. The 2019 categories focus on transparency, notice, and appeals. As a top-scoring company, GitHub received five out of six possible stars. Read on for more context around the categories, and how GitHub approaches some of the issues raised in the report.

Of the 16 companies assessed this year, GitHub is one of only three to receive five or more stars. The report noted that GitHub and Reddit have “unique models of user-generated content and communities,” and that both “have managed to meet industry best practices while also adapting those best practices to the community models and types of content they host.”

It also recognized that we’re helping to raise the industry standard despite, or perhaps because of, our smaller teams relative to other companies:

Both Reddit and GitHub also employ small policy and content moderation teams relative to some of the other companies we assess this year. If they can achieve this outstanding level of transparency and accountability with regard to content takedowns and account suspensions, it’s reasonable to expect other companies to also meet that standard.

The report assessed companies’ content moderation policies in six categories:

  1. Transparent about legal takedown requests
  2. Transparent about platform policy takedown requests
  3. Provides meaningful notice
  4. Appeals mechanisms
  5. Appeals transparency
  6. Santa Clara Principles

The report excludes spam, phishing, and child exploitation imagery from its assessment in each category.


Transparent about takedown requests from governments

Categories one and two are about how companies handle requests to remove content, or “takedown requests,” from governments. The first category is about governments claiming a violation of a law, which the report refers to as “legal takedowns”. We explained in our 2018 Transparency Report:

In 2018, GitHub received nine requests—all from Russia—resulting in nine projects (all or part of three repositories, five gists, and one GitHub Pages site) being blocked in Russia.

GitHub takes this one step further. As the report shares, we also provide additional transparency for users by publicly posting takedown notices from governments.

The second category is about governments claiming a violation of our Terms of Service, which the report refers to as “platform policy takedown requests”. We also noted this in our transparency report:

GitHub received zero requests from governments to take down content as a Terms of Service violation.

To meet the report’s criteria for both categories, we explained how many requests we received, where they came from, and which ones we acted on.

Meaningful notice

In the notice category, the report looked at both government and Terms of Service-based takedowns and suspensions. To earn a star, we needed to publicly commit to “provide meaningful notice to users of every removal and suspension, unless prohibited by law, in very narrow and defined emergency situations, or if doing so would be futile or ineffective.” This includes identifying the specific content that allegedly violates the law or Terms of Service. In addition, where we geographically restrict or “geoblock” the scope of a government takedown, we needed to notify the affected user of the geographic scope of the takedown.

GitHub does all of the above. We also describe our practice of providing notice in our contribution to the United Nations free expression expert’s report on online content moderation.

Appeals

As we mentioned earlier, the report awards two stars for appeals: one for appeals mechanisms and one for appeals transparency. GitHub earned a star for appeals mechanisms because we allow users to appeal decisions. We aren’t currently set up to accurately and systematically report on appeals, which means we didn’t earn a star for appeals transparency. To earn that star, we need to:

regularly publish records of appeals and their aggregate outcomes, for instance in a transparency report. This should include, at a minimum, the information necessary to determine the number of appeals filed.

Support for the Santa Clara Principles

The last category is about supporting a set of principles for companies that host user-generated content. The Santa Clara Principles set a standard for how companies can give users meaningful processes and enforce their rules in a way that is “fair, unbiased, proportional, and respectful of users’ rights.” The criteria for the other categories in this report roughly mirror these principles.

As we note in GitHub’s 2018 Transparency Report:

transparency reporting has broadened as people pay more attention to companies’ practices on information disclosure and removal. One recent example is the Santa Clara Principles on Transparency and Accountability of Content Moderation Practices. We support the spirit of those principles and are working to align our practices with them as much as possible. Through our transparency reports, we’re continuing to shed light on our own practices, while also hoping to contribute to broader discourse on platform governance.

We know that the issues explored in this report cover only part of the broader content moderation landscape. GitHub is committed to keeping our platform safe, healthy, and inclusive while allowing developers to do their best work. We take a careful, thoughtful approach to moderating content, as we’ve shown with some examples in this post.

Content moderation is an evolving issue, with new laws, industry principles, industry-wide discussions, and government concerns that require a closer look at our content moderation policies and practices. We’re continually working to improve GitHub’s tracking, tooling, and other resources, and to adopt emerging best practices. Moving forward, we plan to continue collaborating with other platforms, developers, policymakers, and civil society to help set a high industry standard. We’d love to hear from you if you have ideas for us.


Working on your own best practices for online censorship or content moderation?
Read EFF’s report or contact the GitHub Policy Team via email or Twitter.
