Learning from EFF’s report on censorship and online platforms

This year's report from the EFF focuses on government takedown requests and how companies protect their users from unwarranted censorship.

The Electronic Frontier Foundation (EFF) publishes an annual “Who Has Your Back?” report to evaluate which companies defend their users when the government comes knocking. Since 2011, the report has focused on government requests for user information. This year, the report takes on a different topic: government requests to take down information—in other words, censorship on online platforms.

As background, EFF explains how the prevalence of HTTPS and mixed-use social media sites has made it harder for governments to censor content directly. As a result, governments are increasingly turning to online platforms to censor for them.

EFF used five criteria to rate how well companies (“some of the biggest online platforms that publicly host a large amount of user-generated content”) protect their users from unwarranted censorship:

  1. Transparency in reporting on government takedown requests based on legal violations
  2. Transparency in reporting on government takedown requests based on terms of service or other platform policy violations
  3. Meaningful notice to users
  4. Appeals process for users
  5. Limited geographic scope of takedowns

Based on EFF’s description of those criteria, GitHub meets each one. As we explain in our contribution to the UN free expression expert’s report on content moderation, we minimize censorship on our platform by providing transparency, notice, and appeals, and by limiting blocking geographically when we find a takedown unavoidable.

EFF observes in the report that companies that scored well “serve to provide examples of strong policy language for others hoping to raise the bar on content moderation policy” and that reviewing each other’s policies around content moderation “can serve as a guide for startups and others looking for examples of best practices.” A strong motivation behind open sourcing our policies is exactly that: contributing to industry best practices while offering examples to startups and others looking for them. We recognize that transparency—both in how we develop our policies and in how we moderate content—is essential to maintaining our community’s trust and our legitimacy as a platform.

We thank EFF for taking on online censorship in this year’s report. Get in touch with us through email or Twitter if you’re interested in collaborating to raise the standard among companies involved in online content moderation.
