Earlier this month, we shared our contribution to a report about content moderation and free expression written by David Kaye, the United Nations Special Rapporteur on freedom of expression. That report is now available.
See the full report
While the report focuses on social media platforms that see large volumes of hate speech and misinformation, use automation to moderate content, and receive government takedown requests based on their Terms of Service, many of Kaye’s points are relevant to GitHub’s users.
For example, on how companies should respond to government takedown requests, the report cites GitHub’s contribution, stating:
Companies should ensure that requests are in writing, cite specific and valid legal bases for restrictions and are issued by a valid government authority in an appropriate format.
At GitHub, when we receive a government takedown request, we confirm:
- that the request came from an official government agency;
- that the official sent an actual notice identifying the content; and
- that the official specified the source of illegality in that country.
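As a purely hypothetical sketch (the names and data model here are our illustration, not GitHub’s actual tooling), that three-part review could be expressed as a simple intake check:

```python
from dataclasses import dataclass, field

@dataclass
class TakedownRequest:
    """Hypothetical record of an incoming government takedown request."""
    agency_verified: bool                 # confirmed to come from an official government agency
    content_urls: list[str] = field(default_factory=list)  # content identified in the notice
    legal_basis: str = ""                 # the cited source of illegality in that country

def intake_problems(request: TakedownRequest) -> list[str]:
    """Return whichever of the three checks above the request fails."""
    problems = []
    if not request.agency_verified:
        problems.append("not from a verified official government agency")
    if not request.content_urls:
        problems.append("notice does not identify the specific content")
    if not request.legal_basis:
        problems.append("no source of illegality in that country specified")
    return problems
```

A request would only move forward once `intake_problems` comes back empty.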
Here are some other relevant points from the report.
Use human rights law as the standard
The report’s top recommendation to companies is to recognize human rights law as “the authoritative global standard for ensuring freedom of expression on their platforms.” As we note in our contribution:
GitHub promotes the freedom of expression in our policies and in our application of those policies to specific cases, consistent with international human rights law’s articulation of the right to freedom of expression and its limitations in the International Covenant on Civil and Political Rights (ICCPR).
The ICCPR allows limitations on free expression when provided by law and necessary, including for the respect of others’ rights or where the content constitutes harassment, abuse, threats, or incitement to violence against others.
Engage users and be transparent in rulemaking
The report calls for companies to provide opportunities for public input and engagement, and to use transparent rulemaking processes. We develop the rules on our platform collaboratively with our community.
Restrict content as narrowly as possible
The report notes that companies can develop “tools that prevent or mitigate the human rights risks” caused when national laws or demands are inconsistent with international standards.
In these situations, we look for ways to comply that are the least restrictive of human rights: for example, asking the user to remove part of a repository instead of blocking the entire repository, and geo-blocking content only in the jurisdiction where it is illegal.
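To make the geo-blocking approach concrete, here is a minimal sketch, with placeholder names and data, of restricting content only for viewers in the jurisdiction where it is illegal rather than removing it for everyone:

```python
# Map restricted content to the ISO country codes where a valid legal
# order applies; everywhere else the content stays available.
BLOCKED_IN = {
    "example-user/example-repo": {"XX"},  # placeholder repository and country
}

def is_visible(repo: str, viewer_country: str) -> bool:
    """Serve the content unless the viewer is in a blocking jurisdiction."""
    return viewer_country not in BLOCKED_IN.get(repo, set())

assert is_visible("example-user/example-repo", "US")      # visible elsewhere
assert not is_visible("example-user/example-repo", "XX")  # blocked locally
```

The design point is that the restriction’s scope matches the legal order’s scope: one jurisdiction, not the whole platform.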
Avoid laws that require preventive monitoring or filtering of content
One of Kaye’s five recommendations for governments responds to a concern we raised in our contribution. We explained that measures like the European Union’s proposal to require upload filters for copyright infringement “are overly broad in their scope and, as applied to GitHub and our user community, could be so cumbersome as to prevent developers from being able to launch their work.”
The report noted that “automated tools scanning music and video for copyright infringement at the point of upload have raised concerns of overblocking,” and made this recommendation:
States and intergovernmental organizations should refrain from establishing laws or arrangements that would require the “proactive” monitoring or filtering of content, which is both inconsistent with the right to privacy and likely to amount to pre-publication censorship.
This recommendation is especially timely, as the European Parliament’s Legal Affairs Committee is set to vote on the proposal on June 21.
Thanks to Special Rapporteur Kaye for his in-depth study of how human rights principles apply to content moderation on online platforms.
If you’d like to participate in the development of our policies, watch our site-policy repository and look for posts announcing new policies for public comment on our policy blog, or follow our Policy Twitter account. To follow takedowns in real time, watch our gov-takedowns and DMCA repositories.
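If you’d rather follow those repositories programmatically than through the web interface’s Watch button, a small script against the public GitHub REST API can do it; polling recent commits is our own suggestion here, not an official GitHub integration. Each commit to gov-takedowns typically adds a new notice:

```python
import requests  # third-party HTTP client: pip install requests

def recent_notices(repo: str = "github/gov-takedowns", count: int = 5) -> list[str]:
    """Fetch the latest commit messages from a takedown-notice repository."""
    response = requests.get(
        f"https://api.github.com/repos/{repo}/commits",
        params={"per_page": count},
        headers={"Accept": "application/vnd.github+json"},
        timeout=10,
    )
    response.raise_for_status()
    return [c["commit"]["message"] for c in response.json()]

if __name__ == "__main__":
    for message in recent_notices():
        print(message.splitlines()[0])  # first line of each commit message
```

The same call works for the DMCA repository by passing `repo="github/dmca"`.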