From fake news to copyright infringement, content moderation—and who should do what to address it—is all over the news and policymaking arenas. Although we are a platform that primarily hosts code uploaded by developers, many of those discussions are relevant to GitHub.
Earlier this year, the United Nations Special Rapporteur on the right to freedom of opinion and expression, David Kaye, visited GitHub’s headquarters to discuss how content moderation on our platform affects free expression. His visit was part of his research for a report he will present to the United Nations Human Rights Council in June. To gather views from governments, companies, and others, Special Rapporteur Kaye issued a call for written submissions with questions on topics ranging from how companies handle takedown requests to what role automation plays (and should play) in content moderation.
In GitHub’s response to the Special Rapporteur’s questions:
- We walk through our processes for handling takedown requests (government takedowns and copyright infringement notices under the Digital Millennium Copyright Act (DMCA)) and we describe how we work to reduce abuse on our platform without unnecessarily chilling speech. For instance, we geo-block content that is illegal only in certain countries rather than removing it globally, and we consider fair use in handling DMCA takedown notices.
- We highlight how we promote transparency, for example by involving our community in the development of the policies that govern use of our platform and by posting takedown notices in public repos in real time. We explain that users can appeal removal of content and that we’ll provide reasons for our decision.
- We note that our approach is consistent with international human rights law—specifically Articles 19 and 20 of the International Covenant on Civil and Political Rights, which establish the right to free expression and prohibit propaganda for war and incitement to hatred. We also explain that we designed our Community Guidelines to protect the interests of marginalized groups and encourage users to respect each other.
- Finally, we explain that we open source our site policies (we’re GitHub, after all!) and hope that our approach gets recognized as a best practice that other platforms adopt.
Contributing to Special Rapporteur Kaye’s report is one way we’re working to define and build on best practices for platform moderation. We also directly participate in the discourse about content moderation, for example at last week’s Content Moderation Summit and this week at RightsCon. In addition, we continue to advocate for approaches to content moderation that promote transparency and free expression while limiting abuse.
We thank the Special Rapporteur for his thoughtful attention to this timely issue and we look forward to reading his report!