How GitHub contributed to the Santa Clara Principles update
Platform responsibility, accountability, and transparency play an increasingly important role in society. Along these lines, GitHub was honored to contribute to the Santa Clara Principles on Transparency and Accountability in Content Moderation 2.0.
What are the Santa Clara Principles?
The Santa Clara Principles were first introduced in 2018 by a civil society coalition to set baseline standards for platforms’ content moderation practices informed by due process and human rights. GitHub is among 12 major companies that publicly support the principles, and we continue to thoughtfully assess and refine our own practices to ensure they are closely aligned with them.
The original Santa Clara Principles focused on using numbers, notice, and appeals as a way for platforms to ensure that their content moderation practices were “fair, unbiased, proportional, and respectful of users’ rights.” The second iteration takes a deeper dive into the expectations set out in the original operational principles and adds a new set of five overarching foundational principles, including integrity and explainability, cultural competence, and state involvement.
See the full report and read on to learn about the key takeaways for developer platforms and how we contributed.
GitHub’s contribution
GitHub’s contribution to the Santa Clara Principles update was informed by our human-centered, developer-first approach to content moderation and transparency. As we look toward the future of content policy and moderation, we are keenly aware, as a collaborative software development platform, that there is no one-size-fits-all approach. Our goal is to provide a safe, inclusive space for all developers to do their best work together. For us, that means being transparent, in both our policies and our moderation actions; taking the least restrictive approach when enforcing our policies; and standing up for developers’ rights. Accordingly, our contributions focused on clarity around policies, proportionality with respect to platform size and resources, transparency, and clear definitions of a few key terms.
Key takeaways for developer platforms
On cultural competence
This new foundational principle calls for ensuring that “those making moderation and appeal decisions understand the language, culture, and political and social context of the posts they are moderating.”
In our contribution, GitHub recommended:
“Given the global nature of the internet and of many user bases, it could be useful to recommend that moderators have language and/or cultural competency to moderate content of users from different regions, countries, or cultures than their own, as relevant to the nature of content on the platform and degree of risk to users.”
We also recommended that a platform be transparent when it has limited cultural competency. While cultural competence is a crucial principle for online platforms to take into account, it’s also important to note that a platform’s size and resources may affect its ability to achieve it.
On the Notice principle
In expanding the Notice principle, the Santa Clara Principles 2.0 stated:
“Companies must provide notice to each user whose content is removed, whose account is suspended, or when some other action is taken due to non-compliance with the service’s rules and policies, about the reason for the removal, suspension or action. Any exceptions to this rule, for example when the content amounts to spam, phishing or malware, should be clearly set out in the company’s rules and policies.”
In our contribution, we suggested that there may be circumstances where providing a detailed reason for a moderation action could be counterproductive, and where an exception would therefore be appropriate. This exception could apply to situations such as spam or phishing, or where user accounts appear to have been created solely for a purpose that violates a platform’s terms, because these users may not otherwise use their accounts again. We are pleased to see exceptions for some of those categories incorporated in the updated principles, although we believe all users should be provided with notice, even where this exception applies.
On state involvement
A particularly important new foundational principle, State Involvement in Content Moderation, stated:
“Companies should recognize the particular risks to users’ rights that result from state involvement in content moderation processes. This includes a state’s involvement in the development and enforcement of the company’s rules and policies, either to comply with local law or serve other state interests. Special concerns are raised by demands and requests from state actors (including government bodies, regulatory authorities, law enforcement agencies and courts) for the removal of content or the suspension of accounts.”
In responding to the question of whether there were “specific risks to human rights which the Santa Clara Principles could better help mitigate by encouraging companies to provide specific additional types of data,” we suggested that platforms publicly post the notices they act on when taking action against a user on the basis of a law. GitHub publishes the notices we process from governments seeking content removal under their local laws. We also report on actions taken for violations of our Terms of Service where the report originated with a government agency. Ultimately, we want our users to be informed, and we believe that transparency at this specific and ongoing level is essential to good governance.
On what’s next
GitHub is powered by a global community of developers collaborating on software. With this in mind, we take a collaborative approach to content moderation, evolving our policies in conversation with civil society, industry, and human rights frameworks. We are proud to have contributed to the update of the Santa Clara Principles and will continue lending the perspective of the open source software community to the development of platform policy standards.
The Santa Clara Principles 2.0 include toolkits for advocates and companies interested in spreading the word and incorporating the principles.
Follow GitHub Policy on Twitter for updates about the laws and regulations that impact developers.