Advancing responsible practices for open source AI
Outcomes from the Partnership on AI and GitHub workshop.

Today, the Partnership on AI (PAI) published a report, Risk Mitigation Strategies for the Open Foundation Model Value Chain. The report provides guidance for actors building, hosting, adapting, and serving AI that relies on open source and other weights-available foundation models. It is an important step forward for responsible practices in the open source AI value chain.
The report is based on a workshop that GitHub recently co-hosted with PAI as part of our work to support a vibrant and responsible open source ecosystem. Developers build and share open source components at every level of the AI stack on GitHub, amounting to some 1.6 million repositories. These projects range from foundational frameworks like PyTorch, to agent orchestration software like LangChain, to models like Grok, to responsible AI tooling like AI Verify. Our platform and open data efforts work to make this innovation more accessible and understandable to developers, researchers, and policymakers alike.

We evaluate and periodically update our platform policies to encourage responsible development, and we recently joined the Munich Tech Accord to address AI risks in this year’s elections. We also work to educate policymakers on the practices, risks, and benefits of open source AI, including in the United States to inform implementation of the Biden Administration’s Executive Order and in the EU to secure an improved AI Act.
Reports like Risk Mitigation Strategies for the Open Foundation Model Value Chain are important resources to inform policy and practice. Policymakers often have a better understanding of vertically integrated AI stacks and the governance affordances of API access than of open source and distributed AI collaborations. Beyond beginning to consolidate best practices, the report delineates the open value chain (as pictured below) to give policymakers a clearer understanding of how roles and responsibilities are distributed in the creation of AI systems today. We look forward to continuing to support responsible open source development and informed AI policy.