The EU AI Act is set to become the first comprehensive AI regulation and to offer a model for policymakers around the world. But with this promise comes some risk. The Act may regulate upstream open source projects as if they were commercial products or deployed AI systems. This would be incompatible with open source development practices and counter to the needs of individual developers and non-profit research organizations.

The Act risks chilling open source AI development and thus could undermine its own goals of promoting responsible innovation in line with European values. To help explain why and to offer solutions, we’ve published a policy paper: “Supporting Open Source and Open Science in the AI Act.” Together with a coalition of leading open culture and AI organizations—Creative Commons, EleutherAI, Hugging Face, LAION, and Open Future—we intend for the paper to serve as a resource for policymakers crafting AI regulation. In the EU and beyond, it is essential that policymakers support the blossoming open source AI ecosystem.

Too often, open source and open science for AI have been underappreciated or even misunderstood. Open source has been foundational to both AI development and AI policy. It provides a global public good that can be used, studied, modified, and distributed by all. Best practices pioneered by the open source community, including model documentation, have shaped both transparency requirements and voluntary commitments around the world. To safeguard responsible innovation, collaboration, and competition, AI policy must account for open source.

As the home for all developers, GitHub has represented the open source community in EU deliberations since the AI Act was introduced in 2021. Our CEO Thomas Dohmke addressed policymakers in Brussels on the need to exempt open source developers from the AI Act. His speech built on our recommendations to Parliament for a tailored exemption that accounts for both the nuances of open source development and the risks posed by some AI systems. As Parliament finished its deliberations on the Act this spring, we worked with a coalition of open culture and AI development organizations, including our policy paper co-authors, to send a letter urging improvements to the Act. Recently, GitHub Chief Legal Officer Shelley McKinley outlined lessons for policymakers on protecting open source as AI policy deliberations go global. We trust that this latest effort will similarly help advance informed policymaking that works for open source.