OpenAI GPT-4.1 now available in public preview for GitHub Copilot and GitHub Models


OpenAI’s latest model, GPT-4.1, is now available in GitHub Copilot and GitHub Models, bringing OpenAI’s newest capabilities to your coding workflow. This model outperforms GPT-4o across the board, with major gains in coding, instruction following, and long-context understanding. It supports a larger context window and features a refreshed knowledge cutoff of June 2024.

OpenAI has optimized GPT-4.1 for real-world use based on direct developer feedback in areas such as frontend coding, making fewer extraneous edits, following formats reliably, adhering to response structure and ordering, and using tools consistently. This model is a strong default choice for common development tasks that benefit from speed, responsiveness, and general-purpose reasoning.

Copilot

OpenAI GPT-4.1 is rolling out for all Copilot plans, including Copilot Free. You can access it through the model picker in Visual Studio Code and in Copilot Chat on github.com. To accelerate your workflow, whether you’re debugging, refactoring, modernizing, testing, or just getting started, select “GPT-4.1 (Preview)” to begin using it.

Enabling access

Copilot Enterprise administrators will need to enable access to GPT-4.1 through a new policy in Copilot settings. As an administrator, you can verify availability by checking your individual Copilot settings and confirming the policy for GPT-4.1 is set to enabled. Once enabled, users will see GPT-4.1 in the Copilot Chat model selector in VS Code and on github.com.

To learn more about the models available in Copilot, see our documentation on models and get started with Copilot today.

GitHub Models

GitHub Models users can now harness the power of GPT-4.1 to enhance their AI applications and projects. In the GitHub Models playground, you can experiment with sample prompts, refine your ideas, and iterate as you build. You can also try it alongside other models including those from Cohere, DeepSeek, Meta, and Microsoft.
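If you want to call the model from code rather than the playground, the sketch below shows one way to do it with the OpenAI Python SDK pointed at the GitHub Models inference endpoint. The endpoint URL and the `openai/gpt-4.1` model ID are assumptions; copy the exact values from the code snippet shown in the playground for your setup.

```python
import os
from openai import OpenAI

# Sketch: call GPT-4.1 through the OpenAI-compatible GitHub Models endpoint.
# Assumptions (verify in the playground's code snippet): the endpoint URL and
# the "openai/gpt-4.1" model ID may differ for your account.
# Authentication uses a GitHub personal access token, not an OpenAI API key.
client = OpenAI(
    base_url="https://models.github.ai/inference",  # assumed endpoint
    api_key=os.environ["GITHUB_TOKEN"],             # GitHub PAT with models access
)

response = client.chat.completions.create(
    model="openai/gpt-4.1",  # assumed model ID; copy the exact ID from the playground
    messages=[
        {"role": "system", "content": "You are a concise coding assistant."},
        {"role": "user", "content": "Explain when to prefer a generator over a list in Python."},
    ],
    temperature=0.3,
)
print(response.choices[0].message.content)
```

Because the endpoint is OpenAI-compatible, you can swap the model ID to compare GPT-4.1 against other models available in GitHub Models without changing the rest of your code.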

To learn more about GitHub Models, check out the GitHub Models documentation.

Share your feedback

Join the Community discussion to share feedback and tips.

Llama 4 release on GitHub Models

The latest AI models from Meta, Llama-4-Scout-17B-16E-Instruct and Llama-4-Maverick-17B-128E-Instruct-FP8, are now available on GitHub Models.

Llama-4-Scout-17B is a 17B parameter Mixture-of-Experts (MoE) model optimized for tasks like summarization, personalization, and reasoning. Its ability to handle extensive context makes it well-suited for tasks that require complex and detailed reasoning.

Llama-4-Maverick-17B is a 17B parameter Mixture-of-Experts (MoE) model designed for high-quality chat, creative writing, and precise image analysis. With its conversational fine-tuning and support for text and image understanding, Maverick is ideal for creating AI assistants and applications.

Try, compare, and implement these models in your code for free in the playground (Llama-4-Scout-17B-16E-Instruct and Llama-4-Maverick-17B-128E-Instruct-FP8) or through the GitHub API.
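As a starting point for comparing the two variants from code, the sketch below sends the same prompt to both models through the GitHub Models endpoint. The endpoint URL and the `meta/...` model IDs are assumptions; use the exact values from the playground’s code snippet.

```python
import os
from openai import OpenAI

# Sketch: send one prompt to both Llama 4 variants on GitHub Models and print
# the answers side by side. Endpoint URL and model IDs below are assumptions;
# copy the exact values from the playground before running.
client = OpenAI(
    base_url="https://models.github.ai/inference",  # assumed endpoint
    api_key=os.environ["GITHUB_TOKEN"],             # GitHub PAT with models access
)

prompt = "Summarize the trade-offs of a Mixture-of-Experts architecture in three bullet points."

for model_id in (
    "meta/Llama-4-Scout-17B-16E-Instruct",          # assumed model ID
    "meta/Llama-4-Maverick-17B-128E-Instruct-FP8",  # assumed model ID
):
    response = client.chat.completions.create(
        model=model_id,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=300,
    )
    print(f"--- {model_id} ---")
    print(response.choices[0].message.content)
```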

To learn more about GitHub Models, check out the docs. You can also join our community discussions.


GitHub Codespaces has introduced a new agentic AI feature: you can now open a Codespace running VS Code’s Copilot agent mode directly from a GitHub issue. With a single click, you can go from issue to implementation!

When you’re viewing a GitHub issue, the right-hand side of the view now displays a Code with Copilot Agent Mode button in the Development section. Clicking this button initializes a new Codespace, opens it in a new tab, and enables VS Code’s Copilot agent mode, using the issue body as context. Copilot then gets to work on the issue, analyzing the codebase and considering dependencies to suggest appropriate file changes. You can then work with Copilot to fine-tune your code and make modifications as required.

Copilot agent mode in Codespaces is in public preview, and we’ll be iterating on the experience over the coming months. Stay tuned for updates!
