Announcing GitHub Copilot Pro+

Today, we’re introducing GitHub Copilot Pro+, a new individual tier for developers who want to take their coding experience to the next level.

Enjoy all the features you love from GitHub Copilot Pro, along with exclusive access to the latest models (GPT-4.5 is available today), priority access to previews, and 1,500 premium requests per month when premium requests go live on May 5th. This is in addition to the unlimited requests for agent mode, context-driven chat, and code completions that all paid plans include when using our base model.

Editor’s note (April 23, 2025): Rate limiting is in place to accommodate high demand. Learn more about current models and usage.

To get started, purchase GitHub Copilot Pro+ today and don’t miss out on the other Copilot updates shared in our announcement blog.

Stay tuned for updates as we work to continuously enhance your developer experience. Also, please share any feedback in GitHub Community.

Claude 3.7 Sonnet, Claude 3.5 Sonnet, o3-mini, and Gemini 2.0 Flash are now generally available in GitHub Copilot

Anthropic Claude 3.7 Sonnet, Anthropic Claude 3.5 Sonnet, OpenAI o3-mini, and Google Gemini 2.0 Flash are now generally available in GitHub Copilot. With this change, these models are promoted from preview release terms to generally available release terms, which extends indemnification for IP infringement to code generated using these models in Copilot Chat and agent mode.

  • Claude 3.7 Sonnet, Anthropic’s most advanced model to date, excels in development tasks that require structured reasoning across large or complex codebases.
  • Claude 3.5 Sonnet remains a good choice for everyday coding support.
  • OpenAI o3-mini is a fast, cost-effective reasoning model designed to deliver coding performance while maintaining lower latency and resource usage.
  • Gemini 2.0 Flash is Google’s model optimized for fast responses and multimodal interactions.

Learn more about the models available in Copilot in our documentation and get started with Copilot today.

The official open source GitHub MCP Server

Today we’re releasing a new official, open source, local GitHub MCP Server. We’ve worked with Anthropic to rewrite their reference server in Go and improve its usability. The new server retains 100% of the old server’s functionality and adds the ability to customize tool descriptions, support for code scanning, and a new get_me function that improves the natural language user experience when asking the LLM things like: “Show me my private repos.”

To get started, visit the repository and learn how to set up the GitHub MCP Server, which is now supported natively in VS Code.
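As a rough sketch of what that native VS Code setup can look like (the exact fields, Docker image name, and input variable below are assumptions — see the repository README for the current instructions), a local MCP server is typically declared in a `.vscode/mcp.json` file:

```json
{
  "inputs": [
    {
      "id": "github_token",
      "type": "promptString",
      "description": "GitHub personal access token",
      "password": true
    }
  ],
  "servers": {
    "github": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
        "ghcr.io/github/github-mcp-server"
      ],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "${input:github_token}"
      }
    }
  }
}
```

Here VS Code launches the server as a local subprocess and passes your token through an environment variable, prompting for it once rather than storing it in the config file.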

Model Context Protocol (MCP) is an AI tool calling standard that has been rapidly gaining adoption over the past few months. MCP tools give LLMs a standardized way to call functions, look up data, and interact with the world. Anthropic created the protocol and built the first GitHub MCP server, which grew to be one of the most popular MCP servers in the expanding ecosystem. We are excited to take ownership of the server and continue its development.
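Concretely, MCP clients and servers speak JSON-RPC 2.0. As an illustrative sketch based on the MCP specification (transport setup and capability negotiation omitted), a host application would invoke the new get_me tool with a request along these lines:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_me",
    "arguments": {}
  }
}
```

The server’s result would carry the authenticated user’s profile, which the model can then use to resolve phrases like “my private repos” without the user spelling out their username.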

Join the discussion within GitHub Community.