OpenAI’s latest model, o3-mini, is now available in GitHub Copilot Free.
OpenAI’s latest model, o3-mini, is the most cost-efficient model in their reasoning series. o3-mini outperforms o1 on coding benchmarks with response times comparable to o1-mini, meaning you’ll get improved quality at nearly the same latency. The model is configured to use OpenAI’s medium reasoning effort and can be accessed in VS Code and GitHub.com Copilot Chat today, with support coming soon to Visual Studio and JetBrains.
Access to o3-mini is currently in preview and counts toward Copilot Free’s limit of 50 free chats per month.
Get started with Copilot Free on GitHub and in VS Code today, or learn more in our documentation.