GitHub Copilot CLI now lets you connect your own model provider or run fully local models instead of using GitHub-hosted model routing. This means you can use the models and providers you’re already paying for, operate in air-gapped environments, and maintain direct control over your LLM spend, all while keeping the same agentic terminal experience.

Connect any model provider

Configure Copilot CLI to use Azure OpenAI, Anthropic, or any OpenAI-compatible endpoint by setting a few environment variables before launching the CLI. This works with remote services like OpenAI and Azure OpenAI, as well as with locally hosted servers such as Ollama, vLLM, and Foundry Local. See Using your own LLM models in GitHub Copilot CLI for setup instructions.
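As a sketch, pointing the CLI at a locally hosted OpenAI-compatible server such as Ollama might look like the following. The variable names below are illustrative placeholders, not documented names; the linked documentation lists the exact variables Copilot CLI reads.

```shell
# Illustrative sketch only: the variable names here are placeholders.
# The exact variables are listed in "Using your own LLM models in
# GitHub Copilot CLI".
export COPILOT_MODEL_PROVIDER=openai                     # hypothetical: provider type
export COPILOT_MODEL_BASE_URL=http://localhost:11434/v1  # hypothetical: OpenAI-compatible endpoint (Ollama's default port)
export COPILOT_MODEL_API_KEY=unused                      # many local servers accept any non-empty key
# copilot   # then launch the CLI with the variables in place
```

Because the variables are set in the launching shell, any sub-process the CLI spawns inherits the same configuration.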

Offline mode for air-gapped environments

Set COPILOT_OFFLINE=true to prevent Copilot CLI from contacting GitHub’s servers. In offline mode, all telemetry is disabled and the CLI only communicates with your configured provider. Combined with a local model, this enables fully air-gapped development workflows. See Running in offline mode for details.
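A fully local session might therefore be set up like this. COPILOT_OFFLINE is the documented switch; the endpoint variable name is an illustrative placeholder, so check the linked documentation for the real one.

```shell
export COPILOT_OFFLINE=true                              # documented: no contact with GitHub's servers
export COPILOT_MODEL_BASE_URL=http://localhost:11434/v1  # placeholder name: a local Ollama endpoint
# copilot   # with a local model, all traffic now stays on this machine
```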

GitHub authentication is now optional

When using your own model provider, GitHub authentication is not required. You can start using Copilot CLI immediately with just your provider credentials. If you also sign in to GitHub, you get the best of both worlds: your preferred model for AI responses, plus access to features like /delegate, GitHub Code Search, and the GitHub MCP server. See Unauthenticated use for more information.

What you need to know

  • Your model must support tool calling and streaming. For best results, use a model with at least a 128k token context window.
  • Built-in sub-agents (explore, task, code-review) automatically inherit your provider configuration.
  • If your provider configuration is invalid, Copilot CLI shows actionable error messages—it never silently falls back to GitHub-hosted models.
  • Run copilot help providers for quick setup instructions directly in the terminal.
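Before pointing the CLI at a provider, it can be worth probing the endpoint for the two hard requirements above. The sketch below builds a minimal streaming chat request that declares one tool, in the OpenAI-compatible format; the endpoint URL, model name, and tool are assumptions to substitute with your own. The commented curl line sends the probe once the server is running.

```shell
# Build a minimal tool-calling + streaming probe for an OpenAI-compatible
# endpoint. BASE_URL and the model name are assumptions; substitute your own.
BASE_URL=http://localhost:11434/v1
BODY=$(cat <<'EOF'
{
  "model": "llama3.1",
  "stream": true,
  "messages": [{"role": "user", "content": "What time is it in UTC?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_utc_time",
      "description": "Return the current UTC time",
      "parameters": {"type": "object", "properties": {}}
    }
  }]
}
EOF
)
echo "$BODY"   # prints the request JSON
# To send it (requires the endpoint to be running):
# curl -sS "$BASE_URL/chat/completions" -H 'Content-Type: application/json' -d "$BODY"
```

A model that supports both features should stream chunks back and, for this prompt, emit a tool call rather than a plain-text answer.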

Join the discussion within GitHub Community.