Learn how to change the default LLM for Copilot Chat to a different model.
By default, Copilot Chat uses GPT-4.1 to provide fast, capable responses for a wide range of tasks, such as summarization, knowledge-based questions, reasoning, math, and coding.
However, you are not limited to using this model. You can choose from a selection of other models, each with its own particular strengths. You may have a favorite model you like to work with, or you might prefer a particular model for questions on a specific subject.
To view the models available in each Copilot client, see Supported AI models in GitHub Copilot.
Note: Different models have different premium request multipliers, which can affect how much of your monthly usage allowance is consumed. For details, see Requests in GitHub Copilot.
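As an illustrative example (the actual multiplier for each model is listed in Requests in GitHub Copilot, so the figures here are hypothetical): a prompt sent to a model with a 1x multiplier consumes one premium request from your allowance, while the same prompt sent to a model with a 2x multiplier consumes two.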
Copilot allows you to change the model during a chat, and the model you select is then used to generate responses to your prompts.
Changing the model that's used by Copilot Chat does not affect the model that's used for Copilot code completion. See Changing the AI model for GitHub Copilot code completion.