Models and Multipliers
Requests, Premium Requests, and Multipliers
A request, in general, is any interaction you have with Copilot: asking a question, requesting a code snippet, or asking for an explanation. A premium request is a request that consumes more computational resources than a standard one, for example a request that involves complex reasoning, uses a large context window, or combines multiple reasoning steps. GitHub does not disclose the exact criteria it uses to classify a request as premium, but you can assume that requests that take longer to process, or that require more processing power, are more likely to be classified as premium.
The following table summarizes the potential premium request consumption for different Copilot features:
| Feature | Premium request consumption |
|---|---|
| Copilot Chat | 1 premium request per user prompt. |
| Copilot coding agent | 1 premium request per session. A session begins when you ask Copilot to create a pull request or make one or more changes to an existing pull request. |
| Agent mode in Copilot Chat | 1 premium request per user prompt. |
| Copilot code review | 1 premium request each time Copilot posts comments on a pull request. |
| Copilot Extensions | 1 premium request per user prompt. |
| Copilot Spaces | 1 premium request per user prompt. |
| Spark | 4 premium requests per prompt. |
Some of these features, such as Copilot Chat, Agent mode, Copilot Extensions, and Copilot Spaces, can be configured to use different models. The model you choose affects the cost of each premium request through a per-model multiplier. For example, the multiplier for GPT-4.1 is 0, while for Claude Opus 4 it is 10.
The math is simple:

Premium requests consumed = Base consumption of the feature x Multiplier of the model
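As a quick sanity check, the formula can be sketched in a few lines of Python. The multiplier values are the ones quoted in this section for paid plans; the function and its names are illustrative, not part of any Copilot API.

```python
# Illustrative only: multipliers quoted in this section (paid plans).
MODEL_MULTIPLIERS = {
    "GPT-4.1": 0,
    "Claude Opus 4": 10,
}

def premium_requests_consumed(base_cost: int, model: str, prompts: int = 1) -> int:
    """Base consumption of the feature x multiplier of the model, per prompt."""
    return base_cost * MODEL_MULTIPLIERS[model] * prompts

# Copilot Chat has a base consumption of 1 premium request per user prompt:
print(premium_requests_consumed(1, "GPT-4.1"))        # 0: no premium requests consumed
print(premium_requests_consumed(1, "Claude Opus 4"))  # 10
```

Note how a zero multiplier makes a model effectively free on paid plans: no matter how many prompts you send, the product stays at zero.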
The following table summarizes the multiplier for different models:
| Model | Multiplier for paid plans |
|---|---|
| GPT-4.1 | 0 |
| GPT-5 mini | 0 |
| Claude Opus 4 | 10 |
Building with GitHub Copilot: From Autocomplete to Autonomous Agents