This package provides an OpenAI-compatible interface for the GitHub Copilot API, designed to work seamlessly with the Vercel AI SDK.
- Full TypeScript support
- Seamless integration with Vercel AI SDK
- Easy to use API matching other AI SDK providers
- Flexible authentication via headers (Bearer token)
- Automatic endpoint switching: uses the `/responses` endpoint for Codex models and `/chat/completions` for all others
- Smart request formatting: automatically converts the `messages` array to the OpenAI Responses API `input` format for Codex models
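The endpoint-switching rule described above can be sketched as a simple predicate. This is an illustration only, assuming (as the routing section below states) that any model ID containing 'codex' is sent to the Responses API; the package's actual implementation may differ:

```typescript
// Sketch of the endpoint-switching rule (assumption: any model ID
// containing "codex" is routed to the Responses API).
function selectEndpoint(modelId: string): '/responses' | '/chat/completions' {
  return modelId.toLowerCase().includes('codex')
    ? '/responses'
    : '/chat/completions';
}
```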
```bash
npm install @opeoginni/github-copilot-openai-compatible
```
```ts
import { createGithubCopilotOpenAICompatible } from '@opeoginni/github-copilot-openai-compatible';
import { generateText } from 'ai';

// Create the provider instance
const githubCopilot = createGithubCopilotOpenAICompatible({
  baseURL: 'https://api.githubcopilot.com',
  name: 'githubcopilot',
  headers: {
    Authorization: `Bearer ${process.env.COPILOT_TOKEN}`,
    "Copilot-Integration-Id": "vscode-chat", // These configs must be provided
    "User-Agent": "GitHubCopilotChat/0.26.7",
    "Editor-Version": "vscode/1.104.1",
    "Editor-Plugin-Version": "copilot-chat/0.26.7"
  },
});

// Use the chat model
const { text } = await generateText({
  model: githubCopilot.chatModel('gpt-4o'),
  prompt: 'Create a function to calculate the Fibonacci sequence',
});

console.log(text);
```
The provider automatically handles the different API format for Codex models:
```ts
import { createGithubCopilotOpenAICompatible } from '@opeoginni/github-copilot-openai-compatible';
import { generateText } from 'ai';

const githubCopilot = createGithubCopilotOpenAICompatible({
  baseURL: 'https://api.githubcopilot.com',
  name: 'githubcopilot',
  headers: {
    "Copilot-Integration-Id": "vscode-chat", // These configs must be provided
    "User-Agent": "GitHubCopilotChat/0.26.7",
    "Editor-Version": "vscode/1.104.1",
    "Editor-Plugin-Version": "copilot-chat/0.26.7"
  },
  apiKey: process.env.COPILOT_TOKEN
});

// This will automatically use the /responses endpoint with 'item' format
const codexModel = githubCopilot.chatModel('gpt-5-codex');

const { text } = await generateText({
  model: codexModel,
  prompt: 'Write a Python function to sort a list',
});
```
```ts
import { createGithubCopilotOpenAICompatible } from '@opeoginni/github-copilot-openai-compatible';

// Minimal setup - just provide the auth token and required headers
const githubCopilot = createGithubCopilotOpenAICompatible({
  baseURL: 'https://api.githubcopilot.com',
  name: 'githubcopilot',
  headers: {
    "Copilot-Integration-Id": "vscode-chat", // These configs must be provided
    "User-Agent": "GitHubCopilotChat/0.26.7",
    "Editor-Version": "vscode/1.104.1",
    "Editor-Plugin-Version": "copilot-chat/0.26.7"
  },
  apiKey: process.env.COPILOT_TOKEN
});

const model = githubCopilot.chatModel('gpt-4o');
```
- `claude-opus-4` - Claude Opus 4
- `claude-opus-41` - Claude Opus 4.1
- `claude-3.5-sonnet` - Claude 3.5 Sonnet
- `claude-3.7-sonnet` - Claude 3.7 Sonnet
- `claude-3.7-sonnet-thought` - Claude 3.7 Sonnet with Reasoning
- `claude-sonnet-4` - Claude Sonnet 4
- `claude-sonnet-4.5` - Claude Sonnet 4.5

- `gpt-4.1` - GPT-4.1
- `gpt-4o` - GPT-4 Optimized
- `gpt-5` - GPT-5
- `gpt-5-codex` - GPT-5 Codex (uses the `/responses` endpoint)
- `gpt-5-mini` - GPT-5 Mini

- `gemini-2.0-flash-001` - Gemini 2.0 Flash
- `gemini-2.5-pro` - Gemini 2.5 Pro

- `grok-code-fast-1` - Grok Code Fast
- `o3` - OpenAI O3
- `o3-mini` - OpenAI O3 Mini
- `o4-mini` - OpenAI O4 Mini
Plus any custom model ID supported by GitHub Copilot (type-safe with TypeScript).
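The "type-safe with TypeScript" note above likely refers to a common pattern like the following sketch (hypothetical, not the package's actual type): a union of known model IDs widened with `string & {}`, so editors autocomplete the known models while still accepting arbitrary custom IDs.

```typescript
// Hypothetical model-ID type: known IDs get autocompletion, but any
// string is still assignable (the `string & {}` trick prevents the
// union from collapsing to plain `string`).
type KnownModelId = 'gpt-4o' | 'gpt-5-codex' | 'claude-sonnet-4' | 'gemini-2.5-pro';
type CopilotModelId = KnownModelId | (string & {});

const known: CopilotModelId = 'gpt-4o';        // autocompleted
const custom: CopilotModelId = 'my-org-model'; // also allowed
```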
The provider automatically routes requests to the correct endpoint based on the model ID:
- **Codex Models** (`gpt-5-codex`, or any model ID containing 'codex'):
  - Uses the `/responses` endpoint
  - Converts the `messages` array to the OpenAI Responses API `input` format
  - Transforms message content parts (e.g., `text` → `input_text`, `image_url` → `input_image`)
  - System messages become `developer` role messages
- **All Other Models**:
  - Uses the standard `/chat/completions` endpoint
  - Uses the standard OpenAI-compatible `messages` array format
This means you don't need to worry about the underlying API differences: the provider handles them automatically.
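The conversion steps listed for Codex models can be sketched roughly as follows. This is an illustration based only on the transformations named above (content-part renaming and system → developer re-labelling); the package's actual implementation may handle more cases:

```typescript
type ContentPart = { type: string; [key: string]: unknown };
type ChatMessage = { role: string; content: string | ContentPart[] };

// Sketch: convert an OpenAI-style `messages` array into the Responses API
// `input` shape. Assumptions: `text` parts become `input_text`, `image_url`
// parts become `input_image`, and `system` messages take the `developer` role.
function toResponsesInput(messages: ChatMessage[]): ChatMessage[] {
  return messages.map((m) => ({
    role: m.role === 'system' ? 'developer' : m.role,
    content:
      typeof m.content === 'string'
        ? m.content
        : m.content.map((part) => {
            if (part.type === 'text') return { ...part, type: 'input_text' };
            if (part.type === 'image_url') return { ...part, type: 'input_image' };
            return part;
          }),
  }));
}
```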
Creates a new GitHub Copilot provider instance.
Options:

- `baseURL` (required): Base URL for API calls
- `name` (required): Provider name
- `headers?`: Custom headers to include in requests
- `queryParams?`: Optional URL query parameters
- `fetch?`: Custom fetch implementation
- `includeUsage?`: Include usage information in responses
- `supportsStructuredOutputs?`: Enable structured outputs support

Returns: A provider instance with `chatModel()` and `languageModel()` methods, also callable as a function.
Pre-configured default provider instance with common GitHub Copilot headers. You'll need to provide authentication separately.
To use this provider, you'll need a GitHub Copilot API token. This is typically obtained through GitHub Copilot's authentication flow in VS Code or other supported editors.
MIT