AI SDK provider for GitHub Copilot - use the Copilot CLI via the Vercel AI SDK.

```bash
npm install @nomomon/ai-sdk-provider-github-copilot
```

Vercel AI SDK community provider for GitHub Copilot — use `streamText`, `generateText`, and related AI SDK APIs with GitHub Copilot as the backend model.
---
Follow the Copilot CLI installation guide to install the CLI and ensure `copilot` is available on your `PATH`.
```bash
npm install @nomomon/ai-sdk-provider-github-copilot ai@^6.0.0
```
```typescript
import { generateText } from "ai";
import { githubCopilot } from "@nomomon/ai-sdk-provider-github-copilot";

const { text } = await generateText({
  model: githubCopilot("gpt-5"),
  prompt: "Hello, Copilot!",
});

console.log(text);
```
```typescript
import { streamText } from "ai";
import { githubCopilot } from "@nomomon/ai-sdk-provider-github-copilot";

const result = streamText({
  model: githubCopilot("gpt-5"),
  prompt: "Tell me a short story",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```
Use model IDs available via the Copilot CLI. Run `copilot -i /models` to list the models available in your environment.
| Script | Description |
|--------|-------------|
| `npm run example:basic` | Non-streaming `generateText` |
| `npm run example:streaming` | Streaming `streamText` |
| `npm run example:tools` | Model-level tools via Copilot `defineTool` |
| `npm run example:tools-ai-sdk` | Call-level AI SDK `tool()` with `providerOptions` bridge |
```typescript
import { githubCopilot } from "@nomomon/ai-sdk-provider-github-copilot";

const model = githubCopilot("gpt-5", {
  model: "claude-sonnet-4.5", // Override model
  systemMessage: {
    content: "You are a helpful assistant specialized in code review.",
  },
  workingDirectory: "/path/to/project",
  provider: {
    type: "openai",
    baseUrl: "https://my-api.example.com/v1",
    apiKey: process.env.MY_API_KEY,
  },
});
```
Tools can be passed in two ways. When both are used, call-level tools are merged with model-level tools before creating the Copilot session.
1. Copilot's `defineTool` (model-level) — configure tools when creating the model:
```typescript
import { defineTool } from "@github/copilot-sdk";
import { z } from "zod";
import { githubCopilot } from "@nomomon/ai-sdk-provider-github-copilot";

const model = githubCopilot("gpt-5", {
  tools: [
    defineTool("lookup_issue", {
      description: "Fetch issue details from our tracker",
      parameters: z.object({
        id: z.string().describe("Issue identifier"),
      }),
      handler: async ({ id }) => {
        return await fetchIssue(id);
      },
    }),
  ],
});
```
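Because `defineTool` handlers are plain async functions, they can also be unit-tested without starting a Copilot session. A minimal sketch (the `Issue` shape and in-memory data below are illustrative, not part of the provider):

```typescript
// Hypothetical issue record and an in-memory store, standing in for the
// fetchIssue call in the example above.
type Issue = { id: string; title: string; open: boolean };

const issues: Record<string, Issue> = {
  "42": { id: "42", title: "Fix flaky test", open: true },
};

// The same function you would pass as the defineTool handler.
export async function lookupIssue({ id }: { id: string }): Promise<Issue> {
  const issue = issues[id];
  if (!issue) throw new Error(`Unknown issue: ${id}`);
  return issue;
}
```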
2. AI SDK `tool()` with `providerOptions` (call-level) — the AI SDK does not pass `execute` to providers, so wrap it with `copilotToolOptions(execute)`; the provider then converts the tool and uses it as the Copilot handler:
```typescript
import { copilotToolOptions, githubCopilot } from "@nomomon/ai-sdk-provider-github-copilot";
import { streamText, tool } from "ai";
import { z } from "zod";

const execute = async ({ city }: { city: string }) => ({
  city,
  temperature: `${20 + Math.floor(Math.random() * 15)}°C`,
  condition: "sunny",
});

const getWeather = tool({
  description: "Get the current weather for a city",
  inputSchema: z.object({
    city: z.string().describe("The city name"),
  }),
  execute,
  providerOptions: copilotToolOptions(execute),
});

const result = streamText({
  model: githubCopilot("gpt-5-mini"),
  tools: { get_weather: getWeather },
  prompt: "What's the weather in Tokyo?",
});
```
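Since `execute` is referenced twice (by the AI SDK and by the `providerOptions` bridge), keeping it as a standalone function also makes it easy to unit-test in isolation. A sketch mirroring the example above (the function name and stub values are ours):

```typescript
// Standalone version of the execute function from the example above.
// The 20–34 °C range and "sunny" condition are illustrative stub values.
type Weather = { city: string; temperature: string; condition: string };

export async function getWeatherStub({ city }: { city: string }): Promise<Weather> {
  return {
    city,
    temperature: `${20 + Math.floor(Math.random() * 15)}°C`,
    condition: "sunny",
  };
}
```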
Tool support varies by model; verify with Copilot CLI or documentation.
Tests live in `tests/` (not colocated with source) to keep the published `src/` tree clean and to exclude test files from the build. Each source module has a corresponding `tests/*.test.ts` file. Run `npm test`, or `npm run test:coverage` for coverage with thresholds.
- Requires Copilot CLI - Must be installed and authenticated
- Node.js >= 18 - Required runtime
- Image inputs - Copilot uses file path attachments; base64/data URLs may require temp files
- Unsupported parameters - `temperature`, `maxTokens`, `topP`, etc. are not supported by the Copilot CLI and will be ignored (warnings emitted)
- Structured outputs - Native JSON schema support may be limited; consider prompt engineering for structured responses
- Session-based - Each generate/stream creates a new session; no built-in multi-session continuity across separate AI SDK calls
- Model capabilities - Tools, reasoning, and other features may vary by model; capabilities are determined by Copilot CLI
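Given the limited native structured-output support noted above, a common workaround is to ask the model for JSON in the prompt and extract it from the free-text reply. A hedged sketch (the helper name and fence-stripping heuristic are our own, not part of the provider):

```typescript
// Extract the first JSON object from a model's free-text reply, tolerating
// a markdown code fence (with or without a json tag) and surrounding prose.
// Throws if no object is found; does not handle top-level JSON arrays.
export function extractJson<T>(text: string): T {
  // Strip a fenced code block if the model wrapped the JSON in one.
  const fenced = text.match(/`{3}(?:json)?\s*([\s\S]*?)`{3}/);
  const candidate = fenced ? fenced[1] : text;
  // Fall back to the outermost {...} span in case prose surrounds the object.
  const start = candidate.indexOf("{");
  const end = candidate.lastIndexOf("}");
  if (start === -1 || end <= start) throw new Error("No JSON object found in reply");
  return JSON.parse(candidate.slice(start, end + 1)) as T;
}
```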
This is an unofficial community provider and is not affiliated with or endorsed by GitHub or Vercel. By using this provider:
- You understand that your data will be sent to GitHub's servers through the Copilot CLI
- You agree to comply with GitHub's Terms of Service
- You acknowledge this software is provided "as is" without warranties of any kind