Community AI SDK provider for Google Gemini using the official CLI/SDK
A community provider for the Vercel AI SDK that enables using Google's Gemini models through `@google/gemini-cli-core` and Google Cloud Code endpoints.

## Version Compatibility
| Provider Version | AI SDK Version | NPM Tag | Branch |
| ---------------- | -------------- | ----------- | ----------- |
| 2.x | v6 | latest | main |
| 1.x | v5 | ai-sdk-v5 | ai-sdk-v5 |
| 0.x | v4 | ai-sdk-v4 | ai-sdk-v4 |
```bash
# AI SDK v6 (default)
npm install ai-sdk-provider-gemini-cli ai
```
## Installation
1. Install and authenticate the Gemini CLI:

   ```bash
   npm install -g @google/gemini-cli
   gemini # Follow the interactive authentication setup
   ```

2. Add the provider to your project:

   ```bash
   npm install ai-sdk-provider-gemini-cli ai
   ```

## Quick Start
```typescript
import { generateText } from 'ai';
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

const gemini = createGeminiProvider({
  authType: 'oauth-personal',
});

const result = await generateText({
  model: gemini('gemini-3-pro-preview'),
  prompt: 'Write a haiku about coding',
});

console.log(result.text);
```

## Authentication
### OAuth

Uses credentials from `~/.gemini/oauth_creds.json`, created by the Gemini CLI:

```typescript
const gemini = createGeminiProvider({
  authType: 'oauth-personal',
});
```

### API Key
```typescript
const gemini = createGeminiProvider({
  authType: 'api-key',
  apiKey: process.env.GEMINI_API_KEY,
});
```

Get your API key from Google AI Studio.
## Supported Models
- `gemini-3-pro-preview` - Latest model with enhanced reasoning (Preview)
- `gemini-3-flash-preview` - Fast, efficient model (Preview)
- `gemini-2.5-pro` - Previous generation model (64K output tokens)
- `gemini-2.5-flash` - Previous generation fast model (64K output tokens)

## Features
- Streaming responses (see the sketch after this list)
- Tool/function calling
- Structured output with Zod schemas
- Multimodal support (text and base64 images)
- TypeScript support
- Configurable logging
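
For instance, streaming and structured output go through the standard AI SDK `streamText` and `generateObject` calls. A minimal sketch, assuming OAuth auth as in the Quick Start; the prompts and the Zod schema are illustrative, and result details may vary with your installed AI SDK major:

```typescript
import { streamText, generateObject } from 'ai';
import { z } from 'zod';
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

const gemini = createGeminiProvider({ authType: 'oauth-personal' });

// Stream text as it is generated.
const stream = streamText({
  model: gemini('gemini-3-flash-preview'),
  prompt: 'Explain async iterators in one paragraph.',
});
for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}

// Generate structured output validated against a Zod schema.
const { object } = await generateObject({
  model: gemini('gemini-3-flash-preview'),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: 'Suggest a title and tags for a post about TypeScript generics.',
});
console.log(object);
```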
## Configuration
```typescript
const model = gemini('gemini-3-pro-preview', {
  temperature: 0.7,
  maxOutputTokens: 1000,
  topP: 0.95,
});
```

### Logging
```typescript
// Disable logging
const silentModel = gemini('gemini-3-flash-preview', { logger: false });

// Enable verbose debug logging
const verboseModel = gemini('gemini-3-flash-preview', { verbose: true });

// Custom logger (pass your own logger implementation)
const customModel = gemini('gemini-3-flash-preview', {
  logger: {
    debug: (msg) => myLogger.debug(msg),
    info: (msg) => myLogger.info(msg),
    warn: (msg) => myLogger.warn(msg),
    error: (msg) => myLogger.error(msg),
  },
});
```

## Examples
See the examples/ directory for comprehensive examples:
- `check-auth.mjs` - Verify authentication
- `basic-usage.mjs` - Text generation
- `streaming.mjs` - Streaming responses
- `generate-object-basic.mjs` - Structured output with Zod
- `tool-calling.mjs` - Function calling (sketched below)

```bash
npm run build
npm run example:check
npm run example:basic
```
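
For a quick look at tool calling before running the example files, here is a minimal sketch. The `getWeather` tool, its schema, and the stubbed result are made up for illustration; the `inputSchema` field follows the AI SDK v5/v6 `tool()` helper (AI SDK v4 used `parameters`):

```typescript
import { generateText, tool } from 'ai';
import { z } from 'zod';
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

const gemini = createGeminiProvider({ authType: 'oauth-personal' });

const result = await generateText({
  model: gemini('gemini-3-pro-preview'),
  tools: {
    // Hypothetical tool, for illustration only.
    getWeather: tool({
      description: 'Get the current weather for a city',
      inputSchema: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, tempC: 21 }), // stubbed result
    }),
  },
  prompt: 'What is the weather in Paris right now?',
});

// Inspect the tool calls the model made and the results returned by execute().
console.log(result.toolCalls);
console.log(result.toolResults);
```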
## Breaking Changes
### v2.x (AI SDK v6)
- Provider interface: `ProviderV2` → `ProviderV3`
- Token usage: flat → hierarchical structure
- Warning format: `unsupported-setting` → `unsupported`
- Method rename: `textEmbeddingModel()` → `embeddingModel()`
- Finish reason: string → `{ unified, raw }` object (see the sketch after this list)

See CHANGELOG.md for details.
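
As a rough illustration of the finish-reason change, based only on the list above (the exact object contents depend on your installed AI SDK major):

```typescript
import { generateText } from 'ai';
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

const gemini = createGeminiProvider({ authType: 'oauth-personal' });

const result = await generateText({
  model: gemini('gemini-3-flash-preview'),
  prompt: 'Say hello.',
});

// Provider 1.x (AI SDK v5): finishReason was a plain string such as 'stop'.
// Provider 2.x (AI SDK v6): finishReason is an object with unified and raw
// fields, per the breaking-changes list above.
console.log(result.finishReason);
```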
## Limitations
- Requires Node.js >= 20
- OAuth requires global Gemini CLI installation
- Image URLs not supported; pass images as base64 (see the sketch after this list)
- Some parameters not supported: `frequencyPenalty`, `presencePenalty`, `seed`
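
Since remote image URLs are rejected, send images as base64 content parts through the AI SDK's standard message format. A minimal sketch; the file path is illustrative:

```typescript
import { readFileSync } from 'node:fs';
import { generateText } from 'ai';
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

const gemini = createGeminiProvider({ authType: 'oauth-personal' });

// Read a local image and pass it as base64 (remote image URLs are not supported).
const imageBase64 = readFileSync('./photo.png').toString('base64');

const result = await generateText({
  model: gemini('gemini-3-pro-preview'),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Describe this image in one sentence.' },
        { type: 'image', image: imageBase64 },
      ],
    },
  ],
});

console.log(result.text);
```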
## Disclaimer

This is an unofficial community provider, not affiliated with Google or Vercel. Your data is sent to Google's servers. See Google's Terms of Service.

## License

MIT