An open-source AI agent that brings the power of Grok directly into your terminal.
```bash
npm install @vibe-kit/grok-cli
```

A conversational AI CLI tool powered by Grok with intelligent text editor capabilities and tool usage.

## Features
- 🤖 Conversational AI: Natural language interface powered by Grok-3
- 📝 Smart File Operations: AI automatically uses tools to view, create, and edit files
- ⚡ Bash Integration: Execute shell commands through natural conversation
- 🔧 Automatic Tool Selection: AI intelligently chooses the right tools for your requests
- 🚀 Morph Fast Apply: Optional high-speed code editing at 4,500+ tokens/sec with 98% accuracy
- 🔌 MCP Tools: Extend capabilities with Model Context Protocol servers (Linear, GitHub, etc.)
- 💬 Interactive UI: Beautiful terminal interface built with Ink
- 🌍 Global Installation: Install and use anywhere with `bun add -g @vibe-kit/grok-cli`
## Installation

```bash
bun add -g @vibe-kit/grok-cli
```

Or with npm (fallback):

```bash
npm install -g @vibe-kit/grok-cli
```

### Local Development
```bash
git clone <repository-url>
cd grok-cli
bun install
bun run build
bun link
```

## Setup
1. Get your Grok API key from X.AI
2. Set up your API key (choose one method):
Method 1: Environment Variable
```bash
export GROK_API_KEY=your_api_key_here
```

Method 2: .env File
```bash
cp .env.example .env
# Edit .env and add your API key
```

Method 3: Command Line Flag
```bash
grok --api-key your_api_key_here
```

Method 4: User Settings File
Create `~/.grok/user-settings.json`:
```json
{
  "apiKey": "your_api_key_here"
}
```
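With the key in place (set by any of the methods above), a quick headless run makes an easy sanity check; the prompt below is only an example:

```bash
# Should return a short reply if the API key is picked up correctly
grok --prompt "hello"
```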
3. (Optional, Recommended) Get your Morph API key from Morph Dashboard
4. Set up your Morph API key for Fast Apply editing (choose one method):
Method 1: Environment Variable
```bash
export MORPH_API_KEY=your_morph_api_key_here
```

Method 2: .env File
```bash
# Add to your .env file
MORPH_API_KEY=your_morph_api_key_here
```

### Custom Base URL
By default, the CLI uses `https://api.x.ai/v1` as the Grok API endpoint. You can configure a custom endpoint if needed (choose one method):

Method 1: Environment Variable
```bash
export GROK_BASE_URL=https://your-custom-endpoint.com/v1
```

Method 2: Command Line Flag
```bash
grok --api-key your_api_key_here --base-url https://your-custom-endpoint.com/v1
```

Method 3: User Settings File
Add to `~/.grok/user-settings.json`:
```json
{
  "apiKey": "your_api_key_here",
  "baseURL": "https://your-custom-endpoint.com/v1"
}
```

## Configuration Files
Grok CLI uses two types of configuration files to manage settings:
### User Settings (`~/.grok/user-settings.json`)
This file stores global settings that apply across all projects. These settings rarely change and include:
- API Key: Your Grok API key
- Base URL: Custom API endpoint (if needed)
- Default Model: Your preferred model (e.g., `grok-code-fast-1`)
- Available Models: List of models you can use

Example:
```json
{
  "apiKey": "your_api_key_here",
  "baseURL": "https://api.x.ai/v1",
  "defaultModel": "grok-code-fast-1",
  "models": [
    "grok-code-fast-1",
    "grok-4-latest",
    "grok-3-latest",
    "grok-3-fast",
    "grok-3-mini-fast"
  ]
}
```

### Project Settings
This file stores project-specific settings in your current working directory. It includes:
- Current Model: The model currently in use for this project
- MCP Servers: Model Context Protocol server configurations
Example:
```json
{
  "model": "grok-3-fast",
  "mcpServers": {
    "linear": {
      "name": "linear",
      "transport": "stdio",
      "command": "npx",
      "args": ["@linear/mcp-server"]
    }
  }
}
```

### How Settings Work Together
1. Global Defaults: User-level settings provide your default preferences
2. Project Override: Project-level settings override defaults for specific projects
3. Directory-Specific: When you change directories, project settings are loaded automatically
4. Fallback Logic: Project model → User default model → System default (`grok-code-fast-1`)

This means you can have different models for different projects while maintaining consistent global settings like your API key.
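For example, here is a sketch of how that lookup plays out in practice (the directories and models are illustrative):

```bash
cd ~/projects/api-service   # this project's settings pin "model": "grok-3-fast"
grok                        # → uses grok-3-fast (project override)

cd ~/projects/scratch       # no project settings here
grok                        # → uses your user default model, e.g. grok-code-fast-1
```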
### Using Other API Providers
Important: Grok CLI uses OpenAI-compatible APIs. You can use any provider that implements the OpenAI chat completions standard.
Popular Providers:
- X.AI (Grok): `https://api.x.ai/v1` (default)
- OpenAI: `https://api.openai.com/v1`
- OpenRouter: `https://openrouter.ai/api/v1`
- Groq: `https://api.groq.com/openai/v1`

Example with OpenRouter:
```json
{
  "apiKey": "your_openrouter_key",
  "baseURL": "https://openrouter.ai/api/v1",
  "defaultModel": "anthropic/claude-3.5-sonnet",
  "models": [
    "anthropic/claude-3.5-sonnet",
    "openai/gpt-4o",
    "meta-llama/llama-3.1-70b-instruct"
  ]
}
```
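If you'd rather not edit the settings file, the same provider can also be selected for a one-off run using the standard flags (the key value below is a placeholder):

```bash
grok --api-key your_openrouter_key \
  --base-url https://openrouter.ai/api/v1 \
  --model openai/gpt-4o
```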
## Usage

### Interactive Mode
Start the conversational AI assistant:
```bash
grok
```

Or specify a working directory:
```bash
grok -d /path/to/project
```

### Headless Mode
Process a single prompt and exit (useful for scripting and automation):
```bash
grok --prompt "show me the package.json file"
grok -p "create a new file called example.js with a hello world function"
grok --prompt "run bun test and show me the results" --directory /path/to/project
grok --prompt "complex task" --max-tool-rounds 50  # Limit tool usage for faster execution
```

This mode is particularly useful for:
- CI/CD pipelines: Automate code analysis and file operations
- Scripting: Integrate AI assistance into shell scripts
- Terminal benchmarks: Perfect for tools like Terminal Bench that need non-interactive execution
- Batch processing: Process multiple prompts programmatically (see the sketch below)
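A minimal batch-processing sketch, assuming you want the answers collected into a single report file (the prompts and file names are illustrative):

```bash
#!/usr/bin/env bash
# Run a set of prompts non-interactively and collect the answers.
prompts=(
  "summarize the TODO comments in src/"
  "list any files that are missing a license header"
)

for p in "${prompts[@]}"; do
  grok --prompt "$p" --max-tool-rounds 20 >> grok-report.txt
done
```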
### Tool Execution Rounds
By default, Grok CLI allows up to 400 tool execution rounds to handle complex multi-step tasks. You can control this behavior:
```bash
# Limit tool rounds for faster execution on simple tasks
grok --max-tool-rounds 10 --prompt "show me the current directory"

# Increase limit for very complex tasks (use with caution)
grok --max-tool-rounds 1000 --prompt "comprehensive code refactoring"

# Works with all modes
grok --max-tool-rounds 20                        # Interactive mode
grok git commit-and-push --max-tool-rounds 30    # Git commands
```

Use Cases:
- Fast responses: Lower limits (10-50) for simple queries
- Complex automation: Higher limits (500+) for comprehensive tasks
- Resource control: Prevent runaway executions in automated environments
### Model Selection
You can specify which AI model to use with the `--model` parameter or the `GROK_MODEL` environment variable:

Method 1: Command Line Flag
```bash
# Use Grok models
grok --model grok-code-fast-1
grok --model grok-4-latest
grok --model grok-3-latest
grok --model grok-3-fast

# Use other models (with appropriate API endpoint)
grok --model gemini-2.5-pro --base-url https://api-endpoint.com/v1
grok --model claude-sonnet-4-20250514 --base-url https://api-endpoint.com/v1
```

Method 2: Environment Variable
```bash
export GROK_MODEL=grok-code-fast-1
grok
```

Method 3: User Settings File
Add to `~/.grok/user-settings.json`:
```json
{
  "apiKey": "your_api_key_here",
  "defaultModel": "grok-code-fast-1"
}
```

Model Priority:
`--model` flag > `GROK_MODEL` environment variable > user default model > system default (`grok-code-fast-1`)
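A quick illustration of that order (the session below is hypothetical):

```bash
export GROK_MODEL=grok-3-mini-fast
grok --model grok-4-latest   # flag wins: this session uses grok-4-latest
grok                         # no flag: falls back to GROK_MODEL (grok-3-mini-fast)
```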
### Command Line Options

```bash
grok [options]

Options:
  -V, --version               output the version number
  -d, --directory <dir>       set working directory
  -k, --api-key <key>         Grok API key (or set GROK_API_KEY env var)
  -u, --base-url <url>        Grok API base URL (or set GROK_BASE_URL env var)
  -m, --model <model>         AI model to use (e.g., grok-code-fast-1, grok-4-latest) (or set GROK_MODEL env var)
  -p, --prompt <prompt>       process a single prompt and exit (headless mode)
  --max-tool-rounds <rounds>  maximum number of tool execution rounds (default: 400)
  -h, --help                  display help for command
```
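These flags compose, so a single headless invocation can pin the project directory, model, and tool budget at once (the path and prompt are illustrative):

```bash
grok -d /path/to/project -m grok-4-latest --max-tool-rounds 25 \
  -p "summarize the open TODO comments"
```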
### Custom Instructions
You can provide custom instructions to tailor Grok's behavior to your project or globally. Grok CLI supports both project-level and global custom instructions.
#### Project-Level Instructions
Create a `.grok/GROK.md` file in your project directory to provide instructions specific to that project:

```bash
mkdir .grok
```

Create `.grok/GROK.md` with your project-specific instructions:
```markdown
# Custom Instructions for This Project
Always use TypeScript for any new code files.
When creating React components, use functional components with hooks.
Prefer const assertions and explicit typing over inference where it improves clarity.
Always add JSDoc comments for public functions and interfaces.
Follow the existing code style and patterns in this project.
```

#### Global Instructions
For instructions that apply across all projects, create `~/.grok/GROK.md` in your home directory:

```bash
mkdir -p ~/.grok
```

Create `~/.grok/GROK.md` with your global instructions:
```markdown
# Global Custom Instructions for Grok CLI
Always prioritize code readability and maintainability.
Use descriptive variable names and add comments for complex logic.
Follow best practices for the programming language being used.
When suggesting code changes, consider performance implications.
```

#### Priority Order
Grok will load custom instructions in the following priority order:
1. Project-level (`.grok/GROK.md` in current directory) - takes highest priority
2. Global (`~/.grok/GROK.md` in home directory) - fallback if no project instructions exist

If both files exist, project instructions will be used. If neither exists, Grok operates with its default behavior.
The custom instructions are added to Grok's system prompt and influence its responses across all interactions in the respective context.
## Morph Fast Apply (Optional)
Grok CLI supports Morph's Fast Apply model for high-speed code editing at 4,500+ tokens/sec with 98% accuracy. This is an optional feature that provides lightning-fast file editing capabilities.
Setup: Configure your Morph API key following the setup instructions above.
### How It Works
When `MORPH_API_KEY` is configured:
- `edit_file` tool becomes available alongside the standard `str_replace_editor`
- Optimized for complex edits: Use for multi-line changes, refactoring, and large modifications
- Intelligent editing: Uses abbreviated edit format with `// ... existing code ...` comments
- Fallback support: Standard tools remain available if Morph is unavailable

When to use each tool:
- `edit_file` (Morph): Complex edits, refactoring, multi-line changes
- `str_replace_editor`: Simple text replacements, single-line edits

### Usage Examples
With Morph Fast Apply configured, you can request complex code changes:
```bash
grok --prompt "refactor this function to use async/await and add error handling"
grok -p "convert this class to TypeScript and add proper type annotations"
```

The AI will automatically choose between `edit_file` (Morph) for complex changes and `str_replace_editor` for simple replacements.

## MCP Tools
Grok CLI supports MCP (Model Context Protocol) servers, allowing you to extend the AI assistant with additional tools and capabilities.
### Adding MCP Servers
#### Add a custom MCP server:
```bash
# Add an stdio-based MCP server
grok mcp add my-server --transport stdio --command "bun" --args server.js

# Add an HTTP-based MCP server
grok mcp add my-server --transport http --url "http://localhost:3000"

# Add with environment variables
grok mcp add my-server --transport stdio --command "python" --args "-m" "my_mcp_server" --env "API_KEY=your_key"
```

#### Add from JSON configuration:
```bash
grok mcp add-json my-server '{"command": "bun", "args": ["server.js"], "env": {"API_KEY": "your_key"}}'
```

### Example: Linear Integration
To add Linear MCP tools for project management:
```bash
# Add Linear MCP server
grok mcp add linear --transport sse --url "https://mcp.linear.app/sse"
```

This enables Linear tools like:
- Create and manage Linear issues
- Search and filter issues
- Update issue status and assignees
- Access team and project information (see the example prompt below)
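Once connected, the Linear tools are used through ordinary prompts; for example (the issue details are made up):

```bash
grok --prompt "create a Linear issue titled 'Fix login timeout' and assign it to me"
```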
### Managing MCP Servers
```bash
# List all configured servers
grok mcp list

# Test server connection
grok mcp test server-name

# Remove a server
grok mcp remove server-name
```

### Transport Types
- stdio: Run MCP server as a subprocess (most common)
- http: Connect to HTTP-based MCP server
- sse: Connect via Server-Sent Events
## Development
```bash
# Install dependencies
bun install

# Development mode
bun run dev

# Build project
bun run build

# Run linter
bun run lint

# Type check
bun run typecheck
```

## Architecture

- Agent: Core command processing and execution logic
- Tools: Text editor and bash tool implementations
- UI: Ink-based terminal interface components
- Types: TypeScript definitions for the entire system
## License

MIT