CLI tool to configure custom models for Factory CLI BYOK

```bash
npm install byok-cli
```

```
╔═══════════════════════════════════════════════════════════════╗
║ ║
║ ██████╗ ██╗ ██╗ ██████╗ ██╗ ██╗ ██████╗██╗ ██╗ ║
║ ██╔══██╗╚██╗ ██╔╝██╔═══██╗██║ ██╔╝ ██╔════╝██║ ██║ ║
║ ██████╔╝ ╚████╔╝ ██║ ██║█████╔╝ ██║ ██║ ██║ ║
║ ██╔══██╗ ╚██╔╝ ██║ ██║██╔═██╗ ██║ ██║ ██║ ║
║ ██████╔╝ ██║ ╚██████╔╝██║ ██╗ ╚██████╗███████╗██║ ║
║ ╚═════╝ ╚═╝ ╚═════╝ ╚═╝ ╚═╝ ╚═════╝╚══════╝╚═╝ ║
║ ║
║ Factory CLI - Custom Model Configuration ║
║ ║
╚═══════════════════════════════════════════════════════════════╝
```

A beautiful interactive CLI tool to configure custom models for Factory CLI BYOK (Bring Your Own Key).

Features:
- 13+ Built-in Providers - OpenRouter, DeepInfra, Fireworks, Groq, Ollama, Google Gemini, Hugging Face, Baseten, Anthropic, OpenAI, and more
- OpenAI & Anthropic Compatible - Add any OpenAI-compatible or Anthropic-compatible API endpoint
- Saved Providers - Save and manage providers with API keys for quick access
- Live Model Search - Type to filter models with real-time search
- Multi-select Models - Add multiple models at once with Ctrl+A to select all
- Smart Name Detection - Fetches display names from models.dev API
- URL Validation - Ensures OpenAI-compatible URLs end with /v1
- Step-by-Step Wizard - Beautiful pink-themed UI with progress indicator
- Go Back Navigation - Press ESC to navigate back at any step
First, install Factory CLI:
```bash
curl -fsSL https://app.factory.ai/cli | sh
```
Then install BYOK CLI:
```bash
npm install -g byok-cli
```
Then run it:

```bash
byok-cli
```
Or run from source:

```bash
git clone https://github.com/Kartvya69/BYOK-CLI.git
cd BYOK-CLI
npm install
npm start
```
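
If `byok-cli` isn't found after a global install, a couple of standard npm and shell checks (generic tooling, not byok-cli commands) will confirm where it landed:

```bash
# Generic post-install sanity checks (standard npm / shell tooling).
npm ls -g byok-cli     # confirm the package is installed globally
command -v byok-cli    # confirm the executable is on your PATH
```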
Keyboard shortcuts:

| Key | Action |
|-----|--------|
| Enter | Continue to next step |
| ESC | Go back to previous step |
| Space | Toggle model selection |
| Ctrl+A | Select all visible models |
| Ctrl+D | Deselect all models |
| Ctrl+K | Clear search |
| ↑↓ | Navigate list |
| ←→ | Select Yes/No in confirmation |
| J | Toggle JSON config preview |
Supported providers:

| Provider | Type | Base URL |
|----------|------|----------|
| OpenAI Compatible | generic-chat-completion-api | Custom URL (must end with /v1) |
| Anthropic Compatible | anthropic | Custom URL |
| OpenRouter | generic-chat-completion-api | https://openrouter.ai/api/v1 |
| DeepInfra | generic-chat-completion-api | https://api.deepinfra.com/v1/openai |
| Fireworks AI | generic-chat-completion-api | https://api.fireworks.ai/inference/v1 |
| Groq | generic-chat-completion-api | https://api.groq.com/openai/v1 |
| Ollama (Local) | generic-chat-completion-api | http://localhost:11434/v1 |
| Google Gemini | generic-chat-completion-api | https://generativelanguage.googleapis.com/v1beta/ |
| Hugging Face | generic-chat-completion-api | https://router.huggingface.co/v1 |
| Baseten | generic-chat-completion-api | https://inference.baseten.co/v1 |
| Anthropic | anthropic | https://api.anthropic.com |
| OpenAI | openai | https://api.openai.com/v1 |
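
Before adding an OpenAI-compatible provider, it can help to confirm that the base URL and key actually respond. Most OpenAI-compatible APIs (including several listed above) serve a standard `GET /models` route under the same base URL; the sketch below uses placeholder values, so substitute your own provider's URL and key:

```bash
# Rough check of an OpenAI-compatible endpoint before adding it in byok-cli.
# BASE_URL must end with /v1; API_KEY is a placeholder, not a real key.
BASE_URL="https://api.groq.com/openai/v1"
API_KEY="sk-your-key-here"

# Most OpenAI-compatible APIs list their available models at GET $BASE_URL/models.
curl -s -H "Authorization: Bearer $API_KEY" "$BASE_URL/models"
```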
Configuration files:

| File | Purpose |
|------|---------|
| ~/.factory/settings.json | Factory CLI settings (models for Droid) |
| ~/.byok-cli/providers.json | Saved providers with API keys |
| ~/.byok-cli/models.json | Tracked models with full configuration |
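
All three files are plain JSON, so they can be inspected or backed up with ordinary tools; `jq` here is optional and not something byok-cli itself requires:

```bash
# Inspect (or back up) the JSON files byok-cli reads and writes.
jq . ~/.factory/settings.json     # models Factory CLI (Droid) will see
jq . ~/.byok-cli/providers.json   # saved providers and API keys
jq . ~/.byok-cli/models.json      # tracked model configurations

# Plain cat works too; cp makes a quick backup before re-running the wizard.
cp ~/.factory/settings.json ~/.factory/settings.json.bak
```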
Pick a provider from the list (saved providers appear at the top):

```
💖 Xreatlabs (saved)
───────────────
⭐ OpenAI Compatible (Custom URL)
⭐ Anthropic Compatible (Custom URL)
⭐ OpenRouter
...
```
Search and multi-select models with keyboard shortcuts:
```
🔍 Type to search models...
3 of 150 selected (showing 10 of 150)

☑️ Claude Sonnet 4.5 (claude-sonnet-4-5-20250929)
☑️ GPT-4 Turbo (gpt-4-turbo)
☐ Gemini Pro (gemini-pro)
...
Space: toggle • Ctrl+A: select all • Ctrl+D: deselect
```
Review before saving with JSON preview option:
```
📋 Configuration Summary:
╭─────────────────────────────────────────╮
│ 🏢 Provider: Xreatlabs │
│ 🔗 Base URL: https://api.xreatlabs.space/v1 │
│ 🤖 Models: Claude Sonnet 4.5, GPT-4 │
│ ⚙️ Max Tokens: 16384 │
│ ⚙️ Images: Yes │
╰─────────────────────────────────────────╯
```

Changelog:
- New: Saved providers now appear in dropdown for quick selection
- New: URL validation requires /v1 suffix for OpenAI-compatible APIs
- New: Navigation hints with arrow indicators
- New: Manual model entry now fetches display names from models.dev
- Fixed: Models now properly save to ~/.byok-cli/models.json

License: MIT