# Chutes Models Plugin for OpenCode

Access 58+ state-of-the-art AI models through the Chutes API.

> An OpenCode plugin that integrates Chutes AI models with dynamic synchronization and seamless OpenCode integration.
## Features

- Dynamic Model Sync: Automatically fetches the latest models from https://llm.chutes.ai/v1/models
- 58+ Models: Access reasoning, coding, vision, and general-purpose models
- Conflict-Free Naming: All models are prefixed with `chutes/` to avoid conflicts
- Intelligent Caching: Model metadata is cached for performance (1-hour TTL)
- Comprehensive Tools: `chutes_list_models`, `chutes_refresh_models`, `chutes_status`
- CLI Support: Install and manage the plugin with `bunx opencode-chutes`
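The 1-hour TTL cache can be sketched roughly as follows. This is an illustrative sketch only; `ModelCache` and its members are hypothetical names, not the plugin's actual API:

```typescript
// Illustrative TTL-based cache, assuming a single cached model list.
interface CachedModels<T> {
  fetchedAt: number; // epoch ms when the data was stored
  data: T;
}

class ModelCache<T> {
  private entry: CachedModels<T> | null = null;

  constructor(private ttlMs: number = 60 * 60 * 1000) {} // 1-hour default TTL

  // Returns the cached data if it is still fresh, otherwise null.
  get(now: number = Date.now()): T | null {
    if (this.entry && now - this.entry.fetchedAt < this.ttlMs) {
      return this.entry.data;
    }
    return null;
  }

  set(data: T, now: number = Date.now()): void {
    this.entry = { fetchedAt: now, data };
  }
}
```

A `get()` miss (expired or empty cache) is the signal to re-fetch from the models endpoint.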
## Available Models

The plugin provides access to models including:
- `chutes/deepseek-ai/DeepSeek-R1` - Advanced reasoning model
- `chutes/deepseek-ai/DeepSeek-R1-0528-TEE` - Confidential compute reasoning
- `chutes/Qwen/Qwen3-235B-A22B-Thinking-2507` - Large-scale thinking model
- `chutes/Qwen/Qwen2.5-Coder-32B-Instruct` - Specialized code generation
- `chutes/mistralai/Devstral-2-123B-Instruct-2512` - Devstral coding model
- `chutes/Qwen/Qwen3-Coder-480B-A35B-Instruct-FP8-TEE` - Large code model
- `chutes/Qwen/Qwen3-VL-235B-A22B-Instruct` - Vision-language model
- `chutes/unsloth/gemma-3-27b-it` - Multimodal Gemma
- `chutes/OpenGVLab/InternVL3-78B-TEE` - InternVL vision model
- `chutes/Qwen/Qwen3-32B` - Balanced general model
- `chutes/deepseek-ai/DeepSeek-V3` - High-performance general model
- `chutes/NousResearch/Hermes-4-405B-FP8-TEE` - Large general model
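The conflict-free naming is just a `chutes/` namespace on top of the upstream model id. A minimal sketch of the mapping (`toPluginId` and `toUpstreamId` are hypothetical helpers, not part of the plugin's exported API):

```typescript
// Hypothetical helpers illustrating the chutes/ prefixing convention.
const PREFIX = "chutes/";

// Namespace an upstream id, avoiding double-prefixing.
function toPluginId(upstreamId: string): string {
  return upstreamId.startsWith(PREFIX) ? upstreamId : PREFIX + upstreamId;
}

// Strip the namespace before calling the Chutes API.
function toUpstreamId(pluginId: string): string {
  return pluginId.startsWith(PREFIX) ? pluginId.slice(PREFIX.length) : pluginId;
}
```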
## Installation

### Quick Install (Recommended)

```bash
bunx opencode-chutes@latest install
```

This installs the plugin to your project (`.opencode/plugin/`) or globally, based on your preference.
### Manual Installation

Copy the bundled plugin to your OpenCode plugin directory:

```bash
# Project-level
cp dist/bundle.js /path/to/your/project/.opencode/plugin/opencode-chutes.js
```

### Install from npm
```bash
npm install opencode-chutes
```

Then add to your `opencode.json`:

```json
{
"plugin": ["opencode-chutes"]
}
```

## Configuration
### Plugin Configuration

Create or edit `opencode.json` in your project:

```json
{
"plugin": ["opencode-chutes"],
"provider": {
"chutes": {
"options": {
"apiKey": "your-chutes-api-token"
}
}
}
}
```

### API Token Setup
Use OpenCode's built-in `/connect` command to securely store your Chutes API token:

```
/connect chutes
```

Follow the prompts to enter your Chutes API token. The token will be securely stored in `~/.local/share/opencode/auth.json`.

Alternatively, you can create the auth file manually:

```json
{
"chutes": {
"type": "api",
"key": "your-chutes-api-token-here"
}
}
```

You can get your API token from chutes.ai.
## Usage

### Tools

#### List Available Models
```typescript
// List all models
await chutes_list_models({});

// Filter by provider
await chutes_list_models({
owned_by: 'Qwen',
});
// Filter by feature
await chutes_list_models({
feature: 'reasoning',
});
// Filter by name
await chutes_list_models({
filter: 'DeepSeek',
});
// Show pricing
await chutes_list_models({
show_pricing: true,
});
```

#### Refresh Model List

```typescript
// Refresh from API
await chutes_refresh_models({});

// Force refresh even if cache is valid
await chutes_refresh_models({
force: true,
});
```

#### Check Plugin Status

```typescript
// Check cache status and model count
await chutes_status();
```

### CLI

The plugin includes a CLI for management:

```bash
# Install the plugin
bunx opencode-chutes install

# Check plugin status
bunx opencode-chutes status

# List available models
bunx opencode-chutes list

# Refresh model cache
bunx opencode-chutes refresh

# Health check
bunx opencode-chutes doctor
```

## Model Pricing
Models are priced per 1 million tokens. Example pricing:
| Model                            | Input ($/1M) | Output ($/1M) |
| -------------------------------- | ------------ | ------------- |
| `chutes/Qwen/Qwen3-32B`          | $0.08        | $0.24         |
| `chutes/deepseek-ai/DeepSeek-R1` | $0.30        | $1.20         |
| `chutes/unsloth/gemma-3-4b-it`   | $0.01        | $0.03         |

Full pricing is available using the `chutes_list_models` tool.
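Per-1M-token rates translate to request cost in a straightforward way. A small sketch (`estimateCost` is a hypothetical helper; the sample rates are the Qwen3-32B figures from the pricing table):

```typescript
// Hypothetical cost estimator for per-1M-token pricing.
function estimateCost(
  inputTokens: number,
  outputTokens: number,
  inputPerM: number, // $ per 1M input tokens
  outputPerM: number, // $ per 1M output tokens
): number {
  return (inputTokens / 1_000_000) * inputPerM + (outputTokens / 1_000_000) * outputPerM;
}

// e.g. 50k input + 10k output tokens at $0.08 / $0.24 per 1M:
// 0.05 * 0.08 + 0.01 * 0.24 ≈ $0.0064
```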
## Architecture

```
src/
├── index.ts # Main plugin entry point (ChutesPlugin)
├── cli/ # CLI commands
│ ├── index.ts # CAC CLI entry point
│ ├── install.ts # install command
│ ├── status.ts # status command
│ ├── list.ts # list models command
│ ├── refresh.ts # refresh cache command
│ └── doctor.ts # health check command
├── models/
│ ├── index.ts # ModelFetcher exports
│ ├── types.ts # TypeScript types
│ ├── fetcher.ts # Model fetching with retry
│ ├── cache.ts # Model caching (TTL-based)
│ └── registry.ts # Model registry
├── tools/
│ └── index.ts # Plugin tools (3 tools)
└── config/
├── index.ts # Config exports
├── schema.ts # Config validation
└── auth.ts # Auth file reading
```

## Development

### Building
```bash
# Build for npm publishing (compiled, not bundled)
bun run build

# Build CLI
bun run build:cli

# Build bundle for local installation
bun run build:bundle

# Build everything
bun run build && bun run build:cli && bun run build:bundle
```

### Testing
```bash
bun test
```

### Linting
```bash
bun run lint
```

### Formatting
```bash
bun run format
```

## Publishing
### Version Bump

```bash
# Bump version (patch, minor, or major)
npm version patch # 0.1.0 -> 0.1.1
npm version minor # 0.1.0 -> 0.2.0
npm version major # 0.1.0 -> 1.0.0
```

### Publish to npm
```bash
# Build first
bun run build && bun run build:cli

# Login to npm (first time only)
npm login

# Publish the package
npm publish
```

### Beta Releases
```bash
# Create a beta release
npm version prerelease --preid=beta
npm publish --tag beta
```

## Troubleshooting
### Authentication errors

Ensure you've connected your Chutes API token using the `/connect chutes` command, or created `~/.local/share/opencode/auth.json` with your token.

### Model not found

Use `chutes_list_models` to see available models. Model IDs must be prefixed with `chutes/`.

### Rate limiting
Wait a moment and retry. Consider reducing request frequency.
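A client-side backoff wrapper is one way to reduce request frequency automatically. The sketch below is illustrative only; `fetchWithRetry` and its parameters are hypothetical, not the plugin's actual fetcher:

```typescript
// Illustrative retry wrapper with exponential backoff.
async function fetchWithRetry<T>(
  fn: () => Promise<T>,
  retries: number = 3,
  baseDelayMs: number = 250,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === retries) break;
      // Exponential backoff: 250ms, 500ms, 1000ms, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```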
### Models not loading
1. Check your API token is valid
2. Run `chutes_refresh_models({ force: true })`
3. Check plugin status with `chutes_status`

## Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Run tests and linting
5. Submit a pull request
## License
MIT License. See the LICENSE file for details.
## Author
Mark182
## Repository
https://github.com/mark182es/opencode-chutes
## Chutes API

- Models API: https://llm.chutes.ai/v1/models