Per-provider proxy configuration for OpenCode - Route different AI providers through different proxies
```bash
npm install opencode-proxy
```

Example `~/.config/opencode/proxy.json`:

```json
{
  "google": "http://127.0.0.1:20171",
  "openai": "socks5://127.0.0.1:1080"
}
```
Unconfigured providers connect directly; no explicit list of direct providers is needed.
---
Option A: Let an LLM do it
Paste this into any LLM agent:
```
Install and configure opencode-proxy by following: https://raw.githubusercontent.com/Chongyao/opencode-proxy/main/README.md
```
Option B: Manual
```bash
npm install opencode-proxy
```

Add to `~/.config/opencode/opencode.json`:

```json
{
  "plugin": ["opencode-proxy"]
}
```
Create `~/.config/opencode/proxy.json`:

```json
{
  "google": "http://127.0.0.1:20171"
}
```
Done.
---

Fetch this README and follow the configuration section. Key points:
- Install the `opencode-proxy` npm package
- Add `"opencode-proxy"` to the `plugin` array in `~/.config/opencode/opencode.json`
- Create `~/.config/opencode/proxy.json` with a provider-to-proxy URL mapping
- Unconfigured providers connect directly (no explicit list needed)
---
File: `~/.config/opencode/proxy.json`

```json
{
  "debug": true,
  "google": "http://127.0.0.1:20171",
  "anthropic": "socks5://user:pass@127.0.0.1:1080",
  "openai": "https://proxy.example.com:8443"
}
```
| Key | Type | Description |
|-----|------|-------------|
| `debug` | boolean | Enable debug logging |
| `<provider>` | string | Proxy URL: `protocol://[user:pass@]host:port` |
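Proxy URLs follow standard URL syntax, so the pieces of `protocol://[user:pass@]host:port` map directly onto what a URL parser reports. A quick illustration (not taken from the plugin's source):

```ts
// How a proxy URL string decomposes with the WHATWG URL parser (illustration only).
const proxy = new URL("socks5://user:pass@127.0.0.1:1080");
proxy.protocol; // "socks5:"   -> scheme: http, https, socks4, or socks5
proxy.username; // "user"      -> optional credentials
proxy.password; // "pass"
proxy.hostname; // "127.0.0.1" -> proxy host
proxy.port;     // "1080"      -> proxy port
```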
Supported proxy protocols:

- `http://host:port`
- `https://host:port`
- `socks5://host:port`
- `socks4://host:port`

Supported providers:

google, anthropic, openai, azure, amazon-bedrock, moonshot, kimi, deepseek, groq, mistral, cohere, together, perplexity, openrouter, github-copilot, xai, cerebras, fireworks
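Put together, the file is one optional `debug` flag plus provider-name keys mapping to proxy URL strings. A rough TypeScript description of that shape, as an illustration rather than the plugin's actual types:

```ts
// Illustrative shape of proxy.json; not taken from the plugin's source.
type ProxyJson = { debug?: boolean } & {
  [provider: string]: string | boolean | undefined; // provider name -> proxy URL
};

const example: ProxyJson = {
  debug: true,
  google: "http://127.0.0.1:20171",
  anthropic: "socks5://user:pass@127.0.0.1:1080",
};
```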
---
Single provider:
```json
{
  "google": "http://127.0.0.1:20171"
}
```
With auth:
```json
{
  "google": "http://user:pass@proxy.com:8080"
}
```
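If the username or password contains reserved characters such as `@` or `:`, they generally need to be percent-encoded before being placed in the URL. A small sketch of that encoding step, with illustrative values:

```ts
// Percent-encode credentials before embedding them in a proxy URL (illustrative).
const user = encodeURIComponent("alice@example.com"); // "alice%40example.com"
const pass = encodeURIComponent("p@ss:word");         // "p%40ss%3Aword"
const proxyUrl = `http://${user}:${pass}@proxy.com:8080`;
```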
Debug mode:
```json
{
  "debug": true,
  "google": "http://127.0.0.1:20171"
}
```
---
The plugin patches `fetch` to route configured providers through their specified proxy. Unconfigured providers connect directly.
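For intuition, here is a minimal sketch of that approach rather than the plugin's actual source. It assumes Node 18+, where the global `fetch` is undici-based and accepts an undici `dispatcher` option, uses a hypothetical provider-to-hostname map, and omits SOCKS handling, which requires a different agent.

```ts
// Sketch of the fetch-patching idea (illustration only, not the plugin's source).
import { ProxyAgent } from "undici";

// Hypothetical provider -> API hostname map, for illustration only.
const PROVIDER_HOSTS: Record<string, string> = {
  google: "generativelanguage.googleapis.com",
  openai: "api.openai.com",
};

type ProxyConfig = Record<string, string>; // provider name -> proxy URL

export function patchFetch(config: ProxyConfig): void {
  const originalFetch = globalThis.fetch;
  const agents = new Map<string, ProxyAgent>(); // one agent per proxy URL

  globalThis.fetch = ((input: RequestInfo | URL, init?: RequestInit) => {
    const url = new URL(
      typeof input === "string" ? input : input instanceof URL ? input.href : input.url,
    );

    // Which configured provider, if any, does this request target?
    const provider = Object.keys(config).find(
      (name) => PROVIDER_HOSTS[name] === url.hostname,
    );
    if (!provider) return originalFetch(input, init); // unconfigured: direct connection

    let agent = agents.get(config[provider]);
    if (!agent) {
      agent = new ProxyAgent(config[provider]);
      agents.set(config[provider], agent);
    }
    // `dispatcher` is an undici-specific extension to the fetch init object.
    return originalFetch(input, { ...init, dispatcher: agent } as RequestInit);
  }) as typeof fetch;
}
```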
---
MIT