# triagent

AI-powered Kubernetes debugging agent with terminal UI.

## Installation

```bash
npm install triagent
# or
bun install -g triagent
```
## Usage
```bash
# Run the interactive TUI
triagent

# Run the webhook server only
triagent --webhook-only

# Direct incident investigation
triagent --incident "API gateway returning 503 errors"
```

## Execution Modes

Triagent supports three execution modes for running commands:
### Sandbox mode (default)

Commands run inside a Docker container via Bashlet. This provides isolation and safety: the agent cannot accidentally modify your host system.
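Since the sandbox is a Docker container, a reachable local Docker daemon is assumed (this is an inference from the isolation described above, not an explicit requirement in this README). A quick pre-flight check before launching:

```bash
# Pre-flight: confirm the Docker daemon is reachable before starting sandbox mode
docker info > /dev/null 2>&1 && echo "Docker is available" || echo "Docker is not reachable"
```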
```bash
triagent
```

Codebases are mounted at `/workspace/<name>` and your kubeconfig at `/root/.kube`.
### Host mode

Commands run directly on your local machine. Use this when you need access to tools that are not available in the sandbox.

```bash
triagent --host
```

### Remote mode

Commands run on a remote server via SSH. This is useful for connecting to a Docker container or VM with pre-installed debugging tools.
```bash
triagent --remote user@host
triagent -r root@debug-container.local
```

On startup, triagent creates a session workspace on the remote:

```
Running in remote mode: root@debug-container.local
Workspace: /tmp/triagent-a3b4c5d6 (session: a3b4c5d6)
```

Requirements (a quick way to verify them is shown after this list):
- SSH key-based authentication (no password prompts)
- The remote must have the necessary CLI tools (kubectl, etc.)
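One way to check both requirements from your workstation before starting triagent (replace `user@host` with your remote):

```bash
# Key-based auth: BatchMode makes ssh fail instead of prompting for a password
ssh -o BatchMode=yes user@host true && echo "key-based SSH works"

# CLI tools: confirm kubectl is on the remote's PATH
ssh user@host 'command -v kubectl && kubectl version --client'
```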
Example: Debug container setup

```bash
# Run a container with SSH and debugging tools
docker run -d --name debug-tools \
  -p 2222:22 \
  -v ~/.kube:/root/.kube:ro \
  your-debug-image:latest

# Add to ~/.ssh/config for easy access
Host debug-tools
  HostName localhost
  Port 2222
  User root

# Connect triagent to it
triagent --remote debug-tools
```

## Configuration
Configuration can be set via CLI commands or environment variables. CLI config takes precedence over environment variables.
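For example, if both are set for the same option, the CLI-configured value is the one used (model names below are placeholders):

```bash
export AI_MODEL=gpt-4o                                 # environment variable
triagent config set aiModel claude-sonnet-4-20250514   # CLI config takes precedence
```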
### Config commands

```bash
# Set a configuration value
triagent config set <key> <value>

# Get a configuration value
triagent config get <key>

# List all configuration values
triagent config list

# Show the config file path
triagent config path
```

### Configuration keys
| Key | Description | Default |
|-----|-------------|---------|
| `aiProvider` | AI provider (`openai`, `anthropic`, `google`) | `anthropic` |
| `aiModel` | Model ID (e.g., `gpt-4o`, `claude-sonnet-4-20250514`) | Provider default |
| `apiKey` | API key for the provider | - |
| `baseUrl` | Custom API base URL (for proxies or local models) | - |
| `webhookPort` | Webhook server port | `3000` |
| `codebasePath` | Path to single codebase (legacy) | `./` |
| `kubeConfigPath` | Kubernetes config path | `~/.kube` |

### Multiple codebases
For applications spanning multiple repositories, configure `codebasePaths` in `~/.config/triagent/config.json`:

```json
{
  "codebasePaths": [
    { "name": "frontend", "path": "/path/to/frontend-repo" },
    { "name": "backend", "path": "/path/to/backend-repo" },
    { "name": "infra", "path": "/path/to/infrastructure" }
  ]
}
```
Each codebase is mounted at `/workspace/<name>` in the sandbox. The model can access any codebase as needed during investigation.
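With the example config above, the sandbox would see roughly this layout:

```
/workspace/
├── frontend   # /path/to/frontend-repo
├── backend    # /path/to/backend-repo
└── infra      # /path/to/infrastructure
```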
### Custom instructions

Create `~/.config/triagent/TRIAGENT.md` to provide custom instructions to the model. These instructions are prepended to the default system prompt.

Example `TRIAGENT.md`:

```markdown
# Project Context

This is a microservices e-commerce platform with the following services:
- frontend: Next.js app in /workspace/frontend
- api: Go backend in /workspace/backend
- infra: Terraform configs in /workspace/infra

# Investigation Priorities

1. Always check the api service logs first for 5xx errors
2. The frontend service talks to api via internal DNS: api.default.svc.cluster.local
3. Common issues: Redis connection timeouts, PostgreSQL connection pool exhaustion
```

### Environment variables
| Variable | Description |
|----------|-------------|
| `AI_PROVIDER` | AI provider (`openai`, `anthropic`, `google`) |
| `AI_MODEL` | Model ID |
| `AI_BASE_URL` | Custom API base URL |
| `OPENAI_API_KEY` | OpenAI API key |
| `ANTHROPIC_API_KEY` | Anthropic API key |
| `GOOGLE_GENERATIVE_AI_API_KEY` | Google AI API key |
| `WEBHOOK_PORT` | Webhook server port |
| `CODEBASE_PATH` | Path to codebase |
| `KUBE_CONFIG_PATH` | Kubernetes config path |
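For example, the same settings can come from the environment instead of the CLI (values are placeholders):

```bash
export AI_PROVIDER=openai
export OPENAI_API_KEY=sk-proj-...
export WEBHOOK_PORT=8080
triagent --webhook-only
```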
### Examples

```bash
# Configure with Anthropic (default)
triagent config set apiKey sk-ant-...

# Configure with OpenAI
triagent config set aiProvider openai
triagent config set apiKey sk-proj-...

# Use a custom API endpoint (e.g., a proxy or local model)
triagent config set baseUrl https://your-proxy.example.com/v1
```

## Development
```bash
# Install dependencies
bun install

# Run in development mode
bun run dev

# Build
bun run build

# Type check
bun run typecheck
```

## License

MIT