Autonomous Local AI Agent. Vibe Coding v1.0.
```
npm install indirecttek-vibe-engine
```

> "Autonomous Local AI Agent. Maximum Badassery."
IndirectTek Vibe Engine turns your VS Code into an autonomous development powerhouse. It writes code, creates files, and executes commands—running entirely on your infrastructure.
---
Prerequisites:
- Ollama installed and running (`http://localhost:11434`).
- Model: `qwen2.5-coder:14b` (recommended). Pull it via `ollama pull qwen2.5-coder:14b`.
_Note: The agent connects to http://localhost:11434 by default._ If your Ollama server runs on another machine (e.g. 192.168...), change `partnerBot.ollamaUrl` to that server's IP; set it back to `http://localhost:11434` if running locally.
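For example, a remote setup might look like this in `settings.json` (the host address below is a placeholder for your own server, not a real default):

```json
{
  // Point the agent at a remote Ollama instance (placeholder IP).
  "partnerBot.ollamaUrl": "http://192.168.1.50:11434"
}
```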
---
For Telemetry, Dashboards, and Enterprise Features, run the standalone Vibe Controller:
1. Run: `npx indirecttek-vibe-engine@latest`
2. Copy the Token.
3. In VS Code Settings:
- Set Use Controller to true.
- Paste the Controller Token.
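In `settings.json` form, step 3 might look like the snippet below. The key names `partnerBot.useController` and `partnerBot.controllerToken` are assumptions inferred from the setting labels above; match them to what the Settings UI actually shows.

```json
{
  // Hypothetical keys; verify against the labels in the Settings UI.
  "partnerBot.useController": true,
  "partnerBot.controllerToken": "<paste the token from step 2>"
}
```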
---
Security & Tools:
- `run_command` requires manual approval (Always / Once / Deny).
- `generate_image` triggers an approval prompt and requires the Fooocus API.
- `create_file`, `edit_file`, and `delete_file` check permissions.
- Read-only tools (e.g. `LIST_DIR`) let the agent understand your project structure instantly.

Configure models and command policy in `settings.json`:

```json
{
  "partnerBot.models.fast": "qwen2.5:7b",           // Chat, Explanations
  "partnerBot.models.default": "qwen2.5-coder:14b", // Refactoring, Editing
  "partnerBot.models.deep": "qwen2.5:32b",          // Planning, Complex Logic
  "partnerBot.allowedCommands": ["npm install", "cd", "ls", "grep"], // Whitelist
  "partnerBot.allowDangerousCommands": false, // Override hard-deny list (rm/sudo/etc)
  "partnerBot.toolEnforcement": "soft"        // off | soft | strict
}
```
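A minimal sketch of how a whitelist plus hard-deny gate like the one configured by `partnerBot.allowedCommands` and `partnerBot.allowDangerousCommands` could work. The function name, deny list, and verdict values here are illustrative assumptions, not the extension's actual implementation:

```typescript
// Illustrative command gate: the hard-deny list wins unless explicitly
// overridden, then the whitelist decides; everything else needs approval.
// HARD_DENY contents are an assumption for demonstration.
const HARD_DENY = ["rm", "sudo", "mkfs", "shutdown"];

type Verdict = "allow" | "ask" | "deny";

function gateCommand(
  command: string,
  allowedCommands: string[],
  allowDangerousCommands: boolean
): Verdict {
  const first = command.trim().split(/\s+/)[0];
  if (HARD_DENY.includes(first) && !allowDangerousCommands) return "deny";
  // Whitelist entries may be multi-word prefixes like "npm install".
  if (allowedCommands.some((a) => command.trim().startsWith(a))) return "allow";
  return "ask"; // falls back to manual approval (Always / Once / Deny)
}
```

With the default whitelist above, `npm install lodash` would auto-run, `rm -rf /` would be hard-denied, and anything unlisted would prompt for approval.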
The agent will automatically route "Fix this" requests to the Default (Coder) model and casual chat to the Fast model.
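The routing above can be sketched roughly as follows. The keyword heuristic and function name are illustrative assumptions; the extension's real routing logic is internal:

```typescript
// Illustrative model router: picks a settings tier from the request text.
// The keyword heuristic is an assumption for demonstration only.
type ModelTier = "fast" | "default" | "deep";

function routeRequest(request: string): ModelTier {
  const text = request.toLowerCase();
  // Planning and architectural work goes to the deep model.
  if (/\b(plan|architect|design)\b/.test(text)) return "deep";
  // Code-editing verbs ("fix this", "implement") use the coder model.
  if (/\b(fix|edit|implement|debug|refactor)\b/.test(text)) return "default";
  // Casual chat and explanations stay on the fast model.
  return "fast";
}
```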
Troubleshooting:
- Connection Refused: Check that your `partnerBot.ollamaUrl` is correct. If using WSL or Docker, use the host IP.
---
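A quick way to confirm the server is reachable (assumes `curl` is installed; `/api/tags` is Ollama's model-listing endpoint, so a JSON response means the connection works):

```shell
# Should return a JSON list of installed models if Ollama is up.
curl http://localhost:11434/api/tags
```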