MCP server for Hivemind - Multi-model AI consensus for Claude Code
```bash
npm install @quantulabs/hivemind
```

Multi-model AI consensus platform that queries GPT-5.2, Claude Opus 4.5, and Gemini 3 Pro simultaneously to deliver synthesized, high-confidence responses.
---
Use Hivemind directly in Claude Code to get perspectives from GPT-5.2 and Gemini 3 Pro. Claude acts as the orchestrator and synthesizes the responses.
- Node.js >= 18
- Claude Code CLI installed
- At least one API key: OpenAI or Google AI
> Note: No Anthropic API key needed - Claude is already your host!
```bash
npm install -g @quantulabs/hivemind
claude mcp add hivemind -- hivemind
```
You need at least one API key, but both are recommended for better consensus:
- OpenAI (GPT-5.2)
- Google AI (Gemini 3 Pro)
Option 1: Paste directly (recommended)
```bash
/hive-config sk-proj-xxx...   # OpenAI key
/hive-config AIzaSy...        # Google key
```
Option 2: Config file
Create ~/.config/hivemind/.env:
```bash
OPENAI_API_KEY=sk-...
GOOGLE_API_KEY=AIza...
```
> A .env.example template is included in the package.
Using with other MCP clients (non-Claude Code)
For standalone MCP usage, you can also add an Anthropic key to include Claude in the consensus:
```bash
ANTHROPIC_API_KEY=sk-ant-...
```
Disable Claude Code mode via /hive-config > Settings > Claude Code Mode.
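For standalone use, any MCP client can spawn the server and call its tools directly. The TypeScript sketch below uses the official @modelcontextprotocol/sdk; the `prompt` argument name is an assumption and may not match the server's actual input schema.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the globally installed hivemind binary as a stdio MCP server.
const transport = new StdioClientTransport({ command: "hivemind" });
const client = new Client({ name: "hivemind-example", version: "1.0.0" });
await client.connect(transport);

// Discover the tools the server exposes (hivemind, configure_keys, check_status, ...).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Ask for a multi-model consensus answer.
// NOTE: the argument key ("prompt") is an assumption, not a documented schema.
const result = await client.callTool({
  name: "hivemind",
  arguments: { prompt: "Why is my WebSocket connection dropping?" },
});
console.log(result.content);
```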
```bash
/hive "Why is my WebSocket connection dropping?"
```
Claude orchestrates the consensus from GPT-5.2 and Gemini 3 Pro responses.
| Tool | Description |
|------|-------------|
| hivemind | Query models and get synthesized consensus |
| configure_keys | Set API keys (stored securely) |
| check_status | Check configuration and active providers |
| configure_hive | Toggle grounding search and settings |
| check_stats | View token usage and cost statistics |
- /hive - Orchestrate multi-model consensus with Claude as the synthesizer
- /hive-config - Configure API keys and settings
- /hivestats - View usage statistics
Copy CLAUDE.md.example to your project's .claude/CLAUDE.md to enable automatic Hivemind consultation when Claude is stuck (after 3+ failed attempts).
All providers use optimized caching for cost reduction on follow-up queries:
| Provider | Type | Savings | Min Tokens |
|----------|------|---------|------------|
| OpenAI | Automatic | 50% | 1024 |
| Gemini 2.5+ | Implicit | 90% | - |
| Anthropic | Explicit | 90% | 1024 |
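For example (illustrative numbers only): a follow-up OpenAI query that reuses a 2,000-token cached prompt prefix is billed at roughly half the normal input rate for those cached tokens, while a comparable Anthropic or Gemini cache hit reduces the cached portion's cost by about 90%.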
---
A full-featured web app with solo mode, hivemind mode, and conversation history.
```bash
# Clone the repository
git clone https://github.com/QuantuLabs/hivemind.git
cd hivemind
```
Install dependencies and start the dev server, then open http://localhost:3000, click the settings icon, and enter your API keys.
Features
- Multi-Model Consensus: Query 3 leading AI models simultaneously
- Deliberation Algorithm: Up to 3 rounds of refinement to reach consensus
- Solo Mode: Chat with individual models (GPT, Claude, Gemini)
- Hivemind Mode: Get synthesized responses from all models
- Conversation History: Persistent chat sessions
- Dark/Light Theme: Full theme support
- Secure Storage: API keys encrypted with AES-GCM in browser
Security
- API keys are encrypted using AES-GCM with PBKDF2 key derivation
- Keys are stored locally in browser localStorage (never sent to servers)
- Session persistence uses sessionStorage (cleared on browser close)
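The browser-side key handling can be pictured with the standard Web Crypto API. The sketch below is illustrative only; the iteration count, salt/IV sizes, and storage format are assumptions rather than Hivemind's actual parameters.

```typescript
// Illustrative sketch: encrypt an API key with AES-GCM using a PBKDF2-derived key.
// Parameter choices here are assumptions, not Hivemind's actual values.
async function encryptApiKey(apiKey: string, passphrase: string) {
  const enc = new TextEncoder();

  // Derive a 256-bit AES-GCM key from the passphrase with PBKDF2.
  const salt = crypto.getRandomValues(new Uint8Array(16));
  const baseKey = await crypto.subtle.importKey(
    "raw", enc.encode(passphrase), "PBKDF2", false, ["deriveKey"]
  );
  const aesKey = await crypto.subtle.deriveKey(
    { name: "PBKDF2", salt, iterations: 100_000, hash: "SHA-256" },
    baseKey,
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt", "decrypt"]
  );

  // Encrypt with a fresh random 96-bit IV; salt, IV, and ciphertext are what
  // would be persisted (e.g. base64-encoded) in localStorage.
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv }, aesKey, enc.encode(apiKey)
  );
  return { salt, iv, ciphertext: new Uint8Array(ciphertext) };
}
```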
---
How Consensus Works
1. Initial Query: All 3 models receive the same question
2. Analysis: An orchestrator analyzes responses for agreements/divergences
3. Refinement: If no consensus, models see other perspectives and refine (up to 3 rounds)
4. Synthesis: Final response synthesizes agreed points and addresses divergences
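In code, the loop might look like the following sketch. The Provider interface and prompt wording are hypothetical stand-ins for Hivemind's internals; in Claude Code mode, Claude itself plays the orchestrator role.

```typescript
// Hypothetical sketch of the deliberation loop; not Hivemind's actual implementation.
interface Provider {
  name: string;
  ask(prompt: string): Promise<string>;
}

async function runConsensus(
  question: string,
  models: Provider[],
  orchestrator: Provider,
  maxRounds = 3
): Promise<string> {
  // Round 1: every model answers the same question independently.
  let answers = await Promise.all(models.map((m) => m.ask(question)));

  for (let round = 2; round <= maxRounds; round++) {
    // The orchestrator checks the answers for agreements and divergences.
    const verdict = await orchestrator.ask(
      `Question: ${question}\n\nAnswers:\n${answers.join("\n---\n")}\n\n` +
      `Do these agree on the key points? Reply CONSENSUS, or list the divergences.`
    );
    if (verdict.includes("CONSENSUS")) break;

    // Refinement: each model sees the other perspectives and revises its answer.
    answers = await Promise.all(
      models.map((m) =>
        m.ask(
          `${question}\n\nOther models answered:\n${answers.join("\n---\n")}\n\n` +
          `Noted divergences:\n${verdict}\n\nRefine your answer.`
        )
      )
    );
  }

  // Synthesis: merge agreed points and address any remaining divergences.
  return orchestrator.ask(
    `Synthesize one answer to "${question}" from these responses, flagging any ` +
    `remaining disagreements:\n${answers.join("\n---\n")}`
  );
}
```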
Supported Models
OpenAI
- GPT-5.2 (default)
- GPT-5.1, GPT-5, GPT-5 Mini, GPT-5 Nano
- O4 Mini
Anthropic
- Claude Opus 4.5 (default)
- Claude Sonnet 4.5, Claude Opus 4, Claude Sonnet 4
Google
- Gemini 3 Pro (default)
- Gemini 3 Flash, Gemini 2.5 Pro/Flash/Flash Lite, Gemini 2.0 Flash
Project Structure
```
hivemind/
├── apps/
│   └── web/          # Next.js 14 frontend
├── packages/
│   ├── core/         # Shared consensus logic & providers
│   └── mcp/          # Model Context Protocol server
└── .claude/          # Claude Code integration
```
Development
```bash
# Run all tests
bun test

# Run tests with coverage
bun test:coverage

# Build all packages
bun build

# Lint code
bun lint
```
License
MIT
---
Developed by QuantuLabs