MCP server for AI-powered feature planning and codebase analysis. Generates PRDs, technical blueprints, and actionable tasks.
```bash
npm install plan-master-mcp
```
From Idea to Production without Context Loss
Give AI agents a perfect understanding of your tech stack, architecture, and established patterns.
Build complex features 10X more productively.
---
Plan Master is a Context Engineering tool designed to help AI agents plan, analyze, and execute development tasks. This is the Node.js MCP Client that connects your IDE to the Plan Master Backend.
1. MCP Client (This Repo):
- Runs locally in your IDE (Node.js).
- Reads your local codebase context securely.
- Sends anonymized/filtered context to the backend.
- Open Source.
2. Plan Master Backend (Separate Service):
- Holds the "Secret Sauce" prompts and AI Logic (Gemini 2.5 Pro, Claude 4.5, GPT-5.1).
- Hosted / Private.
- Production URL: https://plan-master-backend.onrender.com
---
Having issues? Check our Troubleshooting Guide for common problems and solutions.
---
Backend Status: Live at https://plan-master-backend.onrender.com
> Getting an API Key:
> Plan Master requires an API key to access the backend. Contact us at [your-email@example.com] or open an issue on GitHub to request access.
#### Cursor
1. Open `~/.cursor/mcp.json`
2. Add:
```json
{
"mcpServers": {
"plan-master": {
"command": "npx",
"args": ["-y", "plan-master-mcp"],
"env": {
"PLAN_MASTER_API_KEY": "your-api-key-here"
}
}
}
}
```
#### Windsurf
1. Open `~/.codeium/windsurf/mcp_config.json`
2. Add:
```json
{
"mcpServers": {
"plan-master": {
"command": "npx",
"args": ["-y", "plan-master-mcp"],
"env": {
"PLAN_MASTER_API_KEY": "your-api-key-here"
}
}
}
}
```
#### Claude Code
Run the following command:
```bash
claude mcp add plan-master \
  --env PLAN_MASTER_API_KEY=your-api-key-here \
  -- npx -y plan-master-mcp
```
---
1. Initialize: Call `init_project` to scan your codebase and build the semantic search index.
2. Discover Context: Call `discover_feature_context` with your feature request to find relevant existing code.
3. Plan with Context: Call `plan_feature` and pass the discovered context into the `additional_context` parameter.
4. Implement Tasks: Use the `next_task` tool to systematically work through the generated tasks.
5. Document: Call `update_changelog` when the feature is complete.
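As an illustration, the five steps above correspond to a sequence of MCP tool calls roughly like the following (a sketch only — the `feature_request` parameter name is an assumption; `action`, `additional_context`, and the tool names come from this README, and the exact schemas may differ):

```json
[
  { "name": "init_project", "arguments": {} },
  { "name": "discover_feature_context", "arguments": { "feature_request": "Add OAuth login" } },
  { "name": "plan_feature", "arguments": { "feature_request": "Add OAuth login", "additional_context": "<output of discover_feature_context>" } },
  { "name": "next_task", "arguments": { "action": "get_next" } },
  { "name": "update_changelog", "arguments": {} }
]
```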
1. "Plan a new feature...": You ask your IDE agent.
2. Discover (RECOMMENDED): The agent calls `discover_feature_context` to find relevant files using semantic search.
3. Analyze: The MCP client scans your folder structure.
4. Clarify (if needed): The AI asks clarifying questions if your feature request needs more details.
5. Plan: It sends context (including discovered files) to the backend, which uses Gemini 2.5 Pro to generate a PRD.
6. Blueprint: A technical architect agent creates a blueprint with Mermaid architecture diagrams.
7. Tasks: A lead engineer agent breaks it down into tasks.
8. Result: You get a `planning/` folder with `prd.md`, `technical_blueprint.md`, and `tasks.md`.
9. Implementation: Follow the tasks using the `next_task` tool to track progress.
The technical blueprint includes two Mermaid graphs:
1. Current Architecture - Shows your project's current state (relevant files, components, dependencies)
2. Target Architecture - Shows how the system will look after implementing the feature
This helps you:
- Understand what files/components will be affected
- See what new files/components will be created
- Visualize how dependencies will change
- Ensure the implementation won't break existing structure
To view Mermaid graphs: Install the Markdown Preview Mermaid Support extension in your IDE, then open the markdown preview.
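To illustrate, a (much simplified, entirely hypothetical) pair of graphs in `technical_blueprint.md` could look like this — here the target adds one new auth service to an existing route/service pair:

```mermaid
graph TD
    subgraph Current["Current Architecture"]
        R1[api/routes.js] --> U1[services/user.js]
    end
    subgraph Target["Target Architecture"]
        R2[api/routes.js] --> U2[services/user.js]
        R2 --> A2[services/auth.js]
    end
```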
Before generating a full feature plan, the AI will:
- Analyze your feature request and codebase
- Ask 3-7 targeted clarifying questions if needed
- Skip questions for simple/clear features
- Help you think through scope, UX, technical decisions, and constraints
This ensures better planning and reduces back-and-forth iterations!
---
Plan Master now includes local intelligence tools that run entirely within your MCP server (no backend required for these), using a local SQLite database and embeddings.
#### discover_feature_context (Semantic Search)
- Usage: "Call `discover_feature_context` with your request to find relevant files BEFORE calling `plan_feature`."
- How it works: Uses OpenAI embeddings (generated by the backend, stored locally in `.plan-master/plan_master.db`) to find semantically similar code snippets.
- Why it matters: This grounds the planning in your actual codebase structure, preventing hallucinations and ensuring the plan aligns with existing patterns.
- Next Step: The tool output explicitly tells you to pass the results into `plan_feature`'s `additional_context` parameter.
#### next_task
- Goal: Work through the generated tasks and track progress systematically.
- Usage: "Call `next_task` to manage the implementation workflow."
- Required Workflow:
  1. Call `next_task` with `action='get_next'` to see what to work on
  2. Call `next_task` with `action='mark_in_progress'` and `task_id=X` when starting
  3. Implement the task
  4. Call `next_task` with `action='mark_done'` and `task_id=X` when finished
  5. Repeat for all tasks

⚠️ Important: Always use this tool during feature implementation to keep `tasks.md` updated and track progress. The tool will guide you through each step with clear instructions.
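Assuming task IDs are the numbers from `tasks.md`, the call sequence for a single task would look roughly like this (argument shapes follow the `action`/`task_id` workflow described above; exact schemas may differ):

```json
[
  { "name": "next_task", "arguments": { "action": "get_next" } },
  { "name": "next_task", "arguments": { "action": "mark_in_progress", "task_id": 1 } },
  { "name": "next_task", "arguments": { "action": "mark_done", "task_id": 1 } }
]
```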
#### load_constraints
- Goal: Allow projects to define architecture / product rules that the agent should respect.
- Usage: "Call `load_constraints` before starting a major plan."
- Configuration: Create `planning/constraints.md` or `.planmaster.json` to define your rules.
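For example, a `planning/constraints.md` could contain freeform rules such as the following (the specific rules and file paths here are purely illustrative — write whatever constraints fit your project):

```markdown
# Project Constraints
- All new API endpoints must go through the existing router in api/.
- No new runtime dependencies without prior approval.
- UI components must reuse the design tokens defined in styles/tokens.css.
```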
#### update_changelog
- Goal: Maintain a project history.
- Usage: "Call `update_changelog` after implementing a feature."

---
## 🛠️ Development
```bash
# Run in dev mode
npm run dev

# Run tests
npm test
```

See TEST_README.md for more information about the test suite.