# cursor-recursive-rag

Recursive RAG MCP server for Cursor IDE with interactive setup wizard, web dashboard, and AI-powered rules optimizer. Build a knowledge base from your documentation and codebase, enabling multi-hop retrieval, iterative query refinement, and intelligent rule management.
> Status: Beta - Core features stable, actively maintained

## Quick Start

```bash
# Install globally
npm install -g cursor-recursive-rag
```
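
After installing, a typical first run looks like this (each command is covered in detail below):

```bash
cursor-rag setup           # interactive configuration wizard
cursor-rag ingest ./docs   # index a local docs folder
cursor-rag dashboard       # web dashboard at http://localhost:3333
```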

## Installation

### From npm

```bash
npm install -g cursor-recursive-rag
```

### From Source

```bash
git clone https://github.com/garethdaine/cursor-recursive-rag.git
cd cursor-recursive-rag
npm install
npm run build
npm link
```

## CLI Commands

### Setup

```bash
cursor-rag setup       # Interactive configuration wizard
cursor-rag status      # Show configuration and statistics
cursor-rag dashboard   # Start web dashboard (default: http://localhost:3333)
```

### Ingest

```bash
cursor-rag ingest https://docs.example.com --crawl --max-pages 100
cursor-rag ingest ./docs          # Local directory
cursor-rag ingest ./document.md   # Single file
```

### Search

```bash
cursor-rag search "how to authenticate users"
cursor-rag search "database queries" --top-k 10
```

### Chat History

```bash
cursor-rag chat list     # List Cursor conversations
cursor-rag chat ingest   # Ingest chat history into RAG
cursor-rag chat watch    # Watch for new conversations
cursor-rag chat stats    # Show ingestion statistics
```

### Rules

```bash
cursor-rag rules list         # List all rules
cursor-rag rules analyze      # Analyze without changes
cursor-rag rules duplicates   # Show duplicates only
cursor-rag rules conflicts    # Show conflicts only
cursor-rag rules outdated     # Show outdated rules
cursor-rag rules optimize     # Full optimization (dry-run)
cursor-rag rules merge        # LLM-powered merge
cursor-rag rules rewrite      # LLM-powered rewrite
```

### Maintenance

```bash
cursor-rag maintenance run       # Run maintenance job
cursor-rag maintenance start     # Start scheduler
cursor-rag maintenance stats     # Show statistics
cursor-rag maintenance cleanup   # Clean stale data
```

## Web Dashboard

Start with `cursor-rag dashboard` (default: http://localhost:3333).

### Tabs

| Tab | Features |
|-----|----------|
| Overview | Stats, connection status, quick actions |
| Search | Query knowledge base with results display |
| MCP Gateway | Browse 87+ tools from connected backends |
| OpenSkills | Browse and search installed skills |
| Tools | Execute built-in tools with forms |
| Activity | Persistent log of all operations |
| Settings | Configure all system options |

### Rules Optimizer Panel

The Rules Optimizer panel provides one-click analysis and optimization:
1. Select Folder: Browse or enter path to rules folder
2. Choose Mode: Dry Run (preview) or Apply Changes
3. Run Optimizer: Analyzes duplicates, conflicts, outdated rules
4. Review Results: See all issues with severity indicators

The optimizer works with or without an LLM:
- Without an LLM: pattern matching detects issues and reports them for manual review
- With an LLM: duplicate rules are merged automatically, preserving all content
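
The same checks can be run headlessly with the rules commands shown earlier, for example:

```bash
cursor-rag rules analyze    # report duplicates, conflicts, and outdated rules without making changes
cursor-rag rules optimize   # full optimization pass (dry-run)
cursor-rag rules merge      # LLM-powered merge of duplicates (needs a configured LLM provider)
```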

### Settings

From the Settings tab you can configure:
- Vector Store: Redis Stack, Redis 8.x, Qdrant, ChromaDB, Memory, Vectorize
- Embeddings: Xenova (local), OpenAI, Ollama
- Proxy: PacketStream, Decodo with credentials
- Rules Analyzer: Thresholds, patterns, LLM provider
- LLM Provider: OpenAI, Anthropic, DeepSeek, Groq, Ollama, OpenRouter

## Configuration

Configuration is stored in `~/.cursor-rag/config.json`:

```json
{
  "vectorStore": "redis-stack",
  "embeddings": "xenova",
  "apiKeys": {
    "firecrawl": "fc-...",
    "redis": { "url": "redis://localhost:6379" }
  },
  "proxy": { "enabled": false },
  "dashboard": { "enabled": true, "port": 3333 },
  "mcpGateway": { "enabled": true, "url": "http://localhost:3010" },
  "openSkills": { "enabled": true, "autoIngestSkills": true }
}
```

### Vector Stores

| Type | Description | Setup |
|------|-------------|-------|
| redis-stack | Redis + RediSearch (Docker) | `docker run -d -p 6379:6379 redis/redis-stack-server` |
| redis | Redis 8.x native vectors | `brew install redis` |
| qdrant | Qdrant vector database | `docker run -d -p 6333:6333 qdrant/qdrant` |
| chroma | ChromaDB | `docker run -d -p 8000:8000 chromadb/chroma` |
| memory | In-memory with file persistence | No setup required |
| vectorize | Cloudflare Vectorize | Requires Cloudflare account |
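
For example, to use the default Redis Stack store (assuming Docker is installed):

```bash
# Start Redis Stack (same command as in the table above)
docker run -d -p 6379:6379 redis/redis-stack-server

# Point cursor-rag at it via the setup wizard, then verify the connection
cursor-rag setup
cursor-rag status
```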

### Rules Configuration

Stored in `~/.cursor-rag/rules-config.json`:

```json
{
  "analysis": {
    "duplicateThreshold": 0.7,
    "maxAgeDays": 365,
    "detectConflicts": true,
    "detectOutdated": true,
    "useLLM": false
  },
  "llm": {
    "provider": "openai",
    "model": "gpt-4o-mini"
  },
  "versionChecks": [],
  "deprecationPatterns": [],
  "naturalRules": []
}
```

## MCP Tools

Available when using Cursor IDE:

| Tool | Description |
|------|-------------|
| recursive_query | Multi-hop retrieval with query decomposition |
| search_knowledge | Direct vector similarity search |
| ingest_document | Add document (URL, file, text) |
| crawl_and_ingest | Crawl website and index |
| list_sources | List indexed sources |
| chat_ingest | Ingest Cursor chat history |
| chat_list | List conversations |
| memory_stats | Memory system statistics |
| gateway_* | MCP Gateway tools |
| openskills_* | OpenSkills tools |
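
You normally don't call these tools yourself; Cursor's agent invokes them over MCP. For orientation, a raw MCP `tools/call` request for `recursive_query` would look roughly like the sketch below. The `query` argument name is an assumption for illustration; the server's `tools/list` response is the authoritative schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "recursive_query",
    "arguments": { "query": "How do we authenticate users?" }
  }
}
```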

## Usage in Cursor

### Add as an @Docs Source

1. Start the dashboard: `cursor-rag dashboard`
2. In Cursor: @Docs → Add new doc
3. Enter: `http://localhost:3333/docs`
4. Use `@Docs cursor-recursive-rag` in prompts

### Natural Language Queries

Ask naturally and the AI will use appropriate tools:

```
Search my knowledge base for authentication patterns
Crawl and ingest https://docs.example.com with max 50 pages
What sources are indexed in my RAG?
```

## Architecture

```
cursor-recursive-rag/
├── src/
│   ├── cli/            # CLI commands
│   ├── server/         # MCP server and tools
│   ├── dashboard/      # Web dashboard
│   ├── adapters/
│   │   ├── vector/     # Vector store adapters
│   │   ├── embeddings/ # Embedding adapters
│   │   └── llm/        # LLM provider adapters
│   ├── services/       # Core services
│   ├── config/         # Configuration schemas
│   └── types/          # TypeScript definitions
├── bin/                # CLI entry point
└── dist/               # Compiled JavaScript
```

## Requirements

- Node.js >= 20.0.0
- Cursor IDE (for MCP integration)
- Vector store (Docker or Redis 8.x recommended)
- Optional: Ollama (local embeddings), LLM API key (rules optimization)

## API Keys

| Service | Purpose | Get Key |
|---------|---------|---------|
| Firecrawl | Web crawling | https://www.firecrawl.dev |
| OpenAI | Embeddings/LLM | https://platform.openai.com |
| Anthropic | LLM | https://console.anthropic.com |
| Qdrant Cloud | Vector store | https://cloud.qdrant.io |

## Troubleshooting

### No configuration found

Run `cursor-rag setup` to create the configuration.

### MCP server not connecting

1. Check that `~/.cursor/mcp.json` has a `recursive-rag` entry
2. Restart Cursor IDE
3. Verify the server path is correct
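
The entry follows Cursor's standard `mcpServers` format. The exact `command` and path depend on your install, so treat the values below as an illustrative shape only; the `recursive-rag` key is the part the check above expects:

```json
{
  "mcpServers": {
    "recursive-rag": {
      "command": "node",
      "args": ["/absolute/path/to/cursor-recursive-rag/dist/index.js"]
    }
  }
}
```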

### Rules folder not found

Ensure the path is absolute (e.g., `/Users/you/.cursor/rules`, not `~/.cursor/rules`).

### Rules optimizer LLM errors

Either disable "Use LLM for Analysis" in Settings, or configure an LLM provider.

## Development

```bash
npm install # Install dependencies
npm run build # Build TypeScript
npm run dev # Watch mode
npm link # Link for testing
```

See CONTRIBUTING.md for guidelines.

## License

MIT