# Docmole

Dig through any documentation with AI: an MCP server for Claude, Cursor, and other AI assistants.
Docmole is an MCP server that lets you query any documentation site from AI assistants like Claude, Cursor, or any MCP-compatible client. The mole digs through docs so you don't have to.
## Features

* **Universal docs support** – works with any documentation site
* **Self-hosted RAG** – LanceDB vectors + OpenAI embeddings, no Python needed
* **Zero-setup mode** – instant access to Mintlify-powered sites
* **Multi-turn conversations** – remembers context across questions
* **WebFetch compatible** – links converted to absolute URLs
* **MCP native** – works with Claude, Cursor, and any MCP client
* **Ollama support** – fully local mode, no API keys needed
* **Generic HTML extraction** – supports non-Mintlify documentation sites
* **Incremental updates** – only re-indexes changed pages
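The WebFetch-compatible link rewriting can be sketched with the standard `URL` constructor; `absolutizeLinks` is an illustrative helper, not Docmole's actual API:

```typescript
// Rewrite relative hrefs in extracted markdown so MCP clients can
// fetch them directly. The regex targets markdown links: [text](href).
function absolutizeLinks(markdown: string, pageUrl: string): string {
  return markdown.replace(/\]\(([^)]+)\)/g, (match, href: string) => {
    // Leave in-page anchors and already-absolute URLs untouched.
    if (href.startsWith("#") || /^[a-z]+:\/\//i.test(href)) return match;
    return `](${new URL(href, pageUrl).href})`;
  });
}
```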
## Installation

To use Docmole, run it directly with bunx (no install needed):

```bash
bunx docmole --help
```

Or install globally:

```bash
bun install -g docmole
```

Works on macOS, Linux, and Windows. Requires the Bun runtime.
## Local RAG Mode

Index and query any documentation site. Requires `OPENAI_API_KEY`.

```bash
# One-time setup – discovers pages and builds the vector index
bunx docmole setup --url https://docs.example.com --id my-docs
```

Add to your MCP client:

```json
{
  "mcpServers": {
    "my-docs": {
      "command": "bunx",
      "args": ["docmole", "serve", "--project", "my-docs"]
    }
  }
}
```
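Before embedding, crawled pages are typically split into overlapping chunks; a minimal sketch of that step (the 1000-character size and 200-character overlap are illustrative assumptions, not Docmole's actual parameters):

```typescript
// Split page text into fixed-size chunks with overlap, so sentences
// near a chunk boundary are preserved intact in the following chunk.
function chunkText(text: string, size = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last chunk reached the end
  }
  return chunks;
}
```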
## Mintlify Mode

For sites with a Mintlify AI Assistant – no API key needed:

```bash
bunx docmole -p agno-v2
```

```json
{
  "mcpServers": {
    "agno-docs": {
      "command": "bunx",
      "args": ["docmole", "-p", "agno-v2"]
    }
  }
}
```

## CLI
Docmole has a built-in CLI for all operations:
```bash
# Mintlify mode (proxy to the Mintlify API)
docmole -p <project-id>

# Local RAG mode
docmole setup --url <url> --id <id>
docmole serve --project <project>
docmole list
docmole stop --project <project>
```

Run `docmole --help` for all options.

## How it works
```
┌──────────────┐      ┌──────────────┐      ┌───────────────────────┐
│  MCP Client  │─────▶│   Docmole    │─────▶│  Embedded: LanceDB    │
│  (Claude,    │◀─────│  MCP Server  │◀─────│  Mintlify: API proxy  │
│  Cursor...)  │      └──────────────┘      └───────────────────────┘
└──────────────┘
```

Local RAG Mode: Crawls documentation, generates embeddings with OpenAI, and stores them in LanceDB. Hybrid search combines semantic and keyword matching.
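The hybrid search described above can be sketched as a weighted blend of a semantic score and a keyword score; the 0.7/0.3 weights and helper names are illustrative assumptions, not Docmole's implementation:

```typescript
interface Doc { id: string; vector: number[]; text: string }

// Cosine similarity between two embedding vectors (semantic signal).
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Fraction of query terms found in the document text (keyword signal).
function keywordScore(query: string, text: string): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const lower = text.toLowerCase();
  const hits = terms.filter((t) => lower.includes(t)).length;
  return terms.length ? hits / terms.length : 0;
}

// Rank documents by a weighted blend of both signals.
function hybridRank(query: string, queryVec: number[], docs: Doc[]): Doc[] {
  const score = (d: Doc) =>
    0.7 * cosine(queryVec, d.vector) + 0.3 * keywordScore(query, d.text);
  return [...docs].sort((a, b) => score(b) - score(a));
}
```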
Mintlify Mode: Proxies requests to Mintlify's AI Assistant API. Zero setup, instant results.
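In Mintlify mode the server effectively forwards each question to the `leaves.mintlify.com` assistant endpoint. A hedged sketch of building that request — the JSON body shape and the `buildAssistantRequest` helper are assumptions for illustration:

```typescript
// Build the proxied request for a Mintlify AI Assistant project.
// The payload shape here is an assumption, not a documented schema.
function buildAssistantRequest(projectId: string, question: string) {
  const url = `https://leaves.mintlify.com/api/assistant/${projectId}/message`;
  const init = {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: question }),
  };
  return { url, init };
}

// Usage sketch:
// const { url, init } = buildAssistantRequest("agno-v2", "How do I create an agent?");
// const res = await fetch(url, init);
```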
## Known Mintlify Project IDs

| Documentation | Project ID |
|---------------|------------|
| Agno | `agno-v2` |
| Resend | `resend` |
| Mintlify | `mintlify` |
| Vercel | `vercel` |
| Upstash | `upstash` |
| Plain | `plain` |

> Find more: open DevTools → Network tab → use the site's AI assistant → look for
> `leaves.mintlify.com/api/assistant/{project-id}/message`

## Configuration
| Environment Variable | Default | Description |
|----------------------|---------|-------------|
| `OPENAI_API_KEY` | – | Required for local RAG mode |
| `DOCMOLE_DATA_DIR` | `~/.docmole` | Data directory for projects |
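Resolving those variables with their documented defaults might look like this; `resolveConfig` is an illustrative helper, not Docmole's actual API:

```typescript
import { homedir } from "node:os";
import { join } from "node:path";

// Resolve configuration from an env-like record, applying the
// documented defaults. Taking the record as a parameter (rather
// than reading process.env directly) keeps this testable.
function resolveConfig(env: Record<string, string | undefined>) {
  return {
    openaiApiKey: env.OPENAI_API_KEY, // required for local RAG mode
    dataDir: env.DOCMOLE_DATA_DIR ?? join(homedir(), ".docmole"),
  };
}
```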
```
~/.docmole/
├── projects/
│   └── <project-id>/
│       ├── config.yaml   # Project configuration
│       └── lancedb/      # Vector database
└── global.yaml           # Global settings
```

See AGENT.md for detailed documentation, including:
- Architecture details
- Backend implementations
- Enterprise deployment guides
## Contributing

PRs welcome! See the contributing guide for details.
## Acknowledgments

- Mintlify for amazing documentation tooling
- Anthropic for Claude and the MCP protocol
- LanceDB for the vector database
## License

The Docmole codebase is released under the MIT License.