# Local Memory MCP Server

AI-powered persistent memory system for Claude Code, Claude Desktop, Gemini, Codex, OpenCode, and other MCP-compatible tools.

**Version 1.4.1** - AI-powered persistent memory system for developers working with Claude Code, Claude Desktop, Gemini, Codex, Cursor, and other coding agents.

Transform your AI interactions with intelligent memory that grows smarter over time. Build persistent knowledge bases, track learning progression, and maintain context across conversations. Local Memory provides MCP, REST API, and command-line interfaces that let you and your coding agents store, retrieve, discover, and analyze memories, giving your agent the right context for its work and tasks.

## Features
- One-Command Install: npm install -g local-memory-mcp
- Smart Memory: AI-powered categorization and relationship discovery
- Lightning Fast: 10-57ms search responses with semantic similarity
- MCP Integration: Works seamlessly with Claude Desktop and other AI tools
- REST API: Easily integrate into your applications and workflows
- Command-Line Interface: Enable script automation and code execution via shell scripts and commands
- Persistent Context: Never lose conversation context again
- Enterprise Ready: Commercial licensing with security hardening
- Multi-Provider AI: Mix and match AI backends - Ollama, OpenAI, Anthropic, claude CLI, or any OpenAI-compatible server
## Quick Start

### 1. Install

```bash
npm install -g local-memory-mcp
```

### 2. Activate Your License

```bash
# Get your license key from https://localmemory.co
local-memory license activate LM-XXXX-XXXX-XXXX-XXXX-XXXX

# Verify activation
local-memory license status
```

### 3. Start the Daemon

```bash
# Start the daemon
local-memory start

# Verify it's running
local-memory status
```

## CLI Usage

### Basic Commands
```bash
# Store memories with automatic AI categorization
local-memory remember "Go channels are like pipes between goroutines"
local-memory remember "Redis is excellent for caching" --importance 8 --tags caching,database

# Search memories (keyword and AI-powered semantic search)
local-memory search "concurrency patterns"
local-memory search "neural networks" --use_ai --limit 5

# Create relationships between concepts
local-memory relate "channels" to "goroutines" --type enables

# Delete memories
local-memory forget
```

### Advanced Search
```bash
# Tag-based search
local-memory search --tags python,programming

# Date range search
local-memory search --start_date 2025-01-01 --end_date 2025-12-31

# Domain filtering
local-memory search "machine learning" --domain ai-research

# Session filtering (cross-conversation access)
local-memory search "recent work" --session_filter_mode all
```

## MCP Integration

### Claude Desktop Setup
Add to your Claude Desktop configuration file:
```json
{
  "mcpServers": {
    "local-memory": {
      "command": "/path/to/local-memory",
      "args": ["--mcp"],
      "transport": "stdio"
    }
  }
}
```

### Available Tools
When configured, Claude will have access to these memory tools:
Core Tools:
- `observe` - Record observations with knowledge level support (L0-L3)
- `search` - Find memories with semantic search
- `bootstrap` - Initialize sessions with knowledge context

Knowledge Evolution:
- `reflect` - Process observations into learnings
- `evolve` - Validate, promote, or decay knowledge
- `question` - Track epistemic gaps and contradictions
- `resolve` - Resolve contradictions and answer questions

Reasoning Tools:
- `predict` - Generate predictions from patterns
- `explain` - Trace causal paths between states
- `counterfactual` - Explore "what if" scenarios
- `validate` - Check knowledge graph integrity

Relationship Tools:
- `relate` - Create relationships between memories
- Plus memory management tools (update, delete, get)

## REST API
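Under the hood, MCP clients invoke these tools as JSON-RPC 2.0 `tools/call` requests over the stdio transport. The following is a minimal sketch of how such a request is framed; the `search` argument names (`query`, `limit`) are illustrative assumptions, not the server's documented schema:

```python
import json

def jsonrpc(method: str, params: dict, msg_id: int) -> str:
    """Frame a JSON-RPC 2.0 request as one newline-delimited line,
    the message framing used by MCP's stdio transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": method,
        "params": params,
    }) + "\n"

# Example: call the server's `search` tool.
# Argument names here are assumptions for illustration.
request = jsonrpc(
    "tools/call",
    {"name": "search", "arguments": {"query": "concurrency patterns", "limit": 5}},
    msg_id=1,
)
print(request, end="")
```

In practice your MCP client (Claude Desktop, Claude Code, etc.) handles this framing for you; the sketch only shows what crosses the stdio pipe.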
When the daemon is running, the REST API is available at `http://localhost:3002`:

```bash
# Search memories
curl "http://localhost:3002/api/memories/search?query=python&limit=5"

# Store a new memory
curl -X POST "http://localhost:3002/api/memories" \
  -H "Content-Type: application/json" \
  -d '{"content": "Python dict comprehensions are powerful", "importance": 7}'

# Health check
curl "http://localhost:3002/api/v1/health"
```

## Service Management
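The same endpoints can be called from any HTTP client. Here is a minimal Python sketch using only the standard library, with endpoint paths taken from the curl examples above (request construction only; no error handling or response parsing):

```python
import json
import urllib.parse
import urllib.request

BASE = "http://localhost:3002"

def build_search_url(query: str, limit: int = 5) -> str:
    """URL for the memory search endpoint, with proper query escaping."""
    qs = urllib.parse.urlencode({"query": query, "limit": limit})
    return f"{BASE}/api/memories/search?{qs}"

def build_store_request(content: str, importance: int) -> urllib.request.Request:
    """POST request mirroring the curl example; send it with urllib.request.urlopen."""
    body = json.dumps({"content": content, "importance": importance}).encode()
    return urllib.request.Request(
        f"{BASE}/api/memories",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

url = build_search_url("dict comprehensions", limit=3)
req = build_store_request("Python dict comprehensions are powerful", importance=7)
```

Sending the request is then a one-liner (`urllib.request.urlopen(req)`) once the daemon is running.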
```bash
# Start daemon
local-memory start

# Check daemon status
local-memory status

# Stop daemon
local-memory stop

# View running processes
local-memory ps

# Kill all processes (if needed)
local-memory kill_all

# System diagnostics
local-memory doctor
```

## What's New in v1.4.1
- Multi-Provider AI Backend - Split architecture for mixing AI providers independently for embeddings and chat. Fallback chains and circuit breakers for resilience.
| Provider | Embeddings | Chat | Type |
|----------|------------|------|------|
| Ollama | Yes | Yes | Local (default) |
| OpenAI Compatible | Yes | Yes | Local/Remote |
| OpenAI | Yes | Yes | Cloud |
| Anthropic | No | Yes | Cloud |
| claude CLI | No | Yes | Local subprocess |
- Agent Attribution - Auto-detect agent type (claude-desktop, claude-code, api) with hostname tracking for multi-device attribution
- Default Domain with MCP Prompts - Configurable `session.default_domain` with an intelligent domain cascade from CLAUDE.md, AGENTS.md, and GEMINI.md; new MCP `prompts/list` and `prompts/get` protocol support
- Security Hardening - 6 fixes including API key redaction, SSRF protection, command injection prevention, and secure file permissions
- Bug Fixes - ResponseFormat validation for "intelligent" format; enhanced text search SQL error fix

### Example Configuration
```yaml
# Recommended: local embeddings + Claude reasoning
ai_provider:
  embedding_provider: "ollama"
  chat_provider: "anthropic"
  chat_fallback: "ollama"

anthropic:
  enabled: true
  api_key: "sk-ant-xxxxx"
  model: "claude-sonnet-4-20250514"
```

See the full configuration guide for all provider options.
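The `chat_fallback` setting above relies on the fallback chains and circuit breakers mentioned in the release notes. The pattern itself is simple; here is a minimal Python sketch of the idea (not the daemon's actual implementation, which is internal): try providers in order, and temporarily skip any provider that has failed repeatedly.

```python
import time
from typing import Callable

class CircuitBreaker:
    """Minimal circuit breaker: after `threshold` consecutive failures,
    skip the provider for `cooldown` seconds before retrying."""
    def __init__(self, threshold: int = 3, cooldown: float = 30.0):
        self.threshold = threshold
        self.cooldown = cooldown
        self.failures = 0
        self.opened_at = 0.0

    def available(self) -> bool:
        if self.failures < self.threshold:
            return True
        # Circuit is open; allow a retry only after the cooldown elapses.
        return time.monotonic() - self.opened_at >= self.cooldown

    def record(self, ok: bool) -> None:
        if ok:
            self.failures = 0
        else:
            self.failures += 1
            if self.failures >= self.threshold:
                self.opened_at = time.monotonic()

def chat_with_fallback(
    prompt: str,
    providers: list[tuple[str, Callable[[str], str], CircuitBreaker]],
) -> str:
    """Try each (name, call, breaker) in order, skipping open circuits."""
    last_err: Exception | None = None
    for name, call, breaker in providers:
        if not breaker.available():
            continue  # circuit open: skip this provider for now
        try:
            reply = call(prompt)
            breaker.record(ok=True)
            return reply
        except Exception as err:
            breaker.record(ok=False)
            last_err = err
    raise RuntimeError(f"all chat providers failed: {last_err}")
```

With the config above, the chain would be Anthropic first, falling back to Ollama when the Anthropic call fails or its circuit is open.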
## System Requirements
- RAM: 512MB minimum
- Storage: 100MB for binaries, variable for database
- Platforms: macOS (Intel/ARM64), Linux x64, Windows x64
## Licensing

### Commercial License Required

Local Memory requires a commercial license for all use cases. Get your license at localmemory.co.

### Activating Your License

```bash
# Activate license
local-memory license activate LM-XXXX-XXXX-XXXX-XXXX-XXXX

# Check status
local-memory license status
```

## Support
- Documentation: localmemory.co/docs
- Architecture: localmemory.co/architecture
- Prompts: localmemory.co/prompts
- Support: Local Memory Discord
- Releases Repo: Local Memory Releases
---

Transform your AI workflow with persistent memory. Install now and never lose context again.

```bash
npm install -g local-memory-mcp
```