Long-term memory engine for AI agents
`npm install @presidio-dev/agent-memory`

> Fork of OpenMemory with additional features.


agent-memory is a cognitive memory engine for llms and agents.
- real long-term memory (not just embeddings in a table)
- self-hosted, local-first (sqlite / postgres)
- pgvector support - 100x faster vector search with database-level user isolation for multi-tenant SaaS
- integrations: mcp, claude desktop, cursor, windsurf
- sources: github, notion, google drive, onedrive, web crawler
- explainable traces (see why something was recalled)
your model stays stateless. your app stops being amnesiac.
---
```bash
npm install @presidio-dev/agent-memory
```

```typescript
import { Memory } from "@presidio-dev/agent-memory"
const mem = new Memory()
await mem.add("user likes spicy food", { user_id: "u1" })
const results = await mem.search("food?", { user_id: "u1" })
```
drop this into:
- node backends
- clis
- local tools
- anything that needs durable memory without running a separate service
that's it. you're now running a fully local cognitive memory engine.
---
ingest data from external sources directly into memory:
```typescript
const github = await mem.source("github")
await github.connect({ token: "ghp_..." })
await github.ingest_all({ repo: "owner/repo" })
```

available sources: github, notion, google_drive, google_sheets, google_slides, onedrive, web_crawler
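
for the other sources the connect options will differ, but the flow should mirror the github example above. a rough sketch for the web crawler, where the option names (start_url, max_pages) are hypothetical:

```typescript
import { Memory } from "@presidio-dev/agent-memory"

const mem = new Memory()

// assumed: web_crawler follows the same source() / connect() / ingest_all() pattern
const crawler = await mem.source("web_crawler")
await crawler.connect({})                                                     // assumed: no credentials needed for public pages
await crawler.ingest_all({ start_url: "https://example.com", max_pages: 50 }) // hypothetical options
```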
---
- local-first - runs entirely on your machine, zero external dependencies
- multi-sector memory - episodic, semantic, procedural, emotional, reflective
- temporal knowledge graph - time-aware facts with validity periods
- memory decay - adaptive forgetting with sector-specific rates
- waypoint graph - associative recall paths for better retrieval
- explainable traces - see exactly why memories were recalled
- pgvector support - 100x faster vector search with database-level user isolation (optional)
- zero config - works out of the box with sensible defaults
---
agent-memory automatically classifies content into 5 cognitive sectors:
| sector | description | examples | decay rate |
|--------|-------------|----------|------------|
| episodic | time-bound events & experiences | "yesterday i attended a conference" | medium |
| semantic | timeless facts & knowledge | "paris is the capital of france" | very low |
| procedural | skills, procedures, how-tos | "to deploy: build, test, push" | low |
| emotional | feelings, sentiment, mood | "i'm excited about this project!" | high |
| reflective | meta-cognition, insights | "i learn best through practice" | very low |
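
a small sketch of how classification surfaces in the API, using the primary_sector field that add() returns and the sectors search option (both documented in the api reference below); the sectors in the comments are what the table above suggests, not guaranteed output:

```typescript
import { Memory } from "@presidio-dev/agent-memory";

const mem = new Memory("u1");

// each add() is classified automatically; primary_sector reports the winning sector
const fact = await mem.add("paris is the capital of france");     // expected: semantic
const event = await mem.add("yesterday i attended a conference"); // expected: episodic
console.log(fact.primary_sector, event.primary_sector);

// restrict recall to specific sectors at query time
const facts = await mem.search("capital of france", { sectors: ["semantic"] });
```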
---
For optimal performance in multi-user SaaS environments, enable native PostgreSQL pgvector support:
```bash
OM_USE_PGVECTOR=true
```

Benefits:
- 100x faster vector similarity search with HNSW indexes
- Database-level user isolation - queries filter by user_id at SQL level
- Scales to millions of vectors - O(log n) search complexity
- Production-ready for multi-tenant deployments
Requirements: PostgreSQL with pgvector extension installed (see Docker setup below)
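
a minimal sketch of wiring this up from node, assuming the OM_* variables are read when the Memory instance is constructed and that OM_DB_URL accepts a postgres connection string (the URL below is a placeholder for your own pgvector-enabled database):

```typescript
import { Memory } from "@presidio-dev/agent-memory";

// assumption: env vars are read at construction time
process.env.OM_USE_PGVECTOR = "true";
process.env.OM_DB_URL = "postgres://om:om@localhost:5432/agent_memory"; // placeholder connection string

const mem = new Memory();

// with pgvector enabled, queries are filtered by user_id at the SQL level
await mem.add("tenant-a prefers weekly digests", { user_id: "tenant-a" });
const hits = await mem.search("digest preferences", { user_id: "tenant-a" });
```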
```bash
# database
OM_DB_PATH=./data/om.db      # sqlite file path (default: ./data/openmemory.sqlite)
OM_DB_URL=sqlite://:memory:  # or use in-memory db
```

## usage
```typescript
import { Memory } from '@presidio-dev/agent-memory';

const mem = new Memory('user-123'); // optional user_id
// add memories
await mem.add("user prefers dark mode", {
  tags: ["preference", "ui"],
  created_at: Date.now()
});

// search
const results = await mem.search("user settings", {
  user_id: "user-123",
  limit: 10,
  sectors: ["semantic", "procedural"]
});
// get by id
const memory = await mem.get("uuid-here");
// wipe all data (useful for testing)
await mem.wipe();
```

---
## performance tiers

- fast - synthetic embeddings (no api calls), instant
- smart - hybrid semantic + synthetic for balanced speed/accuracy
- deep - pure semantic embeddings for maximum accuracy
- hybrid - adaptive based on query complexity

---
## mcp server
agent-memory includes an mcp server for integration with claude desktop, cursor, windsurf, and other mcp clients:
```bash
npx @presidio-dev/agent-memory serve --port 3000
```

add the server to your mcp client configuration:

```json
{
  "mcpServers": {
    "agent-memory": {
      "command": "npx",
      "args": ["@presidio-dev/agent-memory", "serve"]
    }
  }
}
```

available mcp tools:
- openmemory_query - search memories
- openmemory_store - add new memories
- openmemory_list - list all memories
- openmemory_get - get memory by id
- openmemory_reinforce - reinforce a memory

---
## examples
```typescript
// multi-user support
const mem = new Memory();
await mem.add("alice likes python", { user_id: "alice" });
await mem.add("bob likes rust", { user_id: "bob" });const alicePrefs = await mem.search("what does alice like?", { user_id: "alice" });
// returns python results only
// temporal filtering
const recent = await mem.search("user activity", {
  startTime: Date.now() - 86400000, // last 24 hours
  endTime: Date.now()
});
// sector-specific queries
const facts = await mem.search("company info", { sectors: ["semantic"] });
const howtos = await mem.search("deployment", { sectors: ["procedural"] });
```

---
## api reference
### new Memory(user_id?)
create a new memory instance with optional default user_id.
### mem.add(content, metadata?)
store a new memory.
parameters:

- content - text content to store
- metadata - optional metadata object:
  - user_id - user identifier
  - tags - array of tag strings
  - created_at - timestamp
  - any other custom fields

returns: memory object with id, primary_sector, sectors
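
for illustration, a short sketch of reading those fields off the returned object (the shape is assumed to match the list above):

```typescript
import { Memory } from "@presidio-dev/agent-memory";

const mem = new Memory("u1");

const memory = await mem.add("standup is at 9am daily", { tags: ["schedule"] });

// id can be passed to get() later; primary_sector / sectors reflect the automatic classification
console.log(memory.id, memory.primary_sector, memory.sectors);
```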
### mem.search(query, options?)

search for relevant memories.
parameters:
- query - search text
- options:
  - user_id - filter by user
  - limit - max results (default: 10)
  - sectors - array of sectors to search
  - startTime - filter memories after this timestamp
  - endTime - filter memories before this timestamp

returns: array of memory results with id, content, score, sectors, salience, tags, meta
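
likewise, a short sketch of consuming those result fields (shape assumed to match the list above):

```typescript
import { Memory } from "@presidio-dev/agent-memory";

const mem = new Memory("u1");

const results = await mem.search("deployment process", { sectors: ["procedural"], limit: 5 });

for (const r of results) {
  // score is the retrieval relevance; salience, sectors, and tags come from the stored memory
  console.log(r.id, r.score, r.salience, r.content);
}
```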
### mem.get(id)

retrieve a memory by id.

### mem.wipe()

⚠️ danger: deletes all memories, vectors, and waypoints. useful for testing.
---
## license

apache 2.0
---