# @recallbricks/agent-cli

Zero-friction onboarding CLI for RecallBricks - AI memory infrastructure.

## Installation

```bash
npm install @recallbricks/agent-cli
```
Get your agent running with memory in 30 seconds:
```bash
npx @recallbricks/agent-cli setup rb_live_your_key_here
```
RecallBricks provides AI memory infrastructure for your agents. This CLI connects your existing agents to persistent memory with zero configuration.

To get an API key:

1. Go to dashboard.recallbricks.com/setup
2. Choose your agent type (sales assistant, support agent, etc.)
3. Accept or customize constitutional memories
4. Copy your API key
```bash
npx @recallbricks/agent-cli setup rb_live_xxxxx
```
The CLI will:
- Validate your API key
- Detect your project language (TypeScript/JavaScript/Python)
- Ask for your LLM API key (saved for future projects)
- Generate all necessary files
- Install dependencies
- Offer instant testing
```bash
# TypeScript
npx ts-node agent.ts
```
## Commands
| Command | Description |
|---------|-------------|
| `setup` | Set up a new agent from dashboard config |
| `test` | Test your agent interactively |
| `config` | Manage saved settings and credentials |
| `update` | Update agent to latest SDK version |

### setup
```bash
recallbricks setup rb_live_xxxxx [options]

Options:
  -d, --directory   Target directory (default: current)
  -l, --language    Force language: typescript, javascript, python
  --no-install      Skip dependency installation
  --no-test         Skip test prompt
```

### test
```bash
recallbricks test [options]

Options:
  -c, --config      Path to recallbricks-config.json
```

### config
```bash
recallbricks config [options]

Options:
  --show            View saved configuration
  --clear           Clear all saved credentials
  --set-llm         Set default LLM (anthropic, openai, google)
```

### update
```bash
recallbricks update [options]

Options:
  -d, --directory   Agent directory
  --dry-run         Show changes without applying
```

## Generated Files
After setup, you'll have:
```
my-agent/
├── agent.ts # Main agent with memory integration
├── package.json # Dependencies
├── tsconfig.json # TypeScript config
├── .env # API keys (gitignored)
├── recallbricks-config.json
└── examples/
├── 01-simple-chat.ts
└── 02-memory-usage.ts
```

## Supported LLMs
- Anthropic (Claude) - Default
- OpenAI (GPT-4)
- Google (Gemini)
The CLI auto-detects which LLM you configured in the dashboard and generates the appropriate code.
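The helper exported by the generated agent (used in the Example Code section) chats with memory context by recalling relevant memories before responding and saving the new turn afterward. A minimal in-memory sketch of that recall-then-save pattern, where the local store and echoed reply are illustrative stand-ins only (the real generated agent calls the RecallBricks API and your configured LLM):

```typescript
// Illustrative sketch only: an in-memory stand-in for the save/recall/chat
// surface shown in Example Code. The real generated agent talks to the
// RecallBricks API and your configured LLM instead of this local store.
type Memory = { text: string; tags?: string[] };

const store: Memory[] = [];

const rb = {
  async save(memory: Memory): Promise<void> {
    store.push(memory);
  },
  async recall(opts: { query: string; limit?: number }): Promise<Memory[]> {
    // Naive keyword match; the real service does semantic recall.
    const words = opts.query.toLowerCase().split(/\s+/);
    return store
      .filter(m => words.some(w => m.text.toLowerCase().includes(w)))
      .slice(0, opts.limit ?? 5);
  },
};

async function chat(message: string): Promise<string> {
  // 1. Recall memories relevant to the incoming message.
  const memories = await rb.recall({ query: message, limit: 5 });
  const context = memories.map(m => `- ${m.text}`).join("\n");
  // 2. An LLM call would go here; the sketch just echoes the context.
  const reply = `Relevant memories:\n${context}\nYou said: ${message}`;
  // 3. Persist the turn so future calls can recall it.
  await rb.save({ text: `User said: ${message}`, tags: ["conversation"] });
  return reply;
}
```

The point of the pattern is that every `chat` call both reads from and writes to the memory store, so context accumulates across turns without any extra bookkeeping in your application code.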
## Example Code
```typescript
import { chat, rb } from './agent';

// Chat with memory context
const response = await chat("What did we discuss last time?");
// Direct memory operations
await rb.save({
text: "User prefers dark mode",
tags: ['preference', 'ui']
});
const memories = await rb.recall({
query: "user preferences",
limit: 5
});
```

## Documentation
- Quickstart Guide
- Commands Reference
- Examples
- Troubleshooting
## Development

```bash
# Install dependencies
npm install

# Run in development
npm run dev -- setup rb_test_xxx

# Build
npm run build

# Run built version
npm start -- setup rb_test_xxx
```

## Links

- RecallBricks Dashboard
- Documentation
- API Reference

## License

MIT