# DeepBase

The ultimate multi-driver persistence system for Node.js.

DeepBase is a powerful, flexible database abstraction that lets you use multiple storage backends with a single, intuitive API. Write once, persist everywhere.
- Driver-based architecture: Plug and play different storage backends
- Multi-driver support: Use multiple backends simultaneously with priority fallback
- Modular packages: Install only what you need
- Built-in migration: Easy data migration between drivers
- Automatic fallback: System continues working even if primary driver fails
- Cross-platform: Works on Node.js, Bun, Deno (with appropriate drivers)
- Concurrency-safe: Race condition protection for all concurrent operations
- Timeout support: Configurable timeouts to prevent hanging operations
DeepBase v3.0 is split into modular packages:
- `deepbase` - Core library (includes `deepbase-json` as a dependency)
- `deepbase-json` - JSON filesystem driver (no external DB dependencies!)
- `deepbase-sqlite` - SQLite driver (embedded database, ACID compliant)
- `deepbase-mongodb` - MongoDB driver
- `deepbase-redis` - Redis driver (vanilla, works with any Redis)
- `deepbase-redis-json` - Redis Stack driver (requires the RedisJSON module)
```bash
npm install deepbase
```

`deepbase` automatically includes `deepbase-json`.
```javascript
import DeepBase, { JsonDriver } from 'deepbase';

// Option 1: Backward-compatible syntax (uses JSON driver by default)
const db = new DeepBase({ path: './data', name: 'mydb' });
await db.connect();

// Option 2: Explicit JSON driver
const db2 = new DeepBase(new JsonDriver({ path: './data', name: 'mydb' }));
await db2.connect();

await db.set('users', 'alice', { name: 'Alice', age: 30 });
const alice = await db.get('users', 'alice');
console.log(alice); // { name: 'Alice', age: 30 }
```
```bash
npm install deepbase deepbase-mongodb
```

```javascript
import DeepBase, { JsonDriver } from 'deepbase';
import MongoDriver from 'deepbase-mongodb';
const db = new DeepBase([
new MongoDriver({ url: 'mongodb://localhost:27017' }),
new JsonDriver({ path: './backup' })
], {
writeAll: true, // Write to all drivers
readFirst: true, // Read from first available
failOnPrimaryError: false // Continue if primary fails
});
await db.connect();
// Writes to both MongoDB and JSON
await db.set('config', 'version', '1.0.0');
// Reads from MongoDB (or JSON if MongoDB is down)
const version = await db.get('config', 'version');
```
```javascript
await db.set('config', 'theme', 'dark');
await db.set('config', 'lang', 'en');
const theme = await db.get('config', 'theme'); // 'dark'
const config = await db.get('config'); // { theme: 'dark', lang: 'en' }
```
```javascript
const userPath = await db.add('users', { name: 'Bob', email: 'bob@example.com' });
// userPath: ['users', 'aB3xK9mL2n']
const user = await db.get(...userPath);
// { name: 'Bob', email: 'bob@example.com' }
```
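The generated key shown above (`'aB3xK9mL2n'`) is a short random ID. A hypothetical sketch of how such collision-resistant IDs can be produced (illustration only, not necessarily DeepBase's actual algorithm):

```javascript
import { randomBytes } from 'node:crypto';

// Build a short alphanumeric ID from cryptographically random bytes.
function genId(length = 10) {
  const alphabet =
    'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
  const bytes = randomBytes(length);
  let id = '';
  for (let i = 0; i < length; i++) {
    id += alphabet[bytes[i] % alphabet.length];
  }
  return id;
}
```

With 62 characters and length 10 the collision probability is negligible for typical datasets.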
```javascript
await db.set('stats', 'views', 100);
await db.inc('stats', 'views', 50); // 150
await db.dec('stats', 'views', 30); // 120
```

```javascript
await db.set('user', 'name', 'alice');
await db.upd('user', 'name', name => name.toUpperCase());
const name = await db.get('user', 'name'); // 'ALICE'
```

```javascript
await db.set('products', 'laptop', { price: 999 });
await db.set('products', 'mouse', { price: 29 });
const keys = await db.keys('products'); // ['laptop', 'mouse']
const values = await db.values('products'); // [{ price: 999 }, { price: 29 }]
const entries = await db.entries('products'); // [['laptop', {...}], ['mouse', {...}]]
```
One of the most powerful features is built-in data migration:
```javascript
import DeepBase, { JsonDriver } from 'deepbase';
import MongoDriver from 'deepbase-mongodb';

// Setup with both drivers
const db = new DeepBase([
  new JsonDriver({ path: './data', name: 'mydb' }),     // Source (index 0)
  new MongoDriver({ url: 'mongodb://localhost:27017' }) // Target (index 1)
]);
await db.connect();

// Migrate all data from JSON (0) to MongoDB (1)
const result = await db.migrate(0, 1, {
  clear: true,     // Clear target before migration
  batchSize: 100,  // Progress callback every 100 items
  onProgress: (progress) => {
    console.log(`Migrated ${progress.migrated} items`);
  }
});
console.log(`Migration complete: ${result.migrated} items, ${result.errors} errors`);
```
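Conceptually, a migration walks every entry in the source and writes it to the target, reporting progress every `batchSize` items. A plain-JavaScript sketch of that loop, using `Map`s instead of real drivers (illustration only, not DeepBase's implementation):

```javascript
// Generic batch-migration loop: copy every entry from source to target,
// invoking onProgress after each full batch.
async function migrateAll(source, target, { batchSize = 100, onProgress } = {}) {
  let migrated = 0;
  for (const [key, value] of source.entries()) {
    await target.set(key, value);
    migrated++;
    if (onProgress && migrated % batchSize === 0) onProgress({ migrated });
  }
  return { migrated, errors: 0 };
}
```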
```javascript
// Copy data from primary (index 0) to all other drivers
await db.syncAll();
```

For maximum reliability, use multiple backends with priority:
```javascript
import DeepBase, { JsonDriver } from 'deepbase';
import MongoDriver from 'deepbase-mongodb';
import RedisDriver from 'deepbase-redis';

const db = new DeepBase([
new MongoDriver({ url: 'mongodb://localhost:27017' }), // Primary
new JsonDriver({ path: './persistence' }), // Backup
new RedisDriver({ url: 'redis://localhost:6379' }) // Cache
], {
writeAll: true, // Replicate writes to all three
readFirst: true, // Read from first available
failOnPrimaryError: false // Graceful degradation
});
await db.connect();
// Writes to all three backends
await db.set('users', 'john', { name: 'John' });
// If MongoDB fails, reads from JSON
// If both fail, reads from Redis
const user = await db.get('users', 'john');
```
Benefits:

- Automatic failover if any backend goes down
- Data replication across all backends
- Zero downtime during migrations
- Easy recovery from failures
## API Reference

```javascript
new DeepBase(drivers, options)
```

Parameters:

- `drivers`: Single driver or array of drivers (in priority order)
- `options`:
  - `writeAll` (default: true): Write to all drivers
  - `readFirst` (default: true): Read from first available driver
  - `failOnPrimaryError` (default: true): Throw if primary driver fails
  - `lazyConnect` (default: true): Auto-connect on first operation
  - `timeout` (default: 0): Global timeout in ms (0 = disabled)
  - `readTimeout` (default: timeout): Timeout for read operations in ms
  - `writeTimeout` (default: timeout): Timeout for write operations in ms
  - `connectTimeout` (default: timeout): Timeout for connection in ms

Methods:

- `await db.connect()` - Connect all drivers
- `await db.disconnect()` - Disconnect all drivers
- `await db.get(...path)` - Get value at path
- `await db.set(...path, value)` - Set value at path
- `await db.del(...path)` - Delete value at path
- `await db.inc(...path, amount)` - Increment numeric value
- `await db.dec(...path, amount)` - Decrement numeric value
- `await db.add(...path, value)` - Add item with auto-generated ID
- `await db.upd(...path, fn)` - Update value with function
- `await db.keys(...path)` - Get keys at path
- `await db.values(...path)` - Get values at path
- `await db.entries(...path)` - Get entries at path
- `await db.migrate(fromIndex, toIndex, options)` - Migrate data between drivers
- `await db.syncAll(options)` - Sync primary to all other drivers
- `db.getDriver(index)` - Get driver by index
- `db.getDrivers()` - Get all drivers
DeepBase v3.0+ provides built-in race condition protection for all drivers:

- `inc()` / `dec()` - Atomic increment/decrement
- `upd()` - Atomic read-modify-write
- `set()` - Safe concurrent writes
- `add()` - Unique ID generation without collisions
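On backends without native transactions, a common way to get such guarantees is to serialize operations through a promise chain. A minimal sketch of an operation queue in that style (illustration only, not DeepBase's actual code):

```javascript
// Each run() call chains onto the previous operation's promise,
// so operations execute one at a time and never interleave.
class OpQueue {
  constructor() {
    this.tail = Promise.resolve();
  }
  run(fn) {
    const result = this.tail.then(() => fn());
    // Keep the chain alive even if fn rejects.
    this.tail = result.catch(() => {});
    return result;
  }
}
```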
SQLite Driver: Uses native SQLite transactions for atomic operations
```javascript
// 100 concurrent increments = exactly 100 (no race conditions)
await Promise.all(
Array.from({ length: 100 }, () => db.inc('counter', 1))
);
```

JSON Driver: Uses an operation queue to serialize writes
```javascript
// Concurrent updates are safe - no data loss
await Promise.all([
db.upd('account', acc => ({ ...acc, balance: acc.balance + 50 })),
db.upd('account', acc => ({ ...acc, lastAccess: Date.now() }))
]);
```

See `examples/08-concurrency-safe.js` for detailed examples.

## Timeout Configuration
Prevent operations from hanging indefinitely with configurable timeouts:
```javascript
import DeepBase, { JsonDriver } from 'deepbase';
import RedisDriver from 'deepbase-redis';

// Global timeout for all operations
const db = new DeepBase(new JsonDriver(), {
  timeout: 5000 // 5 seconds for all operations
});

// Different timeouts for reads and writes
const db2 = new DeepBase([
  new RedisDriver({ url: 'redis://slow-server:6379' }),
  new JsonDriver({ path: './backup' }) // Fallback if Redis times out
], {
  readTimeout: 2000,    // 2 seconds for reads (get, keys, values, entries)
  writeTimeout: 5000,   // 5 seconds for writes (set, del, inc, dec, add, upd)
  connectTimeout: 10000 // 10 seconds for connection
});

try {
  const value = await db2.get('some', 'key');
} catch (error) {
  // Error: get() timed out after 2000ms
  console.error(error.message);
}
```

Timeout Options:
- `timeout` (default: 0): Global timeout in milliseconds for all operations (0 = disabled)
- `readTimeout` (default: timeout): Timeout for read operations
- `writeTimeout` (default: timeout): Timeout for write operations
- `connectTimeout` (default: timeout): Timeout for connection operations

Use Cases:
- Network issues: Prevent hanging on slow/unresponsive database servers
- Fast failover: Combined with multi-driver setup for automatic fallback
- Performance SLAs: Enforce response time requirements
- Debugging: Identify slow operations during development
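A per-operation timeout like this can be built with `Promise.race`; a minimal sketch of the idea (assumed implementation, not DeepBase's actual code):

```javascript
// Race the operation against a timer; 0 disables the timeout,
// matching the defaults documented above.
function withTimeout(promise, ms, label = 'operation') {
  if (!ms) return promise;
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`${label}() timed out after ${ms}ms`)),
      ms
    );
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```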
See `examples/09-timeout.js` for examples and `TIMEOUT_FEATURE.md` for detailed documentation.

## Available Drivers

### JSON Driver (`deepbase-json`)
Filesystem-based JSON storage. Perfect for:
- Development and testing
- Small to medium datasets
- Human-readable data
- No external dependencies
```javascript
new JsonDriver({
path: './data', // Storage directory
name: 'mydb', // Filename (mydb.json)
stringify: JSON.stringify, // Custom serializer
parse: JSON.parse // Custom parser
})
```

### SQLite Driver (`deepbase-sqlite`)
SQLite embedded database. Perfect for:
- Production applications
- Medium to large datasets
- Offline-first apps
- Desktop applications (Electron/Tauri)
- Serverless deployments
- ACID compliance required
```javascript
new SqliteDriver({
path: './data', // Storage directory
name: 'mydb' // Database filename (mydb.db)
})
```

No external dependencies required - embedded database!
### MongoDB Driver (`deepbase-mongodb`)
MongoDB storage. Perfect for:
- Production applications
- Large datasets
- Complex queries
- Scalability
```javascript
new MongoDriver({
url: 'mongodb://localhost:27017',
database: 'myapp', // Database name
collection: 'documents' // Collection name
})
```

Requires MongoDB:
```bash
docker run -d -p 27017:27017 mongodb/mongodb-community-server:latest
```

### Redis Driver (`deepbase-redis`)
Vanilla Redis storage (no modules required). Perfect for:
- Caching
- Session storage
- High-performance reads/writes
- Works with any Redis installation
```javascript
new RedisDriver({
url: 'redis://localhost:6379',
prefix: 'myapp' // Key prefix
})
```

Requires standard Redis:
```bash
docker run -d -p 6379:6379 redis:latest
```

Note: Uses JSON serialization. For atomic JSON operations, use `deepbase-redis-json` instead.

### Redis Stack Driver (`deepbase-redis-json`)
Redis Stack storage with RedisJSON module. Perfect for:
- Caching with large nested objects
- High-performance reads/writes
- Atomic JSON path operations
- Real-time applications
```javascript
import RedisDriver from 'deepbase-redis-json';

new RedisDriver({
url: 'redis://localhost:6379',
prefix: 'myapp' // Key prefix
})
```

Requires Redis Stack (includes RedisJSON):
```bash
docker run -d -p 6379:6379 redis/redis-stack-server:latest
```

Benefits over the vanilla Redis driver:
- Atomic JSON path operations
- More efficient for partial updates
- Native `JSON.NUMINCRBY` for atomic increments
## Custom JSON Serialization
DeepBase supports custom JSON serialization in the JSON driver, allowing for circular references and complex data structures.
### Using flatted
```javascript
import { parse, stringify } from 'flatted';
import DeepBase, { JsonDriver } from 'deepbase';

const db = new DeepBase(new JsonDriver({
path: './data',
name: 'mydb',
stringify,
parse
}));
await db.connect();
// Now you can store circular references
const obj = { name: 'circular' };
obj.self = obj; // circular reference
await db.set('circular', obj);
```

### Using circular-json
```javascript
import CircularJSON from 'circular-json';
import DeepBase, { JsonDriver } from 'deepbase';

const db = new DeepBase(new JsonDriver({
path: './data',
name: 'mydb',
stringify: (obj) => CircularJSON.stringify(obj, null, 4),
parse: CircularJSON.parse
}));
await db.connect();
await db.set("a", "b", { circular: {} });
await db.set("a", "b", "circular", "self", await db.get("a", "b"));
```

## Secure Storage with Encryption
You can create encrypted storage by extending DeepBase with custom serialization:
```javascript
import CryptoJS from 'crypto-js';
import DeepBase, { JsonDriver } from 'deepbase';

class DeepbaseSecure extends DeepBase {
constructor(opts) {
const encryptionKey = opts.encryptionKey;
delete opts.encryptionKey;
// Create JSON driver with encryption
const driver = new JsonDriver({
...opts,
stringify: (obj) => {
const iv = CryptoJS.lib.WordArray.random(128 / 8);
const encrypted = CryptoJS.AES.encrypt(
JSON.stringify(obj),
encryptionKey,
{ iv }
);
return iv.toString(CryptoJS.enc.Hex) + ':' + encrypted.toString();
},
parse: (encryptedData) => {
const [ivHex, encrypted] = encryptedData.split(':');
const iv = CryptoJS.enc.Hex.parse(ivHex);
const bytes = CryptoJS.AES.decrypt(encrypted, encryptionKey, { iv });
return JSON.parse(bytes.toString(CryptoJS.enc.Utf8));
}
});
super(driver);
}
}
// Create an encrypted database
const secureDB = new DeepbaseSecure({
path: './data',
name: 'secure_db',
encryptionKey: 'your-secret-key-here'
});
await secureDB.connect();
// Use it like a regular DeepBase instance
await secureDB.set("users", "admin", { password: "secret123" });
const admin = await secureDB.get("users", "admin");
console.log(admin); // { password: 'secret123' }
// But the file on disk is encrypted!
```

## Creating Custom Drivers
Extend `DeepBaseDriver` to create your own drivers:

```javascript
import { DeepBaseDriver } from 'deepbase';

class MyCustomDriver extends DeepBaseDriver {
  async connect() { /* ... */ }
  async disconnect() { /* ... */ }
  async get(...args) { /* ... */ }
  async set(...args) { /* ... */ }
  async del(...args) { /* ... */ }
  async inc(...args) { /* ... */ }
  async dec(...args) { /* ... */ }
  async add(...args) { /* ... */ }
  async upd(...args) { /* ... */ }
}
```

## Examples

Check the `/examples` folder for complete examples:

1. Simple JSON - Basic single-driver usage
2. Multi-Driver - MongoDB with JSON backup
3. Migration - Moving data from JSON to MongoDB
4. Three-Tier - Full production-ready setup
## Why DeepBase?

- Simple API: Intuitive nested object operations
- Flexible: Use any storage backend
- Resilient: Automatic failover and recovery
- Modular: Install only what you need
- Fast: Optimized for performance
- Universal: Works across platforms
- Production-ready: Battle-tested patterns
## Contributing

Contributions are welcome! Whether it's:

- Bug reports
- Feature requests
- Documentation improvements
- New drivers
## License

MIT License - Copyright (c) Martin Clasen
---
Try DeepBase today and simplify your data persistence!
DeepBase v3.0 delivers exceptional performance:

- Redis: 6,000-7,700 ops/sec for most operations
- JSON: 600,000+ ops/sec for cached reads
- MongoDB: 1,600-2,900 ops/sec balanced performance
See Benchmark Results for detailed performance analysis.
For more information, visit GitHub