# @bernierllc/cache-manager

Multi-tier caching with TTL support, cache invalidation, and multiple storage backends.
## Installation

```bash
npm install @bernierllc/cache-manager
```
For Redis support:
```bash
npm install @bernierllc/cache-manager redis
```
## Quick Start

```typescript
import { CacheManager } from '@bernierllc/cache-manager';

// Create cache with default memory backend
const cache = new CacheManager({
  strategy: 'lru',
  maxSize: 1000,
  defaultTtl: 60 * 1000 // 1 minute
});

// Set cache value
await cache.set('user:123', { name: 'John', email: 'john@example.com' });

// Get cache value
const user = await cache.get('user:123');
console.log(user); // { name: 'John', email: 'john@example.com' }

// Get or set pattern
const userData = await cache.getOrSet('user:456', async () => {
  return await fetchUserFromDatabase('456');
}, 5 * 60 * 1000); // Cache for 5 minutes
```
## Features

- Multiple Backends: Memory, Redis, database, multi-tier
- Cache Strategies: LRU, LFU, TTL-based eviction policies
- Intelligent Invalidation: Tag-based, pattern-based cache invalidation
- Serialization: JSON, binary, custom serializers
- Compression: Optional compression for large values
- Stats & Monitoring: Hit rates, memory usage, performance metrics
- Distributed Support: Redis-backed distributed caching
## API

#### Constructor
```typescript
const cache = new CacheManager(options: CacheOptions)
```
Options:
- `backend?: CacheBackend | CacheBackend[]` - Storage backend(s)
- `strategy?: 'lru' | 'lfu' | 'ttl'` - Eviction strategy (default: 'lru')
- `maxSize?: number` - Maximum cache size (default: 1000)
- `defaultTtl?: number` - Default TTL in milliseconds
- `keyPrefix?: string` - Prefix for all cache keys
- `onEviction?: (key, value) => void` - Eviction callback
- `onExpiration?: (key, value) => void` - Expiration callback
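A minimal sketch combining these options (the callbacks simply log; key names and values are illustrative):

```typescript
import { CacheManager } from '@bernierllc/cache-manager';

const cache = new CacheManager({
  strategy: 'lru',
  maxSize: 500,
  defaultTtl: 10 * 60 * 1000, // entries live 10 minutes unless a per-key TTL is given
  keyPrefix: 'myapp:',        // prepended to every cache key
  onEviction: (key, value) => console.log('evicted:', key),
  onExpiration: (key, value) => console.log('expired:', key)
});
```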
#### Core Methods
```typescript
// Basic operations
await cache.set(key: string, value: any, ttl?: number, tags?: string[]): Promise<void>
await cache.get<T>(key: string): Promise<T | null>
await cache.delete(key: string): Promise<boolean>
await cache.clear(): Promise<void>
await cache.has(key: string): Promise<boolean>

// Batch operations
await cache.mget<T>(keys: string[]): Promise<Array<T | null>>
await cache.mset(entries: Array<{ key: string; value: any; ttl?: number }>): Promise<void>
await cache.getOrSet<T>(key: string, factory: () => Promise<T>, ttl?: number): Promise<T>

// Invalidation
await cache.invalidateByTag(tag: string): Promise<void>
await cache.invalidateByPattern(pattern: string): Promise<void>

// Utilities
await cache.keys(pattern?: string): Promise<string[]>
await cache.size(): Promise<number>
await cache.getStats(): Promise<CacheStats>
await cache.cleanup(): Promise<void>
```
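The batch helpers cut round-trips when touching many keys at once. A short sketch (the exact `mset` entry shape shown here is an assumption based on the signatures above):

```typescript
// Write several entries in one call
await cache.mset([
  { key: 'user:1', value: { name: 'Ada' } },
  { key: 'user:2', value: { name: 'Lin' }, ttl: 60 * 1000 }
]);

// Read them back in one call; missing keys come back as null
const [u1, u2, u3] = await cache.mget(['user:1', 'user:2', 'user:3']);
console.log(u1, u2, u3); // { name: 'Ada' } { name: 'Lin' } null
```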
#### Memory Backend

```typescript
import { CacheManager, MemoryCacheBackend } from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: new MemoryCacheBackend({
    maxSize: 1000,
    strategy: 'lru'
  }),
  defaultTtl: 60 * 60 * 1000 // 1 hour
});

// Cache user data
await cache.set('user:123', {
  name: 'John Doe',
  email: 'john@example.com'
});

const user = await cache.get('user:123');
console.log(user.name); // 'John Doe'
```
#### Redis Backend

```typescript
import { CacheManager, RedisCacheBackend } from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: new RedisCacheBackend({
    host: 'localhost',
    port: 6379,
    keyPrefix: 'myapp:'
  }),
  defaultTtl: 15 * 60 * 1000 // 15 minutes
});

// Works across multiple application instances
await cache.set('global:config', configObject);
```
#### Multi-Tier Caching

```typescript
import {
  CacheManager,
  MemoryCacheBackend,
  RedisCacheBackend
} from '@bernierllc/cache-manager';

const cache = new CacheManager({
  backend: [
    new MemoryCacheBackend({ maxSize: 100 }),    // L1 cache - fast, small
    new RedisCacheBackend({ host: 'localhost' }) // L2 cache - shared, persistent
  ]
});

// Automatically checks L1, then L2; updates L1 on L2 hits
const data = await cache.get('expensive:computation');
```
#### Tag-Based Invalidation

```typescript
const cache = new CacheManager({
  backend: new RedisCacheBackend({ host: 'localhost' })
});

// Cache with tags
await cache.set('post:1', postData, 60 * 60 * 1000, ['user:123', 'category:tech']);
await cache.set('post:2', postData2, 60 * 60 * 1000, ['user:123', 'category:news']);

// Invalidate all posts by user
await cache.invalidateByTag('user:123');

// Invalidate all tech posts
await cache.invalidateByTag('category:tech');
```
#### Pattern-Based Invalidation

```typescript
// Cache user-specific data
await cache.set('user:123:profile', profileData);
await cache.set('user:123:settings', settingsData);
await cache.set('user:456:profile', otherProfileData);

// Invalidate all data for user 123
await cache.invalidateByPattern('user:123:*');
```
#### Cache Warming

```typescript
// Warm cache with frequently accessed data
async function warmCache() {
  const popularUsers = await getPopularUsers();
  for (const user of popularUsers) {
    await cache.set(`user:${user.id}`, user, 60 * 60 * 1000);
  }
}

// Background cache refresh
setInterval(async () => {
  const keys = await cache.keys('user:*');
  for (const key of keys) {
    const userId = key.split(':')[1];
    const freshData = await fetchUserFromDatabase(userId);
    await cache.set(key, freshData, 60 * 60 * 1000);
  }
}, 30 * 60 * 1000); // Refresh every 30 minutes
```
#### Monitoring & Cleanup

```typescript
// Monitor cache performance
setInterval(async () => {
  const stats = await cache.getStats();
  console.log(`Cache Performance:
  Hit Rate: ${(stats.hitRate * 100).toFixed(1)}%
  Size: ${stats.size} entries
  Memory: ${Math.round(stats.memoryUsage / 1024)}KB
  Evictions: ${stats.evictions}`);
}, 60000);

// Automatic cleanup
cache.startCleanupInterval(5 * 60 * 1000); // Clean every 5 minutes
```
#### MemoryCacheBackend
```typescript
new MemoryCacheBackend({
  maxSize: 1000,   // Maximum number of entries
  strategy: 'lru', // Eviction strategy
  onEviction: (key, value) => console.log('Evicted:', key),
  onExpiration: (key, value) => console.log('Expired:', key)
})
```
#### RedisCacheBackend
```typescript
new RedisCacheBackend({
  host: 'localhost',
  port: 6379,
  password: 'secret',  // Optional
  db: 0,               // Database number
  keyPrefix: 'cache:', // Key prefix
  connectionString: 'redis://localhost:6379' // Alternative to host/port
})
```
#### MultiTierCacheBackend
```typescript
new MultiTierCacheBackend({
  backends: [
    new MemoryCacheBackend({ maxSize: 100 }),
    new RedisCacheBackend({ host: 'localhost' })
  ]
})
```
#### Eviction Strategies

- LRU (Least Recently Used): Evicts the least recently accessed entries (demonstrated in the sketch below)
- LFU (Least Frequently Used): Evicts the least frequently accessed entries
- TTL (Time To Live): Evicts entries based on expiration time
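To make the difference concrete, here is a minimal sketch of LRU eviction using the documented `strategy` and `maxSize` options (keys and values are illustrative):

```typescript
const cache = new CacheManager({ strategy: 'lru', maxSize: 2 });

await cache.set('a', 1);
await cache.set('b', 2);
await cache.get('a');    // 'a' is now the most recently used entry
await cache.set('c', 3); // cache is full: the least recently used key ('b') is evicted

console.log(await cache.has('b')); // false - evicted
console.log(await cache.has('a')); // true  - kept, it was used recently
```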
#### Serialization

```typescript
import { CacheManager, JSONSerializer, BinarySerializer } from '@bernierllc/cache-manager';

// JSON serialization (default)
const jsonCache = new CacheManager({
  serializer: new JSONSerializer()
});

// Binary serialization for better performance
const binaryCache = new CacheManager({
  serializer: new BinarySerializer()
});
```
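The features list also mentions custom serializers and optional compression. One way to combine the two, sketched here with Node's built-in `zlib` (the `serialize`/`deserialize` method names are an assumption about the serializer interface, not confirmed by this document):

```typescript
import { gzipSync, gunzipSync } from 'zlib';
import { CacheManager } from '@bernierllc/cache-manager';

// Hypothetical custom serializer that gzips values on the way in
// and unzips them on the way out.
const compressingSerializer = {
  serialize(value: unknown): Buffer {
    return gzipSync(Buffer.from(JSON.stringify(value)));
  },
  deserialize(data: Buffer): unknown {
    return JSON.parse(gunzipSync(data).toString());
  }
};

const cache = new CacheManager({ serializer: compressingSerializer });
```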
#### Error Handling

```typescript
try {
  await cache.set('key', 'value');
  const value = await cache.get('key');
} catch (error) {
  console.error('Cache operation failed:', error);
  // Fall back to the original data source
}
```
#### Best Practices

1. Choose appropriate TTLs: Set TTL based on data freshness requirements
2. Use an appropriate cache size: Balance memory usage against hit rates
3. Implement cache warming: Preload frequently accessed data
4. Monitor performance: Track hit rates and adjust configuration accordingly
5. Handle failures gracefully: Always have fallback mechanisms (see the sketch after this list)
6. Use tags wisely: Group related cache entries for efficient invalidation
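Practices 1 and 5 combine naturally in a cache-aside helper. A minimal sketch (`fetchUserFromDatabase` is the same placeholder used in the Quick Start):

```typescript
// Read through the cache, but never let a cache failure break the request.
async function getUser(id: string) {
  try {
    return await cache.getOrSet(`user:${id}`, () => fetchUserFromDatabase(id),
      5 * 60 * 1000); // short TTL: user data goes stale quickly
  } catch (error) {
    console.error('Cache unavailable, hitting database directly:', error);
    return fetchUserFromDatabase(id); // graceful fallback
  }
}
```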
The cache manager is designed for high-performance scenarios:
- Memory backend: >50,000 operations/second
- Redis backend: >10,000 operations/second
- Multi-tier: Combines speed of memory with persistence of Redis
- Efficient serialization: Minimal overhead for data conversion
- Smart eviction: Algorithms optimized for real-world usage patterns
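Throughput figures like these are easy to sanity-check locally. A rough micro-benchmark sketch against the memory-backed `cache` from the Quick Start (results vary with hardware and payload size):

```typescript
// Rough ops/sec estimate: alternating set/get over a small key space
async function benchmark(ops = 50_000) {
  const start = process.hrtime.bigint();
  for (let i = 0; i < ops; i++) {
    await cache.set(`bench:${i % 1000}`, { i });
    await cache.get(`bench:${i % 1000}`);
  }
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${Math.round((ops * 2) / (elapsedMs / 1000))} ops/sec`);
}
```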
The cache manager supports optional logger integration using @bernierllc/logger:
```typescript
import { CacheManager } from '@bernierllc/cache-manager';
import { detectLogger } from '@bernierllc/logger';

const cache = new CacheManager({
  strategy: 'lru',
  maxSize: 1000
});

// Auto-detect logger if available
const logger = await detectLogger();
if (logger) {
  // Enhanced logging for cache operations
  cache.on('hit', (key) => {
    logger.debug('Cache hit', { key });
  });
  cache.on('miss', (key) => {
    logger.debug('Cache miss', { key });
  });
  cache.on('evicted', (key, reason) => {
    logger.info('Cache eviction', { key, reason });
  });
}
```
The cache manager integrates with NeverHub when available for enhanced service discovery and monitoring:
```typescript
import { CacheManager } from '@bernierllc/cache-manager';
import { detectNeverHub } from '@bernierllc/neverhub-adapter';

async function initializeCacheManager() {
  const cache = new CacheManager({
    strategy: 'lru',
    maxSize: 1000
  });

  // Auto-detect NeverHub
  const neverhub = await detectNeverHub();
  if (neverhub) {
    // Register cache manager as a service
    await neverhub.register({
      type: 'cache-manager',
      name: '@bernierllc/cache-manager',
      version: '1.0.0',
      capabilities: [
        { type: 'cache', name: 'memory', version: '1.0.0' },
        { type: 'cache', name: 'redis', version: '1.0.0' }
      ]
    });

    // Publish cache events
    cache.on('hit', async (key) => {
      await neverhub.publishEvent({
        type: 'cache.hit',
        data: { key, timestamp: Date.now() }
      });
    });

    // Subscribe to cache invalidation events
    await neverhub.subscribe('cache.invalidate', async (event) => {
      if (event.data.pattern) {
        await cache.invalidateByPattern(event.data.pattern);
      } else if (event.data.key) {
        await cache.delete(event.data.key);
      }
    });
  }

  return cache;
}
```
The cache manager implements graceful degradation patterns:
- Works without external services: Core functionality operates independently
- Logger integration: Enhanced monitoring when logger service is available
- NeverHub integration: Service discovery and events when NeverHub is present
- Backend flexibility: Falls back to memory storage if Redis is unavailable (see the sketch after this list)
- Error resilience: Cache failures don't break application functionality
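A minimal sketch of that Redis-to-memory fallback, assuming a cache operation against an unreachable Redis server throws (`createCache` is an illustrative helper name, not part of the library):

```typescript
import { CacheManager, MemoryCacheBackend, RedisCacheBackend } from '@bernierllc/cache-manager';

// Prefer Redis, but degrade to in-process memory if it cannot connect.
async function createCache(): Promise<CacheManager> {
  try {
    const cache = new CacheManager({
      backend: new RedisCacheBackend({ host: 'localhost', port: 6379 })
    });
    await cache.has('healthcheck'); // assumed to throw if Redis is unreachable
    return cache;
  } catch {
    console.warn('Redis unavailable, using memory backend');
    return new CacheManager({ backend: new MemoryCacheBackend({ maxSize: 1000 }) });
  }
}
```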
#### Dependencies

- Required: None (core functionality)
- Optional: `redis` for Redis backend support
- Optional: `lz4` for compression support
- Optional: `@bernierllc/logger` for enhanced logging capabilities
- Optional: `@bernierllc/neverhub-adapter` for service discovery integration
- Internal: `@bernierllc/connection-parser` for connection string parsing
Copyright (c) 2025 Bernier LLC. All rights reserved.