A simple promise cache
PromiseCacheX is a lightweight caching library that stores and manages asynchronous promises and synchronous values efficiently. It eliminates redundant requests, prevents race conditions, and automatically cleans up expired cache entries.
---
```sh
npm install promise-cachex
```
---
```typescript
import { PromiseCacheX } from 'promise-cachex';

const cache = new PromiseCacheX({ ttl: 5000, cleanupInterval: 2000 }); // 5s TTL, cleanup every 2s

async function fetchData() {
  return new Promise((resolve) => setTimeout(() => resolve('cached data'), 1000));
}

(async () => {
  const result1 = await cache.get('key1', fetchData, { ttl: 5000 });
  console.log(result1); // 'cached data'

  const result2 = await cache.get('key1', fetchData, { ttl: 5000 });
  console.log(result2); // Returns cached value immediately

  // Supports caching synchronous values too
  await cache.get('key2', 'static value');
  console.log(await cache.get('key2', 'static value')); // 'static value'
})();
```
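The deduplication claimed above comes from storing the pending promise itself, so concurrent calls for the same key share one in-flight request. A minimal standalone sketch of that idea (illustrative only; `TinyCoalescingCache` is a hypothetical name, not the library's implementation):

```typescript
// Sketch of promise coalescing: concurrent get() calls for the same key
// reuse the stored (possibly still pending) promise, so the fetcher runs once.
type Fetcher<T> = () => Promise<T>;

class TinyCoalescingCache {
  private store = new Map<string, Promise<unknown>>();

  get<T>(key: string, fetcher: Fetcher<T>): Promise<T> {
    const hit = this.store.get(key);
    if (hit) return hit as Promise<T>; // pending or resolved: reuse it
    const p = fetcher();
    this.store.set(key, p);
    return p;
  }
}

(async () => {
  let calls = 0;
  const cache = new TinyCoalescingCache();
  const fetcher = () => {
    calls++;
    return new Promise<string>((r) => setTimeout(() => r('data'), 50));
  };
  // Two concurrent requests, but the fetcher runs only once.
  const [a, b] = await Promise.all([cache.get('k', fetcher), cache.get('k', fetcher)]);
  console.log(a, b, calls); // -> data data 1
})();
```

Because the second call arrives before the first promise settles, it receives the same promise object rather than triggering a second fetch.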
---
- **Promise-Aware**: Stores and returns pending promises to avoid duplicate calls.
- **Supports Both Async and Sync Values**: Cache promises, async functions, sync functions, or direct values.
- **TTL Expiry**: Items automatically expire after a configurable time.
- **LRU Eviction**: Bounded caches with a least-recently-used eviction policy.
- **Automatic Cleanup**: Removes expired entries at a regular interval.
- **Manual Deletion**: Allows explicit cache clearing when needed.
- **Error Handling**: Removes failed promises from the cache.
- **Efficient & Fast**: Optimized for speed and low memory overhead.
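The error-handling bullet above (failed promises are removed) is what makes retries possible after a transient failure. A standalone sketch of that behavior (illustrative; `TinyErrorEvictingCache` is a hypothetical name, not the library's source):

```typescript
// Sketch: a rejected promise is deleted from the map so the next get() retries
// instead of replaying a cached failure.
class TinyErrorEvictingCache {
  private store = new Map<string, Promise<unknown>>();

  get<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
    const hit = this.store.get(key);
    if (hit) return hit as Promise<T>;
    const p = fetcher().catch((err) => {
      this.store.delete(key); // drop the failed entry so callers can retry
      throw err;
    });
    this.store.set(key, p);
    return p;
  }
}

(async () => {
  const cache = new TinyErrorEvictingCache();
  let attempts = 0;
  const flaky = () =>
    ++attempts === 1 ? Promise.reject(new Error('boom')) : Promise.resolve('ok');

  await cache.get('k', flaky).catch(() => {}); // first call fails, entry is dropped
  console.log(await cache.get('k', flaky), attempts); // -> ok 2
})();
```

Without the `catch`-and-delete step, the rejected promise would stay cached and every later call would fail immediately.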
---
For production use cases where memory must be bounded, use the maxEntries option to limit cache entries:
```typescript
const cache = new PromiseCacheX({
  ttl: 60000,
  maxEntries: 1000, // Maximum 1000 entries
});

// When the cache reaches 1000 entries, least recently used items are evicted
for (let i = 0; i < 1500; i++) {
  await cache.get(`key${i}`, `value${i}`);
}

console.log(cache.size()); // 1000
```
1. **Access Tracking**: Each get() call moves the item to the end of the internal Map (the most recently used position).
2. **Eviction on Insert**: When adding a new key would exceed maxEntries, the least recently used item is evicted in O(1) time.
3. **Pending Promise Protection**: Items with unresolved promises are never evicted, preserving promise coalescing.

- **TTL vs LRU**: Items may be evicted before their TTL expires if the cache is at capacity.
- **Backward Compatible**: If maxEntries is not set, the cache is unbounded (the original behavior).
- **Temporary Overflow**: If all items have pending promises, the cache may temporarily exceed maxEntries.
```typescript
// Pending promises are protected from eviction
const cache = new PromiseCacheX({ maxEntries: 2 });

const slow = cache.get('slow', () => new Promise((r) => setTimeout(() => r('done'), 5000)));

await cache.get('key1', 'value1');
await cache.get('key2', 'value2'); // Evicts key1, not 'slow'

console.log(cache.has('slow')); // true (protected while pending)
```
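The mechanics described above (Map order as recency order, eviction skipping pending entries) can be sketched as a standalone toy. This is an illustration of the technique, not the library's source; `TinyLruCache` and its fields are hypothetical names:

```typescript
// Sketch: a Map's insertion order doubles as recency order. get() re-inserts
// a key to mark it most recently used; eviction takes the first *settled*
// entry in iteration order, skipping pending promises.
interface Entry {
  value: Promise<unknown>;
  settled: boolean;
}

class TinyLruCache {
  private store = new Map<string, Entry>();
  constructor(private maxEntries: number) {}

  size(): number {
    return this.store.size;
  }

  has(key: string): boolean {
    return this.store.has(key);
  }

  async get<T>(key: string, value: T | Promise<T>): Promise<T> {
    const hit = this.store.get(key);
    if (hit) {
      // Re-insert to move the key to the most-recently-used position.
      this.store.delete(key);
      this.store.set(key, hit);
      return hit.value as Promise<T>;
    }
    if (this.store.size >= this.maxEntries) this.evictLru();
    const entry: Entry = { value: Promise.resolve(value), settled: false };
    entry.value.then(
      () => { entry.settled = true; },
      () => { entry.settled = true; }
    );
    this.store.set(key, entry);
    return entry.value as Promise<T>;
  }

  private evictLru(): void {
    // First settled entry in iteration order is the LRU candidate.
    for (const [key, entry] of this.store) {
      if (entry.settled) {
        this.store.delete(key);
        return;
      }
    }
    // All entries pending: allow a temporary overflow past maxEntries.
  }
}

(async () => {
  const cache = new TinyLruCache(2);
  await cache.get('a', 1);
  await cache.get('b', 2);
  await cache.get('a', 1); // touch 'a' -> 'b' becomes the LRU entry
  await cache.get('c', 3); // evicts 'b'
  console.log(cache.has('a'), cache.has('b'), cache.has('c')); // -> true false true
})();
```

Skipping pending entries in `evictLru` is what produces both the protection guarantee and the temporary-overflow caveat noted above.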
---
Creates a new instance of PromiseCacheX.
| Option          | Type   | Default            | Description                                             |
| --------------- | ------ | ------------------ | ------------------------------------------------------- |
| ttl             | number | 3600000 (1 hour)   | Default TTL in milliseconds. 0 means no TTL.            |
| cleanupInterval | number | 300000 (5 minutes) | Interval in milliseconds to remove expired items.       |
| maxEntries      | number | undefined          | Max cache entries. When reached, LRU items are evicted. |
---
Retrieves a cached value or fetches and caches it if not available.
| Option | Type | Default | Description |
| ------ | -------- | --------- | ------------------------------------------ |
| ttl | number | Cache TTL | TTL for the cached item. 0 means no TTL. |
FetchOrPromise<T> can be:

- An async function returning a promise (`() => Promise<T>`)
- A synchronous function returning a value (`() => T`)
- A direct promise (`Promise<T>`)
- A direct value (`T`)
```typescript
// Caching an async function
const result = await cache.get('key1', async () => 'value', { ttl: 5000 });

// Caching a synchronous function
const syncResult = await cache.get('key2', () => 'sync value');

// Caching a direct promise
const promiseResult = await cache.get('key3', Promise.resolve('promised value'));

// Caching a direct value
const directResult = await cache.get('key4', 'direct value');
```
---
Sets a value in the cache.
```typescript
cache.set('key1', 'value1', { ttl: 5000 });
```
---
Removes a specific key from the cache.
```typescript
cache.delete('key1');
```
---
Clears all cached entries.
```typescript
cache.clear();
```
---
Returns the number of cached items.
```typescript
console.log(cache.size());
```
---
Returns an array of cached keys.
```typescript
console.log(cache.keys());
```
---
Checks if a key exists in the cache.
```typescript
console.log(cache.has('key1'));
```
---
Returns the maximum entries limit, or undefined if unbounded.
```typescript
const cache = new PromiseCacheX({ maxEntries: 1000 });
console.log(cache.getMaxEntries()); // 1000
```
---
Returns true if the cache is at or over its maximum entries limit.
```typescript
const cache = new PromiseCacheX({ maxEntries: 2 });

await cache.get('key1', 'value1');
console.log(cache.isAtCapacity()); // false

await cache.get('key2', 'value2');
console.log(cache.isAtCapacity()); // true
```
---
PromiseCacheX lets you choose between strict, single-type caches and a loose, multi-type cache, and still allows per-call type parameters on get/set.

The class is generic: `PromiseCacheX<T>`.

- If you omit T, the cache runs in loose mode (accepts mixed value types).
- If you provide T, the cache runs in strict mode (all values must conform to T).
- You can still provide a type argument to get() and set(), but it is constrained so that U must extend the cache's type.
Method signatures (simplified):
```ts
class PromiseCacheX<T = unknown> {
  get<U extends T>(
    key: string,
    fetcherOrPromise: (() => Promise<U> | U) | Promise<U> | U,
    options?: { ttl?: number }
  ): Promise<U>;

  set<U extends T>(key: string, value: U | Promise<U>, options?: { ttl?: number }): void;
}
```
> Note: When T is omitted, the library treats it as "loose" so mixed types are allowed. When T is provided, U is constrained to that type.
---
When you don't pass a generic, you can mix types freely. You may still annotate each call for clarity.
```ts
import { PromiseCacheX } from 'promise-cachex';

const loose = new PromiseCacheX(); // T omitted -> loose mode

// Store different types
await loose.get<number>('n', 42);
await loose.get<string>('s', 'hello');
await loose.get<{ id: string }>('u1', Promise.resolve({ id: 'abc' }));
// All OK: loose mode accepts them
```
Use this for a shared utility cache with heterogeneous values.
---
Provide T to restrict the cache to a single type. Per-call generics on get/set must extend that type.
```ts
type User = { id: number; name: string };

const strict = new PromiseCacheX<User>();

// ✅ OK: value matches User
await strict.get<User>('u:1', { id: 1, name: 'Ana' });

// ❌ Error: string does not extend User
// await strict.get<string>('bad', 'oops');

// ✅ OK: promise of User
strict.set('u:2', Promise.resolve({ id: 2, name: 'Ion' }));
```
This is ideal for domain caches (e.g., Users, Products) where consistency matters.
---
Because U extends T, you can narrow the type on a call when it's safe:
```ts
type User = { id: number; name: string };
type MaybeUser = User | null;

const cache = new PromiseCacheX<MaybeUser>();

// ✅ OK: User is a subtype of User | null
const u = await cache.get<User>('u:1', { id: 1, name: 'Ana' });

// ✅ Also OK: storing null
await cache.get<MaybeUser>('missing', null);

// ❌ Error: string not assignable to User | null
// await cache.get<string>('bad', 'oops');
```

> Tip: Using unions like User | null lets you express cacheable absence while keeping strong typing.
---
- Loose mode (omit T): quick utility cache, heterogeneous values, prototyping.
- Strict mode (`PromiseCacheX<T>`): domain caches with strong guarantees and easier refactors.
---
Here are the latest performance benchmarks for PromiseCacheX:
| Task | Latency Avg (ns) | Throughput Avg (ops/s) |
| ------------------------------------ | ---------------- | ---------------------- |
| Cache 1,000 Keys | 344,310 | 3,685 |
| Cache 10,000 Keys | 3,463,749 | 308 |
| Retrieve Cached Values (1,000 keys) | 343,742 | 3,699 |
| Retrieve Cached Values (10,000 keys) | 3,475,116 | 307 |
| Task | Latency Avg (ns) | Throughput Avg (ops/s) | Notes |
| ------------------------------------- | ---------------- | ---------------------- | --------------------- |
| LRU Eviction (10k inserts, max 1,000) | 10,032,003 | 102 | 9,000 evictions |
| LRU Eviction (10k inserts, max 100) | 6,137,702 | 171 | 9,900 evictions |
| LRU Cache Hits with Reordering (1k) | 551,000 (median) | 1,815 | 1,000 Map reorder ops |
> Note: Smaller maxEntries can be faster because _findLRUCandidate() returns the first resolved item in O(1) time. With fewer entries, there's less chance of pending promises blocking eviction.
---
- Prevents duplicate async requests (efficient shared promises)
- Fast and lightweight (optimized caching)
- Ensures memory efficiency (auto-expiring cache)
- Great for API calls, database queries, and computations
- Supports both async and sync values (no need for multiple caching libraries)
---
MIT License.

Try PromiseCacheX today!