# High-Concurrency Cache Coalescing & State Management for Node.js

High-concurrency cache coalescing and state management library.

Prevent cache stampedes, eliminate duplicated fetches, and keep your APIs fast under extreme load.


---
In high-scale systems, when a popular cache key expires, hundreds or thousands of concurrent requests may hit your database at the same time before the cache is repopulated.
This phenomenon is known as the Cache Stampede or Thundering Herd problem.
Most traditional caching libraries (e.g. simple Redis wrappers):
- Only store values
- Do not coordinate concurrent requests
- Do not protect your database under high contention
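
For illustration, here is a naive get-or-fetch helper of the kind described above (hypothetical code, not taken from any particular library); because nothing coordinates concurrent misses, every waiting request runs the expensive fetcher itself:

```ts
// Naive cache wrapper: every concurrent miss triggers its own fetch.
const cache = new Map<string, { value: unknown; expiresAt: number }>();

async function naiveGet<T>(key: string, fetcher: () => Promise<T>, ttlMs: number): Promise<T> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value as T;
  }

  // No coordination: 1,000 concurrent misses mean 1,000 fetcher calls hitting the database.
  const value = await fetcher();
  cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}
```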
---
Arca is more than a cache client: it's a concurrency shield for Node.js applications.
1. Request Coalescing (Singleflight)
If 1,000 requests ask for the same key simultaneously, Arca executes the fetcher only once.
All requests await the same Promise (see the sketch after this list).
2. Stale-While-Revalidate (SWR)
Serve stale data instantly (near-zero latency) while refreshing the cache in the background.
3. Adapter-Agnostic Storage
Works out of the box with:
- In-Memory storage (default)
- Redis (for distributed systems)
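
For illustration, a minimal sketch of the singleflight idea behind request coalescing (not Arca's actual implementation): concurrent callers for the same key share a single in-flight Promise.

```ts
// Hypothetical singleflight helper: callers for the same key share one in-flight Promise.
const inflight = new Map<string, Promise<unknown>>();

function singleflight<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
  const existing = inflight.get(key);
  if (existing) {
    return existing as Promise<T>; // join the fetch that is already running
  }

  const promise = fetcher().finally(() => inflight.delete(key));
  inflight.set(key, promise);
  return promise;
}
```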
---
## Installation

```bash
# Using pnpm (recommended)
pnpm add @sabschyks/arca

# Or with npm
npm install @sabschyks/arca
```
---
## Quick Start
```ts
import { Arca } from '@sabschyks/arca';

// 1. Initialize Arca (defaults to in-memory storage)
const arca = new Arca({ defaultTtl: 60_000 }); // 1 minute

async function getUserProfile(userId: string) {
  // 2. Wrap your expensive operation
  return arca.get(`user:${userId}`, async () => {
    console.log('Fetching from database...');
    return db.query('SELECT * FROM users WHERE id = ?', [userId]);
  });
}

// 3. Simulate concurrent traffic
Promise.all([
  getUserProfile('123'),
  getUserProfile('123'),
  getUserProfile('123'),
]);

// ✅ Result:
// The database is hit only once.
// All requests resolve with the same data.
```

---
## Stale-While-Revalidate (SWR) Explained
Arca implements SWR to keep your application fast even when cached data expires.
Example timeline:
1. Time 0s
   Data is cached with a TTL of 60s.
2. Time 61s
   A request arrives. The cache entry is expired.
3. Arca behavior
   * Immediately returns the stale value (latency ≈ 0 ms)
   * Triggers a background refresh (singleflight-protected)
4. Next request
   * Receives the fresh data
This guarantees:
* Low latency
* No traffic spikes
* No duplicated fetches
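
For illustration, here is a minimal sketch of this read path (not Arca's internals), assuming a hypothetical cache entry that stores its value together with an expiry timestamp:

```ts
interface Entry<T> {
  value: T;
  expiresAt: number; // epoch milliseconds
}

// Hypothetical SWR read path: serve what we have immediately, refresh expired entries in the background.
async function swrGet<T>(
  store: Map<string, Entry<T>>,
  key: string,
  fetcher: () => Promise<T>,
  ttlMs: number,
): Promise<T> {
  const entry = store.get(key);

  if (!entry) {
    // Cold miss: the caller has to wait for the first fetch.
    const value = await fetcher();
    store.set(key, { value, expiresAt: Date.now() + ttlMs });
    return value;
  }

  if (entry.expiresAt <= Date.now()) {
    // Stale: return the old value now, refresh in the background
    // (in Arca this refresh is also singleflight-protected).
    void fetcher().then((value) => {
      store.set(key, { value, expiresAt: Date.now() + ttlMs });
    });
  }

  return entry.value;
}
```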
---
## Redis Adapter (Production Ready)
For distributed environments such as Kubernetes, Serverless, or multi-instance APIs, Arca supports Redis.
### Install ioredis

```bash
pnpm add ioredis
```

### Usage
```ts
import { Arca, RedisAdapter } from '@sabschyks/arca';

const arca = new Arca({
  storage: new RedisAdapter('redis://localhost:6379'),
  defaultTtl: 1000 * 60 * 5, // 5 minutes
});
```

---
## API Reference
### `new Arca(options)`
| Option       | Type             | Description                                      |
| ------------ | ---------------- | ------------------------------------------------ |
| `storage`    | `StorageAdapter` | Cache backend (default: `MemoryAdapter`).        |
| `defaultTtl` | `number`         | Default TTL in milliseconds (default: `60000`).  |

---
### `arca.get(key, fetcher, options?)`

Retrieve a value from cache or compute it safely under concurrency.

Parameters:

* `key: string`
  Unique cache identifier.
* `fetcher: () => Promise<T>`
  Function executed when the value is missing or stale.
* `options?`
  * `ttl?: number`: Override the TTL for this key.
  * `forceRefresh?: boolean`: Bypass the cache and fetch fresh data.
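
For example, a short usage sketch (the keys, queries, and `db` helper are illustrative; options are passed as the third argument, as listed above):

```ts
// Override the TTL for a single key (10 seconds instead of the instance default).
const report = await arca.get('report:daily', () => db.query('SELECT * FROM reports'), {
  ttl: 10_000,
});

// Bypass the cache and force a fresh fetch.
const fresh = await arca.get('user:123', () => db.query('SELECT * FROM users WHERE id = ?', ['123']), {
  forceRefresh: true,
});
```

---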
### `arca.delete(key)`
Manually invalidate a cache entry.
```ts
arca.delete('user:123');
```

---
Arca shines when:
* You have high-traffic endpoints.
* Requests often target the same resources.
* Cache expiration causes database spikes.
* You want zero-config protection against stampedes.