LRU Cache that is safe for clusters, based on `lru-cache`. Save memory by only caching items on the main thread via a promisified interface.

LRU Cache for Clusters as Promised provides a cluster-safe lru-cache via Promises. For environments not using cluster, the class will provide a Promisified interface to a standard lru-cache.
Each time you call `cluster.fork()` a new worker is spawned to run your application. When using a load balancer, even if a user is assigned a particular IP and port, those values are shared between the workers in your cluster, so there is no guarantee that the user will hit the same worker between requests. Caching the same objects in multiple workers is not an efficient use of memory.
LRU Cache for Clusters as Promised stores a single lru-cache on the master thread which is accessed by the workers via IPC messages. The same lru-cache is shared between workers having a common master, so no memory is wasted.
When creating a new instance and `cluster.isMaster === true`, the shared cache is checked based on the namespace; if the shared cache is populated it will be used instead, but acted on locally rather than via IPC messages. If the shared cache is not populated, a new LRUCache instance is created and returned.
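
For illustration, here is a minimal sketch of that flow using the cluster module (the namespace, keys, and values are made up, and it assumes the package exports the LRUCacheForClustersAsPromised class directly):

```javascript
const cluster = require('cluster');
const LRUCacheForClustersAsPromised = require('lru-cache-for-clusters-as-promised');

if (cluster.isMaster) {
  // the master owns the real lru-cache and answers worker requests over IPC
  LRUCacheForClustersAsPromised.init();
  cluster.fork();
  cluster.fork();
} else {
  // every worker talks to the same cache on the master via promisified IPC calls
  const cache = new LRUCacheForClustersAsPromised({ namespace: 'users', max: 100 });
  cache
    .set('user-123', 'Jane')
    .then(() => cache.get('user-123'))
    .then((value) => console.log(value)); // 'Jane' from any worker
}
```
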
```shell
npm install --save lru-cache-for-clusters-as-promised
```

```shell
yarn add lru-cache-for-clusters-as-promised
```

## options

* `namespace: string`, default `"default"`
  * The namespace for this cache on the master thread, as it is not aware of the worker instances.
* `timeout: integer`, default `100`
  * The amount of time in milliseconds that a worker will wait for a response from the master before rejecting the `Promise`.
* `failsafe: string`, default `resolve`
  * When a request times out the `Promise` will return `resolve(undefined)` by default; with a value of `reject` the return will be `reject(Error)`.
* `max: number`
  * The maximum number of items that can be stored in the cache.
* `maxAge: milliseconds`
  * The maximum age for an item to be considered valid.
* `stale: true|false`
  * When `true`, expired items are returned before they are removed rather than `undefined`.
* `prune: false|crontime string`, defaults to `false`
  * Use a cron job on the master thread to call `prune()` on your cache at regular intervals specified by the `crontime` string, for example `*/30 * * * * *` would prune the cache every 30 seconds (see node-cron patterns for more info, and the example below). Also works in single-threaded environments not using the cluster module. Passing `false` for an existing namespace will disable any scheduled jobs.
* `parse: function`, defaults to `JSON.parse`
  * Pass in a custom parser function to use for deserializing data sent to/from the cache. This is set on the LRUCacheForClustersAsPromised instance and in theory could be different per worker.
* `stringify: function`, defaults to `JSON.stringify`
  * Pass in a custom stringify function to use for serializing data sent to/from the cache.

> ! note that `length` and `dispose` are missing as it is not possible to pass functions via IPC messages.
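
Putting the options together, a sketch of a fully configured instance (the values shown are illustrative, and `flatted` is just one example of a custom parse/stringify pair):

```javascript
const LRUCacheForClustersAsPromised = require('lru-cache-for-clusters-as-promised');
const { parse, stringify } = require('flatted');

const cache = new LRUCacheForClustersAsPromised({
  namespace: 'users',       // one underlying lru-cache per namespace on the master
  timeout: 200,             // wait up to 200ms for the master before failing
  failsafe: 'reject',       // reject(Error) on timeout instead of resolve(undefined)
  max: 500,                 // at most 500 items
  maxAge: 1000 * 60,        // items are valid for one minute
  stale: false,             // do not return expired items
  prune: '*/30 * * * * *',  // prune expired items on the master every 30 seconds
  parse,                    // custom (de)serializers, e.g. to support circular references
  stringify,
});
```
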
## api

### static functions

* `init(): void`
  * Should be called when `cluster.isMaster === true` to initialize the caches.
* `getInstance(options): Promise`
  * Asynchronously returns an LRUCacheForClustersAsPromised instance once the underlying LRUCache is guaranteed to exist. Uses the same options you would pass to the constructor. When constructed synchronously, other methods will ensure the underlying cache is created, but this method can be useful from a worker when you plan to interact with the caches directly. Note that this will slow down construction on the worker by a few milliseconds while the cache creation is confirmed.
* `getAllCaches(): { key: LRUCache }`
  * Synchronously returns a dictionary of the underlying LRUCache caches keyed by namespace. Accessible only when `cluster.isMaster === true`, otherwise throws an exception.
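
A sketch of the static functions in use (namespaces and keys are illustrative):

```javascript
const cluster = require('cluster');
const LRUCacheForClustersAsPromised = require('lru-cache-for-clusters-as-promised');

if (cluster.isMaster) {
  // set up the IPC listeners that serve cache requests from the workers
  LRUCacheForClustersAsPromised.init();

  // direct, synchronous access to the underlying lru-cache instances by namespace
  const caches = LRUCacheForClustersAsPromised.getAllCaches();
  console.log(Object.keys(caches));
} else {
  // resolves once the master confirms the underlying cache exists
  LRUCacheForClustersAsPromised.getInstance({ namespace: 'users', max: 50 })
    .then((cache) => cache.set('user-123', 'Jane'));
}
```
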
### instance functions

* `getCache(): LRUCache`
  * Gets the underlying LRUCache. Accessible only when `cluster.isMaster === true`, otherwise throws an exception.
* `set(key, value, maxAge): Promise`
  * Sets a value for a key. Specifying the `maxAge` will cause the value to expire per the `stale` value or when pruned.
* `setObject(key, object, maxAge): Promise`
  * Sets a cache value where the value is an object. Passes the value through `cache.stringify()`, which defaults to `JSON.stringify()`. Use a custom parser like `flatted` to handle cases like circular object references.
* `mSet({ key1: 1, key2: 2, ... }, maxAge): Promise`
  * Sets multiple key-value pairs in the cache at one time.
* `mSetObjects({ key1: { obj: 1 }, key2: { obj: 2 }, ... }, maxAge): Promise`
  * Sets multiple key-value pairs in the cache at one time, where each value is an object. Passes the values through `cache.stringify()`; see `cache.setObject()`.
* `get(key): Promise`
  * Returns the value for a key.
* `getObject(key): Promise`