A tiny (367B) utility for batching and caching operations
This is free-to-use software, but if you do like it, consider supporting me ❤️


```shell
npm add dldr
```
The default module will batch calls to your provided `loadFn` within the current tick.

Under the hood we schedule a flush with `queueMicrotask`, which then calls your `loadFn` with the unique keys that have been requested.
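Conceptually, the mechanism looks something like this simplified sketch (not dldr's actual source; it assumes a single loader and skips the key de-duplication and error handling the real module performs):

```ts
// simplified sketch of microtask batching, for illustration only
let batch: { key: string; resolve: (value: unknown) => void }[] = [];

function enqueue(
	loadFn: (keys: string[]) => Promise<unknown[]>,
	key: string,
): Promise<unknown> {
	if (batch.length === 0) {
		// first call this tick, so schedule a single flush
		queueMicrotask(async () => {
			const current = batch;
			batch = []; // later calls start a fresh batch
			const results = await loadFn(current.map((item) => item.key));
			current.forEach((item, i) => item.resolve(results[i]));
		});
	}
	return new Promise((resolve) => batch.push({ key, resolve }));
}
```

In practice you just call `load`: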
```ts
import { load } from 'dldr';
// ⬇️ define some arbitrary load method that accepts a single array of keys
const getPosts = (keys: string[]) => db.execute('SELECT id, name FROM posts WHERE id IN (?)', [keys]);
// .. for convenience, you could bind
const loadPost = load.bind(null, getPosts);
// ⬇️ demo some collection that is built up over time.
const posts = [
	load(getPosts, '123'),
	loadPost('123'), // functionally equivalent to the above
	load(getPosts, '456'),
];
// ...
posts.push(load(getPosts, '789'));
// ⬇️ batch the load calls, and wait for them to resolve
const loaded = await Promise.all(posts);
expect(getPosts).toHaveBeenCalledWith(['123', '456', '789']);
expect(loaded).toEqual([
	{ id: '123', name: '123' },
	{ id: '123', name: '123' },
	{ id: '456', name: '456' },
	{ id: '789', name: '789' },
]);
```
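Since `loadFn` resolves with `(T | Error)[]` (see the signatures below), a loader can report a per-key failure by returning an `Error` in that key's position; only that key's promise rejects while the rest resolve. A small sketch, using a hypothetical `getUsers` loader:

```ts
import { load } from 'dldr';

// hypothetical loader: returns an Error in the failing key's position
// instead of throwing, so the other keys still resolve
const getUsers = async (keys: string[]) =>
	keys.map((key) =>
		key === 'missing' ? new Error(`user ${key} not found`) : { id: key },
	);

const [ok, bad] = await Promise.allSettled([
	load(getUsers, '123'),
	load(getUsers, 'missing'),
]);

expect(ok).toEqual({ status: 'fulfilled', value: { id: '123' } });
expect(bad.status).toBe('rejected'); // only the failing key rejected
```

dldr also composes nicely with GraphQL resolvers, where many fields resolve within the same tick: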
```ts
import { load } from 'dldr';
import { graphql, buildSchema } from 'graphql';
const schema = buildSchema(`
	type Query {
		me(name: String!): String!
	}
`);

const operation = `
	{
		a: me(name: "John")
		b: me(name: "Jane")
	}
`;
const results = await graphql({
	schema,
	source: operation,
	contextValue: {
		getUser: load.bind(null, async (names) => {
			// assume you're calling out to a db or something here
			const result = names.map((name) => name);
			// let's pretend this is a promise
			return Promise.resolve(result);
		}),
	},
	rootValue: {
		me: ({ name }, ctx) => {
			return ctx.getUser(name);
		},
	},
});
```
With the `dldr/cache` module, once a key has been loaded it will be cached for all future calls.
```ts
import { load } from 'dldr/cache';
import { getPosts } from './example';
// operates the same as above, but caches the results of the load method
const cache = new Map();
const loadPost = load.bind(null, getPosts, cache);
// note: the cache argument is optional, and will be created if not provided

const loaded = await Promise.all([
	load(getPosts, cache, '123'),
	loadPost('123'), // will be cached, and functionally equivalent to the above
	loadPost('456'),
]);
expect(getPosts).toHaveBeenCalledTimes(1);
expect(getPosts).toHaveBeenCalledWith(['123', '456']);
expect(loaded).toEqual([
	{ id: '123', name: '123' },
	{ id: '123', name: '123' },
	{ id: '456', name: '456' },
]);
// ⬇️ the cache will be used for subsequent calls
const post = await loadPost('123');
expect(getPosts).toHaveBeenCalledTimes(1); // still once
expect(post).toEqual({ id: '123', name: '123' });
```
#### Module: dldr
The main entry point to start batching your calls.
```ts
function load<T>(
	loadFn: (keys: string[]) => Promise<(T | Error)[]>,
	key: string,
): Promise<T>;
```
> **Note** Might be worth calling `.bind` if you don't want to pass your loader everywhere.
>
> ```js
> const userLoader = load.bind(null, getUsers);
>
> await userLoader('123');
> ```
#### Module: dldr/cache
A submodule that will cache the results of your loadFn between ticks.
```ts
function load<T>(
	loadFn: (keys: string[]) => Promise<(T | Error)[]>,
	cache: MapLike<string, Promise<T>> | undefined,
	key: string,
): Promise<T>;
```
> A default `Map` based cache will be used if you don't provide one.
_Self managed cache_

We explicitly do not handle mutations, so if you wish to retrieve fresh entries or prime the cache, we recommend you do so yourself. All we require is a `Map`-like object.

Commonly an LRU cache is used; we recommend tmp-cache.
```ts
import LRU from 'tmp-cache';
import { load } from 'dldr/cache';

const loadUser = load.bind(null, getUsers, new LRU(100));
```
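Because cached values are the promises themselves (see the signature above), priming or evicting entries is plain `Map` manipulation. A sketch, again with a hypothetical `getUsers` loader:

```ts
import { load } from 'dldr/cache';

// hypothetical loader, for illustration
const getUsers = async (keys: string[]) => keys.map((id) => ({ id }));

const cache = new Map<string, Promise<{ id: string }>>();
const loadUser = load.bind(null, getUsers, cache);

// prime an entry yourself; note that values are promises
cache.set('123', Promise.resolve({ id: '123' }));

// evict an entry so the next call hits the loader again
cache.delete('123');
await loadUser('123'); // getUsers is called with ['123']
```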
#### Benchmark

> via the `/bench` directory with Node v18.16.1

```
✔ dldr             ~ 910,576 ops/sec ± 1.34%
✔ dldr/cache       ~ 636,467 ops/sec ± 4.47%
✔ dataloader       ~ 245,602 ops/sec ± 1.34%
✔ dataloader/cache ~ 153,254 ops/sec ± 0.64%
```

#### License

MIT © Marais Rossouw