A batch manager that will deduplicate and batch requests for a certain data type made within a window.
## Install
```bash
npm install @yornaath/batshit
# or
yarn add @yornaath/batshit
```
## Quickstart
Here we are creating a simple batcher that will batch all fetches made within a window of 10 ms into one request.
```ts
import { create, keyResolver, windowScheduler } from "@yornaath/batshit";

type User = { id: number; name: string };

const users = create({
  // client is assumed to be your API client.
  fetcher: async (ids: number[]) => {
    return client.users.where({
      id_in: ids,
    });
  },
  resolver: keyResolver("id"),
  scheduler: windowScheduler(10), // Default and can be omitted.
});

/**
 * These requests are batched into one call since they are made within the same 10 ms window.
 */
const bob = users.fetch(1);
const alice = users.fetch(2);

const bobUndtAlice = await Promise.all([bob, alice]);

await delay(100); // wait longer than the 10 ms batching window

/**
 * These new requests are batched into another call since they are not made within the first window.
 */
const joe = users.fetch(3);
const margareth = users.fetch(4);

const joeUndtMargareth = await Promise.all([joe, margareth]);
```
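Duplicate fetches are also deduplicated. Continuing the example above, a minimal sketch: two fetches for the same id made within the window resolve from the same batched request.
```ts
// Both calls are made within the same 10 ms window, so they are
// deduplicated and served by one underlying request.
const first = users.fetch(1);
const second = users.fetch(1);

const [a, b] = await Promise.all([first, second]); // both resolve to user 1
```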
## React(query) Example
Here we again create a simple batcher that batches all fetches made within a 10 ms window into one request. Since all items are rendered in one go, their individual fetches are combined into a single request.
Note: a batcher for a group of items should only be created once, so creating one inside a hook won't work as intended.
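To illustrate the note above, here is a sketch of the anti-pattern (the hook name is hypothetical): creating the batcher inside a hook gives every render its own batcher instance, so fetches from different components and renders can never share a batch.
```ts
import { useQuery } from "react-query";
import { create, keyResolver, windowScheduler } from "@yornaath/batshit";

// Anti-pattern: do NOT do this. A new batcher is created on every render,
// so nothing is batched across components or renders.
const useUserUnbatched = (id: number) => {
  const users = create({
    fetcher: async (ids: number[]) => client.users.where({ id_in: ids }),
    resolver: keyResolver("id"),
    scheduler: windowScheduler(10),
  });
  return useQuery(["users", id], () => users.fetch(id));
};
```
The example below instead creates the batcher once at module scope.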
```tsx
import { useQuery } from "react-query";
import { create, keyResolver, windowScheduler } from "@yornaath/batshit";

const users = create({
  fetcher: async (ids: number[]) => {
    return client.users.where({
      userId_in: ids,
    });
  },
  resolver: keyResolver("id"),
  scheduler: windowScheduler(10),
});

const useUser = (id: number) => {
  return useQuery(["users", id], async () => {
    return users.fetch(id);
  });
};

const UserDetails = (props: { userId: number }) => {
  const { isFetching, data } = useUser(props.userId);
  return (
    <>
      {isFetching || !data ? (
        <div>Loading user {props.userId}</div>
      ) : (
        <div>User: {data.name}</div>
      )}
    </>
  );
};

/**
 * Since all user detail items are rendered within the same window, only one request is made.
 */
const UserList = () => {
  const userIds = [1, 2, 3, 4];
  return (
    <>
      {userIds.map((id) => (
        <UserDetails userId={id} key={id} />
      ))}
    </>
  );
};
```
## Limiting Batch Size
We provide two helper functions for limiting the number of batched fetch calls.
#### windowedFiniteBatchScheduler
This will batch all calls made within a certain time frame, up to a maximum batch size, before starting a new batch.
```ts
const batcher = batshit.create({
  ...,
  scheduler: batshit.windowedFiniteBatchScheduler({
    windowMs: 10,
    maxBatchSize: 100,
  }),
});
```
#### maxBatchSizeScheduler
Same as the one above, but with no time window: it will wait indefinitely until the max batch size is met.
```ts
const batcher = batshit.create({
  ...,
  scheduler: batshit.maxBatchSizeScheduler({
    maxBatchSize: 100,
  }),
});
```
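For instance, with a small max batch size the batch is dispatched as soon as the last slot is filled. A minimal sketch, assuming a hypothetical fetchUsersByIds helper that returns objects with an id field:
```ts
import { create, keyResolver, maxBatchSizeScheduler } from "@yornaath/batshit";

const pairs = create({
  // fetchUsersByIds is a hypothetical helper returning { id, name } objects.
  fetcher: async (ids: number[]) => fetchUsersByIds(ids),
  resolver: keyResolver("id"),
  scheduler: maxBatchSizeScheduler({ maxBatchSize: 2 }),
});

// The batch executes as soon as the second fetch is queued;
// a single queued fetch would wait indefinitely.
const [userOne, userTwo] = await Promise.all([pairs.fetch(1), pairs.fetch(2)]);
```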
## indexedResolver - Record Responses
In this example the response is an object/record with the id of the user as the key and the user object as the value.
Example:
```json
{
  "1": { "username": "bob" },
  "2": { "username": "alice" }
}
```
```ts
import * as batshit from "@yornaath/batshit";

const batcher = batshit.create({
  fetcher: async (ids: string[]) => {
    const users: Record<string, { username: string }> = await fetchUserRecords(ids);
    return users;
  },
  resolver: batshit.indexedResolver(),
});
```
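Individual fetches then resolve to the value stored under the requested key. Continuing the setup above with the record response shown earlier:
```ts
const bob = await batcher.fetch("1"); // -> { username: "bob" }
const alice = await batcher.fetch("2"); // -> { username: "alice" }
```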
## Batching With Context (Memoized Batchers)
If the batch fetcher needs some context, such as an SDK or API client, to do its fetching, you can use a memoizer to make sure a batcher is reused for a given context across hook calls.
```ts
import { useQuery } from "@tanstack/react-query";
import { memoize } from "lodash-es";
import * as batshit from "@yornaath/batshit";

export const key = "markets";

const batcher = memoize((sdk: Sdk) => {
  return batshit.create({
    name: key,
    fetcher: async (ids: number[]) => {
      const { markets } = await sdk.markets({
        where: {
          marketId_in: ids,
        },
      });
      return markets;
    },
    scheduler: batshit.windowScheduler(10),
    resolver: batshit.keyResolver("marketId"),
  });
});

export const useMarket = (marketId: number) => {
  const [sdk, id] = useSdk();

  const query = useQuery(
    [id, key, marketId],
    async () => {
      if (sdk) {
        return batcher(sdk).fetch(marketId);
      }
    },
    {
      enabled: Boolean(sdk),
    },
  );

  return query;
};
```
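Components using the hook can then be rendered together as usual. A sketch (the MarketName component and the rendered field are illustrative): markets requested within the same 10 ms window against the same sdk instance share a single sdk.markets call.
```tsx
const MarketName = (props: { marketId: number }) => {
  const { data: market } = useMarket(props.marketId);
  return <div>{market ? market.marketId : "loading..."}</div>;
};

// All three fetches happen within the same window against the same sdk,
// so they are batched into one sdk.markets call.
const Markets = () => (
  <>
    <MarketName marketId={1} />
    <MarketName marketId={2} />
    <MarketName marketId={3} />
  </>
);
```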
## Custom Batch Resolver
This batcher will fetch all posts for multiple users in one request and resolve the correct list of posts for the discrete queries.
```ts
const userposts = create({
  fetcher: async (queries: { authorId: number }[]) => {
    return api.posts.where({
      authorId_in: queries.map((q) => q.authorId),
    });
  },
  scheduler: windowScheduler(10),
  resolver: (posts, query) =>
    posts.filter((post) => post.authorId === query.authorId),
});

const [alicesPosts, bobsPosts] = await Promise.all([
  userposts.fetch({ authorId: 1 }),
  userposts.fetch({ authorId: 2 }),
]);
```
## batcher.next() - Early Execution
Calling batcher.next() will execute the current batch early, even if the scheduler hasn't finished.
```ts
const batcher = create({
  fetcher: async (ids: number[]) => {
    return mock.usersByIds(ids);
  },
  resolver: keyResolver("id"),
  scheduler: windowScheduler(33),
});

const all = Promise.all([batcher.fetch(1), batcher.fetch(2), batcher.fetch(3), batcher.fetch(4)]);

batcher.next();

const users = await all; // the current batch of users [1, 2, 3, 4] is fetched immediately
```
## Abort Signals - batcher.abort()
You can abort the current batch with batcher.abort().
An AbortSignal is passed to the fetcher so it can be forwarded to the underlying implementation.
```ts
test("aborting", async () => {
  const batcher = create({
    fetcher: async (ids: number[], signal: AbortSignal) => {
      const res = await fetch(`/users?ids=${ids.join(",")}`, { signal });
      return res.json();
    },
    resolver: keyResolver("id"),
  });

  const all = Promise.all([batcher.fetch(1), batcher.fetch(2), batcher.fetch(3), batcher.fetch(4)]);

  setTimeout(() => {
    batcher.abort();
  }, 5);

  const error = await all.catch((error) => error);

  expect(error).toBeInstanceOf(DOMException);
  expect(error.message).toBe("Aborted");
});
```
## Indexed keyResolver Performance
If your batches are big arrays (> 30K items) and you use the keyResolver, turning on indexing can give you a performance boost.
__Below 30K items per batch the performance gain is negligible, but anywhere above 15K it can be worth it.__
```ts
const batcherIndexed = create({
  fetcher: async (ids: number[]) => {
    // returns an array of more than 30K items
    return mock.bigUserById(ids);
  },
  resolver: keyResolver("id", { indexed: true }),
  scheduler: windowScheduler(1000),
});
```
## React Devtools
Tools to debug and inspect the batching process can be found in the @yornaath/batshit-devtools-react package.
```bash
yarn add @yornaath/batshit-devtools @yornaath/batshit-devtools-react
```
```tsx
import { create, keyResolver, windowScheduler } from "@yornaath/batshit";
import BatshitDevtools from "@yornaath/batshit-devtools-react";

const batcher = create({
  fetcher: async (queries: number[]) => {...},
  scheduler: windowScheduler(10),
  resolver: keyResolver("id"),
  name: "batcher:data", // used in the devtools to identify a particular batcher
});

const App = () => {
  return (
    <>
      {/* ...application UI... */}
      <BatshitDevtools />
    </>
  );
};
```