# swarpc

Fully type-safe RPC library for Service (and other types of) Workers -- move that heavy computation off of your UI thread with ease!

`npm install swarpc`
- Fully typesafe
- Lightweight: no dependencies, less than 5 kB (minified+gzipped)
- Supports any Standard Schema-compliant validation library (ArkType, Zod, Valibot, etc.)
- Cancelable requests
- Parallelization with multiple worker instances
- Automatic transfer of transferable values to and from worker code
- A way to polyfill a pre-filled `localStorage` that can be accessed within worker code
- First-class support for signaling progress updates (to e.g. display a progress bar)
- Supports Service workers, Shared workers and Dedicated workers
## Installation

```bash
npm add swarpc
```

Also add a Standard Schema-compliant validation library of your choosing, for example:

```bash
npm add arktype
```
### Bleeding edge

If you want to use the latest commit instead of a published version, you can do so either by using the Git URL:
```bash
npm add git+https://github.com/gwennlbh/swarpc.git
```
Or by cloning the repository and pointing to the local directory (very useful for hacking on sw&rpc while testing out your changes on a more substantial project):
```bash
mkdir -p vendored
git clone https://github.com/gwennlbh/swarpc.git vendored/swarpc
npm add file:vendored/swarpc
```
This works because `dist/` is published to the repository (and kept up to date by a CI workflow).
## Usage

> [!NOTE]
> We use ArkType in the following examples, but, as stated above, any validation library is a-okay (provided that it is Standard Schema v1-compliant).

First, declare your procedures in a file that both the client and worker code can import:
```typescript
import type { ProceduresMap } from "swarpc";
import { type } from "arktype";

export const procedures = {
  searchIMDb: {
    // Input for the procedure
    input: type({ query: "string", "pageSize?": "number" }),
    // Shape of the progress updates emitted while the procedure is running --
    // long computations are a first-class concern here (see e.g. the fetch-progress NPM package)
    progress: type({ transferred: "number", total: "number" }),
    // Output of a successful procedure call
    success: type({
      id: "string",
      primary_title: "string",
      genres: "string[]",
    }).array(),
  },
} as const satisfies ProceduresMap;
```
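As the note above says, any Standard Schema-compliant library works. For comparison, here's a sketch of the same map written with Zod instead (assuming a Zod version that implements Standard Schema, i.e. 3.24 or later):

```typescript
import type { ProceduresMap } from "swarpc";
import { z } from "zod";

export const procedures = {
  searchIMDb: {
    input: z.object({ query: z.string(), pageSize: z.number().optional() }),
    progress: z.object({ transferred: z.number(), total: z.number() }),
    success: z.array(
      z.object({
        id: z.string(),
        primary_title: z.string(),
        genres: z.array(z.string()),
      }),
    ),
  },
} as const satisfies ProceduresMap;
```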
In your worker file:
```javascript
import fetchProgress from "fetch-progress"
import { Server } from "swarpc"
import { procedures } from "./procedures.js"

// 1. Give yourself a server instance
const swarpc = Server(procedures)

// 2. Implement your procedures
swarpc.searchIMDb(async ({ query, pageSize = 10 }, onProgress) => {
  const queryParams = new URLSearchParams({
    page_size: pageSize.toString(),
    query,
  })

  return fetch(`https://rest.imdbapi.dev/v2/search/titles?${queryParams}`)
    .then(fetchProgress({ onProgress }))
    .then((response) => response.json())
    .then(({ titles }) => titles)
})

// ...

// 3. Start the event listener
swarpc.start(self)
```
Here's a Svelte example!

```svelte
{#if progress > 0 && progress < 1}
  <progress value={progress}></progress>
{/if}
```
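For the client-side wiring behind that snippet, here's a minimal hypothetical sketch: only the `Client(procedures)` constructor and the `swarpc.searchIMDb({ query })` call shape are documented in this README, and how progress updates get surfaced to the `progress` variable is left out here.

```ts
import { Client } from "swarpc";
import { procedures } from "./procedures.js";

// With no `worker` option, the client talks to the Service worker (see "Worker types" below)
const swarpc = Client(procedures);

// Runs searchIMDb in the worker, off the UI thread
const titles = await swarpc.searchIMDb({ query: "millennium actress" });
```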
## Worker types
#### Service Workers
If you use SvelteKit, just name your service worker file `src/service-worker.ts`.

_If you use any other (meta) framework, please contribute usage documentation here :)_
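As a minimal sketch (the `$lib/procedures` import path is an assumption; implement your procedures exactly as in the worker example above):

```ts
// src/service-worker.ts -- SvelteKit picks this file up automatically
import { Server } from "swarpc";
import { procedures } from "$lib/procedures";

const swarpc = Server(procedures);

// ...implement each procedure here, as shown earlier...

swarpc.start(self);
```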
#### Dedicated or Shared Workers
Preferred over Service workers for heavy computations, since you can run multiple instances of them (see "Configure parallelism" below).
If you use Vite, you can import files as Web Worker classes:
```ts
import { Client } from "swarpc";
import { procedures } from "$lib/off-thread/procedures.ts";
import OffThreadWorker from "$lib/off-thread/worker.ts?worker";

const client = Client(procedures, {
  worker: OffThreadWorker, // don't instantiate the class, sw&rpc does it
});
```

## Configure parallelism
By default, when a `worker` is passed to the Client's options, the client will automatically spin up `navigator.hardwareConcurrency` worker instances and distribute requests among them. You can customize this behavior by setting the Client's `nodes` option to control the number of _nodes_ (worker instances).

When the `worker` option is not set, the client will use the Service worker (and thus only a single instance).

#### Send to multiple nodes
Use `Client#(method name).broadcast` to send the same request to all nodes at once. This method returns a Promise that resolves to an array of `PromiseSettledResult`s (with an additional property, `node`: the ID of the node the request was sent to), one per node the request was sent to.

For example:
```ts
const client = Client(procedures, {
  worker: MyWorker,
  nodes: 4,
});

for (const result of await client.initDB.broadcast("localhost:5432")) {
  if (result.status === "rejected") {
    console.error(
      `Could not initialize database on node ${result.node}`,
      result.reason,
    );
  }
}
```

You also have a very convenient way to aggregate the results of all nodes, if you don't need to handle errors in a fine-grained way:
```ts
const userbase = await client.tableSize.broadcast
  .orThrow("users")
  .then((counts) => sum(counts))
  .catch((e) => {
    // e is an AggregateError with every failing node's error
    console.error("Could not get total user count:", e);
  });
```

Otherwise, you have access to a handful of convenience properties on the returned array, to help you narrow down what happened on each node:
```ts
async function userbase() {
  const counts = await client.tableSize.broadcast("users");

  if (counts.ko) {
    throw new Error(
      `All nodes failed to get table size: ${counts.failureSummary}`,
    );
  }

  return {
    exact: counts.ok,
    count:
      sum(counts.successes) +
      average(counts.successes) * counts.failures.length,
  };
}
```

## Cancelable requests
#### Implementation
To make your procedures meaningfully cancelable, you have to make use of the `AbortSignal` API. A signal is passed as part of the third argument when implementing your procedures:

```js
server.searchIMDb(async ({ query }, onProgress, { abortSignal }) => {
  // If you're doing heavy computation without fetch:
  // use abortSignal?.throwIfAborted() within hot loops and at key points
  for (...) {
    abortSignal?.throwIfAborted();
    ...
  }

  // When using fetch:
  await fetch(..., { signal: abortSignal })
})
```

#### Call sites
Instead of calling `await client.myProcedure()` directly, call `client.myProcedure.cancelable()`. You'll get back an object with:

- `async cancel(reason)`: a function to cancel the request
- `request`: a Promise that resolves to the result of the procedure call. `await` it to wait for the request to finish.

Example:
```js
// Normal call:
const result = await swarpc.searchIMDb({ query });

// Cancelable call:
const { request, cancel } = swarpc.searchIMDb.cancelable({ query });
setTimeout(() => cancel().then(() => console.warn("Took too long!!")), 5_000);
await request;
```

## Once mode
The "once" mode allows you to automatically cancel any previous ongoing call before running a new one. This is useful for scenarios like search-as-you-type, where you only care about the latest request.
#### Method-scoped once mode
Cancel any previous call of the same method:
```js
// If any previous call of searchIMDb is ongoing, it gets cancelled beforehand
const result = await swarpc.searchIMDb.once({ query });
```

#### Method-scoped once mode with key
Cancel any previous call of the same method with the same key:
```js
// If any previous call of searchIMDb with "foo" as the key is ongoing,
// it gets cancelled beforehand
const result = await swarpc.searchIMDb.onceBy("foo", { query });
```

This allows multiple concurrent calls with different keys:
```js
// These two calls can run concurrently
const result1 = await swarpc.searchIMDb.onceBy("search-bar", {
query: "action",
});
const result2 = await swarpc.searchIMDb.onceBy("sidebar", { query: "comedy" });
```

#### Global once mode
Cancel any ongoing call with the same global key, across all methods:
```js
// Any call from ANY procedure with "global-search" key gets cancelled beforehand
const result = await swarpc.onceBy("global-search").searchIMDb({ query });
```

This is useful when you want to ensure only one operation of a certain type is running at a time, regardless of which procedure is being called.
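For instance, two different procedures can share a global key. In this sketch, `searchActors` is a hypothetical procedure (not defined in the examples above), and we assume a cancelled call rejects its promise:

```js
// Fire a title search, then immediately start an actor search with the same
// global key: the title search gets cancelled before the actor search runs.
const titleSearch = swarpc.onceBy("global-search").searchIMDb({ query: "dune" });
titleSearch.catch(() => console.warn("Title search was superseded"));

const actors = await swarpc.onceBy("global-search").searchActors({ query: "dune" });
```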
#### With broadcasting
You can combine "once" mode with broadcasting as well; just use `.broadcast.once` or `.broadcast.onceBy` instead of `.once` or `.onceBy`:

```js
// Load the inference model on all nodes. If we call this again before the previous model finishes loading,
// the previous load requests get cancelled.
await swarpc.loadInferenceModel.broadcast.once({ url });
```

## localStorage polyfill
You might call third-party code that accesses `localStorage` from within your procedures. Some workers don't have access to the browser's `localStorage`, so you'll get an error.

You can work around this by telling swarpc which `localStorage` items to define on the Server, and it'll create a polyfilled `localStorage` with your data.

An example use case is using Paraglide, an i18n library, with the `localStorage` strategy:

```js
// In the client
import { getLocale } from "./paraglide/runtime.js";

const swarpc = Client(procedures, {
  localStorage: {
    PARAGLIDE_LOCALE: getLocale(),
  },
});

await swarpc.myProcedure(1, 0);

// In the server
import { m } from "./paraglide/runtime.js";

const swarpc = Server(procedures);

swarpc.myProcedure(async (a, b) => {
  if (b === 0) throw new Error(m.cannot_divide_by_zero());
  return a / b;
});
```