Simple parallel execution library for Node.js using Worker Threads. Execute CPU-intensive tasks with thread pools and isolated workers. Supports both regular and arrow functions.
```bash
npm install stardust-parallel-js
```

A library for parallel execution of JavaScript/TypeScript functions using Worker Threads in Node.js.
Benchmarks on a 4-core CPU:

| Task | Sequential | Parallel (4 workers) | Speedup |
|------|-----------|---------------------|---------|
| Fibonacci(35-42) computation | 5113 ms | 2606 ms | 1.96x |
| Processing 50 items | 936 ms | 344 ms | 2.72x |
> Performance improvement: up to a 63% reduction in execution time on CPU-intensive tasks (the 2.72x speedup above).
- Real speedup on multi-core processors
- Simple API for parallel task execution
- Thread pool for efficient resource management
- Single thread support for one-off tasks
- Persistent threads for long-running background tasks
- Worker prewarming to reduce thread creation overhead
- Task TTL (timeout) support for better control
- Pool statistics and monitoring with getStats()
- Full TypeScript support
- Automatic recovery of crashed threads
- Array processing similar to map(), but parallel
- Automatic extraction and transfer of Transferable objects
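For example, a large typed array's underlying ArrayBuffer is a Transferable object. A minimal sketch of the last point, assuming the pool detects that buffer automatically and transfers it to the worker instead of copying it:

```typescript
import { ThreadPool } from 'stardust-parallel-js';

const pool = new ThreadPool(2);

// Large typed array; its underlying ArrayBuffer is a Transferable object.
const samples = new Float64Array(1_000_000);
for (let i = 0; i < samples.length; i++) samples[i] = i;

// Assumption: the pool extracts the ArrayBuffer and transfers it rather than
// structured-cloning it, so only ownership moves to the worker (and the
// buffer is no longer usable on the main thread afterwards).
const sum = await pool.execute(
  (data: Float64Array) => data.reduce((acc, x) => acc + x, 0),
  [samples]
);

console.log(sum); // 499999500000
await pool.terminate();
```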
```bash
npm install stardust-parallel-js
# or
pnpm install stardust-parallel-js
# or
yarn add stardust-parallel-js
```
Basic usage example:
```typescript
import { ThreadPool } from 'stardust-parallel-js';

const pool = new ThreadPool(4);

// Sequential execution
const sequentialResults = data.map(item => heavyComputation(item));

// Parallel execution
const parallelResults = await pool.map(data, item => heavyComputation(item));

await pool.terminate();
```
Use ThreadPool to process multiple tasks with maximum efficiency:
```typescript
import { ThreadPool } from 'stardust-parallel-js';
// Create a pool of 4 threads (matching CPU cores)
const pool = new ThreadPool(4);
// Process array in parallel
const numbers = [1, 2, 3, 4, 5, 6, 7, 8];
const squares = await pool.map(numbers, (n: number) => n * n);
console.log(squares); // [1, 4, 9, 16, 25, 36, 49, 64]
// CPU-intensive computations
const result = await pool.execute(
(n: number) => {
let sum = 0;
for (let i = 0; i < n; i++) {
sum += Math.sqrt(i);
}
return sum;
},
[1000000]
);
// Release resources
await pool.terminate();
```
Use Thread for one-off operations:
```typescript
import { Thread } from 'stardust-parallel-js';
// Execute function and wait for result
const thread = Thread.execute(
(text: string) => text.toUpperCase(),
['hello world']
);
const result = await thread.join();
console.log(result); // "HELLO WORLD"
// Arrow functions work!
const thread2 = Thread.execute(x => x * 2, [21]);
console.log(await thread2.join()); // 42
// With TTL (timeout in milliseconds)
const thread3 = Thread.execute(
(n: number) => {
// Some heavy computation
return n * 2;
},
[42],
5000 // 5 seconds timeout
);
// Persistent thread for long-running tasks
const persistent = Thread.persistent(
() => {
setInterval(() => {
console.log('Running...');
}, 1000);
},
[]
);
persistent.onError((error) => {
console.error('Thread error:', error);
});
// Don't forget to terminate when done
persistent.terminate();
```
```typescript
import { ThreadPool } from 'stardust-parallel-js';

const pool = new ThreadPool(8);
const images = ['img1.jpg', 'img2.jpg', /* ... */ 'img100.jpg'];

// Process 100 images in parallel
const processed = await pool.map(images, (path: string) => {
  // Modules must be loaded inside the worker function
  const sharp = require('sharp');
  // Complex image processing (processImage is a placeholder –
  // define it inside this callback or load it via require())
  return processImage(path);
});

await pool.terminate();
```
```typescript
const pool = new ThreadPool(4);
const chunks = splitDataIntoChunks(bigData, 1000);

// Parse each chunk in parallel
const parsed = await pool.map(chunks, (chunk: any[]) => {
  // parseComplexData is a placeholder – define it inside this callback
  // or load it via require(), since closures are not available in workers
  return chunk.map(item => parseComplexData(item));
});

await pool.terminate();
```
```typescript
const pool = new ThreadPool(4);
const results = await pool.map([35, 36, 37, 38, 39, 40], n => {
function fibonacci(num: number): number {
if (num <= 1) return num;
return fibonacci(num - 1) + fibonacci(num - 2);
}
return fibonacci(n);
});
await pool.terminate();
```
Prewarm workers to eliminate thread creation overhead for faster execution:
```typescript
import { Thread } from 'stardust-parallel-js';
import os from 'os';
// Prewarm workers at application startup
Thread.prewarm(os.cpus().length);
// Now all Thread.execute() calls will reuse prewarmed workers
// This is much faster than creating workers on-demand
const tasks = Array.from({ length: 100 }, (_, i) => i);
const results = await Promise.all(
tasks.map(async (n) => {
const thread = Thread.execute((x: number) => x * x, [n]);
return await thread.join();
})
);
console.log('All tasks completed:', results.length);
// Clean up when shutting down the application
Thread.clearPool();
```
When to use prewarming:
- Processing many small tasks with Thread.execute()
- Applications with predictable workloads
- Long-running services where startup time matters
- Reducing latency for first requests
Comparison:
```typescript
// Without prewarming: ~50ms per task (includes worker creation)
// With prewarming:     ~5ms per task (workers already ready)
```
Run benchmarks:
```bash
npm run build
npx tsx benchmarks/cpu-intensive.ts
npx tsx benchmarks/data-processing.ts
```
#### constructor(size: number, ttl?: number)
Creates a thread pool of the specified size.
- size - Number of worker threads in the pool
- ttl - (optional) Default time to live in milliseconds for tasks
```typescript
// Without TTL
const pool = new ThreadPool(4);

// With a default TTL of 10 seconds
const poolWithTtl = new ThreadPool(4, 10000);
```
#### execute
Executes a function in an available thread from the pool.
- ttl - (optional) Time to live in milliseconds. Task will be rejected if it waits in queue longer than TTL.
```typescript
const result = await pool.execute((x: number) => x * x, [5]);

// With TTL – reject if the task waits more than 3 seconds
const resultWithTtl = await pool.execute((x: number) => x * x, [5], 3000);
```
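If a task waits in the queue longer than its TTL, the promise returned by execute() is rejected. A minimal sketch of handling that rejection (the exact error type is not specified here, so it is caught generically):

```typescript
try {
  // Reject if the task waits in the queue for more than 100 ms
  await pool.execute(
    (n: number) => {
      let sum = 0;
      for (let i = 0; i < n; i++) sum += Math.sqrt(i);
      return sum;
    },
    [1_000_000_000],
    100
  );
} catch (err) {
  console.error('Task rejected (TTL exceeded) or failed:', err);
}
```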
#### map
Applies a function to each array element in parallel.
- ttl - (optional) Time to live in milliseconds for each task.
```typescript
// Without TTL
const results = await pool.map([1, 2, 3], n => n * 2);

// With TTL
const resultsWithTtl = await pool.map([1, 2, 3], n => n * 2, 5000);
```
#### getStats(): { totalWorkers: number; availableWorkers: number; busyWorkers: number; queuedTasks: number }
Returns current pool statistics.
```typescript
const stats = pool.getStats();
console.log(`Busy workers: ${stats.busyWorkers}/${stats.totalWorkers}`);
console.log(`Queued tasks: ${stats.queuedTasks}`);
```
#### terminate(): Promise&lt;void&gt;
Stops all threads and releases resources.
```typescript
await pool.terminate();
```
#### Thread.execute
Creates a new thread to execute a function once.
- ttl - (optional) Time to live in milliseconds. Thread will be terminated if execution takes longer.
```typescript
const thread = Thread.execute((x: number) => x * x, [5]);
const result = await thread.join();

// With TTL
const threadWithTtl = Thread.execute(
  (x: number) => x * x,
  [5],
  5000 // 5 seconds timeout
);
```
#### Thread.persistent
Creates a persistent thread for long-running tasks.
```typescript
const persistent = Thread.persistent(() => {
setInterval(() => {
console.log('Background task');
}, 1000);
}, []);
persistent.onError((error) => {
console.error('Thread error:', error);
});
persistent.terminate(); // Don't forget to clean up
```
#### Thread.prewarm(count?: number): void
Prewarms a pool of workers to reduce thread creation overhead.
```typescript
// Prewarm 4 workers (default)
Thread.prewarm();
// Prewarm specific number of workers
Thread.prewarm(8);
// Now Thread.execute() will reuse prewarmed workers
const thread = Thread.execute((x: number) => x * 2, [21]);
```
#### Thread.clearPool(): void
Clears the prewarmed worker pool and terminates all workers.
```typescript
Thread.clearPool();
```
#### ExecutableThread.join(): Promise
Waits for execution to complete and returns the result. Automatically terminates the thread or returns it to the prewarmed pool.
```typescript
const result = await thread.join();
```
#### PersistentThread.onError(callback: (error: Error) => void): this
Registers an error handler for persistent threads.
```typescript
persistent.onError((error) => {
  console.error('Error in persistent thread:', error);
});
```
#### PersistentThread.terminate(): void
Terminates a persistent thread.
```typescript
persistent.terminate();
```
- Functions execute in an isolated context (separate Worker Thread)
- Arguments and results must be serializable
- Closures don't work - functions have no access to external variables (see the sketch after this list)
- Regular and arrow functions are supported
- require() is available inside functions for using Node.js modules
- Best suited for CPU-intensive tasks (calculations, data processing)
- For I/O operations (reading files, network) use async/await instead of threads
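A minimal sketch of the closure and require() points above, using a hypothetical path-handling task:

```typescript
import { Thread } from 'stardust-parallel-js';

const prefix = 'basename:'; // NOT visible inside the worker function

const thread = Thread.execute((filePath: string) => {
  // require() works inside the worker function...
  const path = require('path');
  // ...but outer variables such as `prefix` are NOT accessible here,
  // because the function runs in an isolated Worker Thread
  return path.basename(filePath);
}, ['/tmp/data/report.csv']);

console.log(prefix, await thread.join()); // basename: report.csv
```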
Use stardust-parallel-js when:
- Processing large data arrays
- Performing complex calculations
- Parsing or transforming data
- Processing images/video
- Need to utilize all CPU cores
Don't use when:
- Simple operations (faster to execute sequentially)
- I/O operations (files, network, DB) - they're already asynchronous
- Working with DOM (Node.js only)
```typescript
import os from 'os';
import { ThreadPool } from 'stardust-parallel-js';

// Optimal: number of CPU cores
const pool = new ThreadPool(os.cpus().length);

// For CPU-intensive tasks: leave 1 core for the system
const cpuPool = new ThreadPool(os.cpus().length - 1);

// For mixed workloads
const mixedPool = new ThreadPool(os.cpus().length * 2);
```
| Solution | Simplicity | Performance | TypeScript | Size |
|----------|-----------|-------------|------------|------|
| stardust-parallel-js | High | High | Full | 9.3kB |
| worker_threads | Medium | High | Partial | Built-in |
| cluster | Medium | Medium | Partial | Built-in |
| child_process | Low | Low | No | Built-in |
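For context, roughly the same single-task squaring example written directly against the built-in worker_threads module (a minimal sketch for comparison, not this library's internals):

```typescript
import { Worker } from 'worker_threads';

// With raw worker_threads, the function body is shipped as source text and
// the result is wired up manually through message events.
function squareInWorker(n: number): Promise<number> {
  const source = `
    const { parentPort, workerData } = require('worker_threads');
    parentPort.postMessage(workerData * workerData);
  `;
  return new Promise((resolve, reject) => {
    const worker = new Worker(source, { eval: true, workerData: n });
    worker.once('message', resolve);
    worker.once('error', reject);
  });
}

console.log(await squareInWorker(5)); // 25
```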
- [x] Support for transferable objects for large data
- [x] Task TTL (time to live) support
- [x] Persistent threads for long-running tasks
- [x] Worker prewarming for reduced overhead
- [x] Pool statistics and monitoring
- [ ] Automatic selection of optimal pool size
- [ ] Task prioritization
- [ ] Support for async functions in threads
Found a bug or have an idea? Create an issue.
- Node.js >= 14.0.0 (with Worker Threads support)
MIT © b1411