A TypeScript library that brings Go-like concurrency patterns to Node.js, featuring goroutines, channels, and worker pools.
- Node.js >= 22.7.0
- TypeScript >= 5.0.0
- Go-style goroutines with go and goShared functions
- Shared channels for inter-thread communication
- Worker pool management
- WaitGroup for goroutine synchronization
- Select-like functionality for channel operations
```bash
npm install @eklabdev/gochan
```

or using yarn:

```bash
yarn add @eklabdev/gochan
```
```typescript
import { go, goShared, makeChan, registerChannel, initializeGoroutines } from '@eklabdev/gochan';

// Initialize the worker pool (optional, defaults to number of CPU cores)
initializeGoroutines(4);

// Simple goroutine example
const result = await go(async () => {
  return "Hello from worker thread!";
});

// Channel communication example
async function channelExample() {
  const messageChannel = makeChan<string>();
  const resultChannel = makeChan<string>();

  registerChannel('messages', messageChannel);
  registerChannel('results', resultChannel);

  // Producer goroutine
  const producer = goShared(async (sharedChan) => {
    const msgChan = sharedChan('messages');
    for (let i = 1; i <= 5; i++) {
      await msgChan.send(`Message ${i}`);
    }
    msgChan.close();
  });

  // Consumer goroutine
  const consumer = goShared(async (sharedChan) => {
    const msgChan = sharedChan('messages');
    const resChan = sharedChan('results');
    for await (const message of msgChan) {
      await resChan.send(`Processed: ${message}`);
    }
    resChan.close();
  });

  // Collect results
  const results: string[] = [];
  for await (const result of resultChannel) {
    results.push(result);
  }

  await Promise.all([producer, consumer]);
  return results;
}
```
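The producer/consumer flow above hinges on channel iteration with `for await`. Independent of gochan, that dataflow can be sketched with a minimal in-memory queue (`MiniChannel` here is a hypothetical stand-in for illustration, not the library's `SharedChannel`):

```typescript
// Minimal in-memory stand-in for a channel, to illustrate the
// send/close/iterate dataflow. Not the gochan API: gochan channels
// are backed by shared memory and cross worker-thread boundaries.
class MiniChannel<T> {
  private buffer: T[] = [];
  private waiters: ((r: IteratorResult<T>) => void)[] = [];
  private closed = false;

  send(value: T): void {
    const waiter = this.waiters.shift();
    if (waiter) waiter({ value, done: false });
    else this.buffer.push(value);
  }

  close(): void {
    this.closed = true;
    // Wake any pending receivers so their loops terminate.
    for (const w of this.waiters.splice(0)) w({ value: undefined as T, done: true });
  }

  async *[Symbol.asyncIterator](): AsyncIterator<T> {
    while (true) {
      if (this.buffer.length > 0) { yield this.buffer.shift()!; continue; }
      if (this.closed) return;
      const result = await new Promise<IteratorResult<T>>((res) => this.waiters.push(res));
      if (result.done) return;
      yield result.value;
    }
  }
}

async function demo(): Promise<string[]> {
  const chan = new MiniChannel<string>();
  const producer = (async () => {
    for (let i = 1; i <= 3; i++) chan.send(`Message ${i}`);
    chan.close();
  })();
  const results: string[] = [];
  for await (const msg of chan) results.push(`Processed: ${msg}`);
  await producer;
  return results;
}
```

The closed flag plus waking pending receivers is what lets `for await` terminate cleanly, which is why the examples above always close channels when the producer is done.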
```typescript
// Regular goroutine
const result = await go(async () => {
  // Your code here
});

// Goroutine with shared channel access
const result = await goShared(async (sharedChan) => {
  const channel = sharedChan('channel-name');
  // Your code here
});
```
```typescript
// Create a channel
const channel = makeChan<string>();

// Register channel for worker access
registerChannel('channel-name', channel);

// Send data
await channel.send('data');

// Receive data
const data = await channel.receive();

// Iterate over channel
for await (const item of channel) {
  // Process item
}

// Close channel
channel.close();
```
When working with complex objects or large data structures, you can specify a custom element size for the channel buffer:
```typescript
import { makeChan, calculateElementSize } from '@eklabdev/gochan';

// Define a complex data structure
interface UserData {
  id: number;
  name: string;
  email: string;
  metadata: {
    lastLogin: Date;
    preferences: Record<string, unknown>;
  };
}

// Calculate the size needed for each element
const elementSize = calculateElementSize<UserData>({
  id: 8,             // number (8 bytes)
  name: 100,         // string (max 100 chars)
  email: 100,        // string (max 100 chars)
  metadata: {
    lastLogin: 8,    // Date (8 bytes)
    preferences: 500 // object (max 500 bytes)
  }
});

// Create a channel with custom element size
const userChannel = makeChan<UserData>(elementSize);

// Register the channel
registerChannel('users', userChannel);

// Use the channel
await userChannel.send({
  id: 1,
  name: 'John Doe',
  email: 'john@example.com',
  metadata: {
    lastLogin: new Date(),
    preferences: { theme: 'dark', notifications: true }
  }
});
```
The calculateElementSize function helps ensure that the channel buffer has enough space for each element. This is particularly important when:
- Working with large objects
- Handling variable-sized data
- Optimizing memory usage
- Dealing with complex data structures
```typescript
const wg = new WaitGroup();

// Add goroutines to wait group
wg.add(go(async () => { /* ... */ }));
wg.add(go(async () => { /* ... */ }));

// Wait for all goroutines to complete
const results = await wg.wait();
```
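The usage above amounts to collecting promises and awaiting them together. A minimal, library-agnostic sketch of that idea (`SimpleWaitGroup` is illustrative only, not gochan's `WaitGroup`):

```typescript
// Minimal sketch of the WaitGroup idea: collect promises, await them together.
// Illustrative only; gochan ships its own WaitGroup class.
class SimpleWaitGroup<T = unknown> {
  private tasks: Promise<T>[] = [];

  add(task: Promise<T>): void {
    this.tasks.push(task);
  }

  wait(): Promise<T[]> {
    return Promise.all(this.tasks);
  }
}
```

Like `Promise.all`, this sketch rejects as soon as any task fails; a production WaitGroup might instead settle all tasks and report errors collectively.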
```typescript
// Goroutine functions
go<T>(fn: () => Promise<T>): Promise<T>
goShared<T>(fn: (sharedChan: (name: string) => SharedChannel<unknown>) => Promise<T>): Promise<T>

// Channel functions
makeChan<T>(elementSize?: number): SharedChannel<T>
registerChannel<T>(name: string, channel: SharedChannel<T>): void

// Worker management
initializeGoroutines(maxWorkers?: number): void
shutdown(): Promise<void>
```
```typescript
// Channel class
class SharedChannel<T> {
  send(data: T): Promise<void>
  receive(): Promise<T>
  close(): void
  isClosed(): boolean
  hasData(): boolean
}

// Worker pool
class WorkerPool {
  constructor(maxWorkers?: number)
  execute<T>(fn: () => Promise<T>): Promise<T>
  shutdown(): Promise<void>
}

// WaitGroup for synchronization
class WaitGroup {
  add(promise: Promise<unknown>): void
  wait(): Promise<unknown[]>
}
```
```typescript
// Calculate buffer size for complex objects
calculateElementSize<T>(sizes: object): number
```
1. Always initialize goroutines at the start of your application:
```typescript
initializeGoroutines(4); // or number of CPU cores
```
2. Register channels before using them in workers:
```typescript
const channel = makeChan<string>();
registerChannel('my-channel', channel);
```
3. Use proper error handling in goroutines:
```typescript
go(async () => {
  try {
    // Your code
  } catch (error) {
    // Handle error
  }
});
```
4. Close channels when done:
```typescript
try {
  // Use channel
} finally {
  channel.close();
}
```
5. Use WaitGroup for managing multiple goroutines:
```typescript
const wg = new WaitGroup();
wg.add(go(async () => { /* ... */ }));
await wg.wait();
```
Check the examples directory for more detailed examples:
- Basic goroutine usage
- Channel communication
- Fan-out pattern
- Pipeline processing
- Worker pool
- And more!
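To make the fan-out pattern concrete without depending on worker threads, here is a library-agnostic sketch in which N concurrent workers drain a single job list (`fanOut` is illustrative; in gochan the speedup comes from handlers running in actual worker threads):

```typescript
// Fan-out sketch: one job list, `workers` concurrent consumers draining it.
// Illustrative only; gochan would run each worker in a separate thread.
async function fanOut<T, R>(
  jobs: T[],
  workers: number,
  handle: (job: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(jobs.length);
  let next = 0; // shared cursor; safe here because JS is single-threaded

  async function worker(): Promise<void> {
    while (true) {
      const i = next++;
      if (i >= jobs.length) return;
      results[i] = await handle(jobs[i]);
    }
  }

  await Promise.all(Array.from({ length: workers }, worker));
  return results;
}
```

Writing results by job index keeps the output in input order even though workers finish out of order.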
The library includes built-in benchmarks to measure performance across different concurrency patterns. Run the benchmarks using:
```bash
npm run benchmark
```
The benchmark suite includes three main scenarios:
1. Basic Parallel Tasks
- Compares single-threaded vs goroutine execution
- Processes multiple tasks with random delays
- Typical speedup: 3-4x
2. Producer-Consumer Pattern
- Tests channel communication performance
- Multiple producers and consumers
- Typical speedup: 3-4x
3. Worker Pool Pattern
- Evaluates worker pool efficiency
- Multiple workers processing jobs in parallel
- Typical speedup: 2-3x
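The shape of these comparisons can be sketched with a small timing harness. This is illustrative only: it measures event-loop concurrency on I/O-style waits, whereas the suite's speedups come from worker threads on CPU-bound tasks:

```typescript
// Minimal timing harness mirroring the sequential-vs-concurrent comparison
// the benchmarks make. Not the benchmark suite itself.
async function timed(fn: () => Promise<void>): Promise<number> {
  const start = performance.now();
  await fn();
  return performance.now() - start;
}

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function compare(): Promise<{ sequential: number; concurrent: number }> {
  // Three 30ms tasks: run one after another vs. all at once.
  const sequential = await timed(async () => {
    for (let i = 0; i < 3; i++) await sleep(30);
  });
  const concurrent = await timed(async () => {
    await Promise.all([sleep(30), sleep(30), sleep(30)]);
  });
  return { sequential, concurrent };
}
```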
Example benchmark output:
```
=== Goroutine vs Single-threaded Benchmark ===

1. Basic Parallel Tasks:
   Single-threaded execution:
     Total time: 1123.39ms
     Average time per task: 112.34ms
   Goroutine execution:
     Total time: 305.54ms
     Average time per task: 30.55ms
   Speedup factor: 3.68x

2. Producer-Consumer Pattern:
   Single-threaded processing:
     Total time: 1137.53ms
     Average time per message: 1.14ms
   Goroutine processing:
     Total time: 344.16ms
     Average time per message: 0.34ms
   Speedup factor: 3.31x

3. Worker Pool Pattern:
   Single-threaded processing:
     Total time: 1137.22ms
     Average time per job: 1.14ms
   Goroutine processing:
     Total time: 415.16ms
     Average time per job: 0.42ms
   Speedup factor: 2.74x
```
Note: Actual performance may vary based on:
- System hardware (CPU cores, memory)
- Current system load
- Task complexity and size
- Node.js version and configuration
Contributions are welcome! Please feel free to submit a Pull Request.
MIT
- Goroutine code runs in worker threads with a preloaded script
- Only Node.js built-in globals are supported (as documented in Node.js Globals)
- Importing external modules or using `require` in worker threads is not supported