# @ferrite-audio/worklet

AudioWorklet integration for the Ferrite Audio noise reduction engine. Provides ultra-low-latency, real-time audio processing directly on the audio rendering thread using WebAssembly.

## Features
- 🎯 Ultra-Low Latency - Runs in dedicated audio thread with minimal overhead
- 🚀 Real-Time Processing - Sample-accurate processing at audio rate
- 🔧 Easy Integration - Simple API that works like native Web Audio nodes
- 📊 Performance Monitoring - Built-in statistics and performance metrics
- 🎛️ Dynamic Configuration - Update parameters without audio interruption
- 🔌 Seamless Connection - Works with any Web Audio API graph
## Installation

```bash
npm install @ferrite-audio/worklet @ferrite-audio/core
```

Or with yarn:

```bash
yarn add @ferrite-audio/worklet @ferrite-audio/core
```
## Quick Start

```javascript
import { createFerriteWorklet } from '@ferrite-audio/worklet';

// Create audio context
const audioContext = new AudioContext();

// Create and initialize worklet
const worklet = await createFerriteWorklet(audioContext, {
  enableGate: true,
  enableSpectral: true,
  reductionAmount: 0.6
});

// Connect to audio graph
microphone.connect(worklet.getNode());
worklet.connect(audioContext.destination);
```
## API Reference

### FerriteWorklet

Main class for managing AudioWorklet-based processing.

#### Constructor

```typescript
new FerriteWorklet(context: AudioContext, config?: WorkletConfig)
```

`WorkletConfig`:

```typescript
interface WorkletConfig {
  // Processing parameters
  sampleRate?: number;      // Audio sample rate
  enableGate?: boolean;     // Enable noise gate
  gateThreshold?: number;   // Gate threshold in dB (-60 to 0)
  enableSpectral?: boolean; // Enable spectral subtraction
  reductionAmount?: number; // Noise reduction (0.0 to 1.0)
  wienerFilter?: boolean;   // Enable Wiener filtering

  // Worklet configuration
  processorUrl?: string;    // Custom processor script URL
  wasmUrl?: string;         // Custom WASM file URL
  wasmBytes?: ArrayBuffer;  // Pre-loaded WASM bytes
  bufferSize?: number;      // Internal buffer size
}
```
#### Methods
##### initialize(): Promise&lt;void&gt;

Initialize the AudioWorklet and load the WASM module.

```javascript
const worklet = new FerriteWorklet(audioContext);
await worklet.initialize();
```
##### connect(destination: AudioNode | AudioParam): AudioNode

Connect to an audio node or parameter.

```javascript
worklet.connect(audioContext.destination);
worklet.connect(gainNode);
worklet.connect(analyser);
```
##### disconnect(destination?: AudioNode | AudioParam): void

Disconnect from audio nodes.

```javascript
worklet.disconnect();         // Disconnect all
worklet.disconnect(gainNode); // Disconnect specific
```
##### updateConfig(config: Partial&lt;WorkletConfig&gt;): Promise&lt;void&gt;

Update processor configuration dynamically.

```javascript
await worklet.updateConfig({
  reductionAmount: 0.8,
  enableGate: true
});
```
##### learnNoise(samples: Float32Array): Promise&lt;void&gt;

Learn a noise profile from audio samples.

```javascript
// Capture 1 second of noise
const noiseSamples = await captureNoise(1000);
await worklet.learnNoise(noiseSamples);
```
##### reset(): Promise&lt;void&gt;

Reset processor state and statistics.

```javascript
await worklet.reset();
```
##### getStats(): Promise&lt;WorkletStats&gt;

Get processing statistics.

```javascript
const stats = await worklet.getStats();
console.log(`Latency: ${stats.latency}ms`);
console.log(`CPU: ${stats.cpuTime}ms`);
```
##### dispose(): void

Clean up resources.

```javascript
worklet.dispose();
```
##### getNode(): AudioWorkletNode | null

Get the underlying AudioWorkletNode.

```javascript
const node = worklet.getNode();
node.parameters.get('gain').value = 0.5;
```
##### isInitialized(): boolean

Check initialization status.

```javascript
if (worklet.isInitialized()) {
  // Ready to process
}
```
### Functions

#### createFerriteWorklet(context: AudioContext, config?: WorkletConfig): Promise&lt;FerriteWorklet&gt;

Create and initialize a worklet in one call.

```javascript
const worklet = await createFerriteWorklet(audioContext, {
  enableSpectral: true,
  reductionAmount: 0.7
});
```
## Examples

### Microphone Noise Reduction

```javascript
import { createFerriteWorklet } from '@ferrite-audio/worklet';

async function setupNoiseReduction() {
  const audioContext = new AudioContext();

  // Get user microphone
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: true
  });
  const source = audioContext.createMediaStreamSource(stream);

  // Create Ferrite worklet
  const worklet = await createFerriteWorklet(audioContext, {
    enableGate: true,
    gateThreshold: -40,
    enableSpectral: true,
    reductionAmount: 0.6
  });

  // Connect audio graph
  source.connect(worklet.getNode());
  worklet.connect(audioContext.destination);
}
```
### Adaptive Noise Learning

```javascript
import { FerriteWorklet } from '@ferrite-audio/worklet';

class AdaptiveNoiseReducer {
  constructor(context) {
    this.context = context;
    this.worklet = new FerriteWorklet(context);
    this.isLearning = false;
    this.noiseBuffer = [];
  }

  async initialize() {
    await this.worklet.initialize();
  }

  startLearning() {
    this.isLearning = true;
    this.noiseBuffer = [];
  }

  async stopLearning() {
    this.isLearning = false;

    // Combine noise samples
    const totalLength = this.noiseBuffer.reduce(
      (sum, buf) => sum + buf.length, 0
    );
    const combined = new Float32Array(totalLength);
    let offset = 0;
    for (const buffer of this.noiseBuffer) {
      combined.set(buffer, offset);
      offset += buffer.length;
    }

    // Learn noise profile
    await this.worklet.learnNoise(combined);
  }

  process(inputBuffer) {
    if (this.isLearning) {
      this.noiseBuffer.push(new Float32Array(inputBuffer));
    }
    // Processing happens automatically in the worklet
  }
}
```
### WebRTC Voice Calls

```javascript
import { createFerriteWorklet } from '@ferrite-audio/worklet';

async function setupVoiceCall(peerConnection) {
  const audioContext = new AudioContext({
    latencyHint: 'interactive',
    sampleRate: 48000
  });

  // Create worklet with voice-optimized settings
  const worklet = await createFerriteWorklet(audioContext, {
    enableGate: true,
    gateThreshold: -38,
    enableSpectral: true,
    reductionAmount: 0.7,
    wienerFilter: false // Preserve voice clarity
  });

  // Get user media
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: {
      echoCancellation: true,
      noiseSuppression: false, // Use Ferrite instead
      autoGainControl: true,
      sampleRate: 48000
    }
  });

  // Process audio
  const source = audioContext.createMediaStreamSource(stream);
  const destination = audioContext.createMediaStreamDestination();
  source.connect(worklet.getNode());
  worklet.connect(destination);

  // Add to peer connection
  const processedTrack = destination.stream.getAudioTracks()[0];
  peerConnection.addTrack(processedTrack, destination.stream);

  return { worklet, audioContext };
}
```
### Performance Monitoring

```javascript
import { createFerriteWorklet } from '@ferrite-audio/worklet';

class AudioProcessor {
  constructor() {
    this.worklet = null;
    this.statsInterval = null;
  }

  async initialize(context) {
    this.worklet = await createFerriteWorklet(context, {
      enableSpectral: true,
      reductionAmount: 0.5
    });

    // Start monitoring
    this.startMonitoring();
  }

  startMonitoring() {
    this.statsInterval = setInterval(async () => {
      const stats = await this.worklet.getStats();

      // Update UI
      this.updateDisplay({
        latency: stats.latency.toFixed(2),
        cpu: stats.cpuTime.toFixed(2),
        processed: stats.processed,
        dropped: stats.dropped,
        dropRate: (stats.dropped / stats.processed * 100).toFixed(2)
      });

      // Alert if performance degrades
      if (stats.latency > 20) {
        console.warn('High latency detected:', stats.latency);
      }
      if (stats.dropped > 0) {
        console.warn('Dropped frames:', stats.dropped);
      }
    }, 1000);
  }

  updateDisplay(stats) {
    document.getElementById('latency').textContent = `${stats.latency}ms`;
    document.getElementById('cpu').textContent = `${stats.cpu}ms`;
    document.getElementById('drop-rate').textContent = `${stats.dropRate}%`;
  }

  dispose() {
    if (this.statsInterval) {
      clearInterval(this.statsInterval);
    }
    if (this.worklet) {
      this.worklet.dispose();
    }
  }
}
```
### Dynamic Configuration UI

```javascript
import { createFerriteWorklet } from '@ferrite-audio/worklet';

async function createAdaptiveProcessor(context) {
  const worklet = await createFerriteWorklet(context, {
    enableGate: true,
    enableSpectral: true,
    reductionAmount: 0.5
  });

  // Create UI controls
  const controls = {
    gate: document.getElementById('gate-enable'),
    threshold: document.getElementById('gate-threshold'),
    spectral: document.getElementById('spectral-enable'),
    reduction: document.getElementById('reduction-amount'),
    wiener: document.getElementById('wiener-filter')
  };

  // Update configuration on change
  const updateConfig = async () => {
    await worklet.updateConfig({
      enableGate: controls.gate.checked,
      gateThreshold: parseFloat(controls.threshold.value),
      enableSpectral: controls.spectral.checked,
      reductionAmount: parseFloat(controls.reduction.value) / 100,
      wienerFilter: controls.wiener.checked
    });
  };

  // Attach listeners
  Object.values(controls).forEach(control => {
    control.addEventListener('change', updateConfig);
    control.addEventListener('input', updateConfig);
  });

  return worklet;
}
```
### Multi-Channel Processing

```javascript
import { createFerriteWorklet } from '@ferrite-audio/worklet';

async function setupMultiChannel(context, source) {
  // Create splitter and merger
  const splitter = context.createChannelSplitter(2);
  const merger = context.createChannelMerger(2);

  // Create a worklet for each channel
  const leftWorklet = await createFerriteWorklet(context, {
    enableSpectral: true,
    reductionAmount: 0.6
  });
  const rightWorklet = await createFerriteWorklet(context, {
    enableSpectral: true,
    reductionAmount: 0.6
  });

  // Connect graph
  source.connect(splitter);
  splitter.connect(leftWorklet.getNode(), 0);
  splitter.connect(rightWorklet.getNode(), 1);
  leftWorklet.connect(merger, 0, 0);
  rightWorklet.connect(merger, 0, 1);
  merger.connect(context.destination);

  return { leftWorklet, rightWorklet };
}
```
### Recording with Noise Reduction

```javascript
import { createFerriteWorklet } from '@ferrite-audio/worklet';

class AudioRecorder {
  constructor(context) {
    this.context = context;
    this.worklet = null;
    this.mediaRecorder = null;
    this.chunks = [];
  }

  async initialize() {
    // Create worklet
    this.worklet = await createFerriteWorklet(this.context, {
      enableGate: true,
      gateThreshold: -45,
      enableSpectral: true,
      reductionAmount: 0.4
    });

    // Get user media
    const stream = await navigator.mediaDevices.getUserMedia({
      audio: true
    });
    const source = this.context.createMediaStreamSource(stream);

    // Create destination for recording
    const destination = this.context.createMediaStreamDestination();

    // Connect with noise reduction
    source.connect(this.worklet.getNode());
    this.worklet.connect(destination);

    // Set up media recorder
    this.mediaRecorder = new MediaRecorder(destination.stream);
    this.mediaRecorder.ondataavailable = (event) => {
      this.chunks.push(event.data);
    };
    this.mediaRecorder.onstop = () => {
      const blob = new Blob(this.chunks, { type: 'audio/webm' });
      this.onRecordingComplete(blob);
    };
  }

  start() {
    this.chunks = [];
    this.mediaRecorder.start();
  }

  stop() {
    this.mediaRecorder.stop();
  }

  onRecordingComplete(blob) {
    // Override this method
    const url = URL.createObjectURL(blob);
    console.log('Recording complete:', url);
  }

  dispose() {
    if (this.worklet) {
      this.worklet.dispose();
    }
  }
}
```
## Advanced Usage

### Custom Asset URLs

```javascript
// Host processor script on a CDN
const worklet = await createFerriteWorklet(audioContext, {
  processorUrl: 'https://cdn.example.com/ferrite-processor.js',
  wasmUrl: 'https://cdn.example.com/ferrite.wasm'
});
```
### Pre-loading WASM

```javascript
// Pre-load WASM for faster initialization
const wasmResponse = await fetch('/ferrite.wasm');
const wasmBytes = await wasmResponse.arrayBuffer();

const worklet = await createFerriteWorklet(audioContext, {
  wasmBytes, // Use pre-loaded bytes
  enableSpectral: true
});
```
### Error Handling

```javascript
try {
  const worklet = await createFerriteWorklet(audioContext, config);
} catch (error) {
  if (error.message.includes('AudioWorklet')) {
    console.error('AudioWorklet not supported');
    // Fall back to ScriptProcessor
  } else if (error.message.includes('WASM')) {
    console.error('WebAssembly loading failed');
    // Fall back to JavaScript implementation
  }
}
```
### Resource Cleanup

```javascript
class AudioSession {
  constructor() {
    this.worklets = [];
  }

  async createWorklet(context, config) {
    const worklet = await createFerriteWorklet(context, config);
    this.worklets.push(worklet);
    return worklet;
  }

  dispose() {
    // Clean up all worklets
    for (const worklet of this.worklets) {
      worklet.dispose();
    }
    this.worklets = [];
  }
}

// Use in application
const session = new AudioSession();
const worklet = await session.createWorklet(context, config);

// Clean up when done
window.addEventListener('beforeunload', () => {
  session.dispose();
});
```
### Buffer Size Tuning

```javascript
// Optimize for different scenarios
const configs = {
  ultraLowLatency: {
    bufferSize: 128, // 2.7ms @ 48kHz
    processorUrl: './processor-optimized.js'
  },
  balanced: {
    bufferSize: 256, // 5.3ms @ 48kHz
    processorUrl: './processor-balanced.js'
  },
  quality: {
    bufferSize: 512, // 10.7ms @ 48kHz
    processorUrl: './processor-quality.js'
  }
};

const worklet = await createFerriteWorklet(
  audioContext,
  configs.balanced
);
```
### CPU Monitoring

```javascript
async function monitorCPU(worklet) {
  const stats = await worklet.getStats();
  const cpuPercent = (stats.cpuTime / (1000 / 60)) * 100; // 60fps frame budget baseline

  if (cpuPercent > 50) {
    // Reduce processing load
    await worklet.updateConfig({
      enableSpectral: false,
      reductionAmount: 0.3
    });
  }
}
```
## Browser Support

| Browser | Minimum Version | Notes |
|---------|----------------|-------|
| Chrome | 66+ | Full AudioWorklet support |
| Firefox | 76+ | Full AudioWorklet support |
| Safari | 14.1+ | AudioWorklet with limitations |
| Edge | 79+ | Chromium-based versions |
| Opera | 53+ | Full AudioWorklet support |
### Feature Detection

```javascript
function isAudioWorkletSupported() {
  return typeof AudioWorkletNode !== 'undefined';
}

if (isAudioWorkletSupported()) {
  // Use AudioWorklet
  const worklet = await createFerriteWorklet(context);
} else {
  // Fall back to ScriptProcessor
  console.warn('AudioWorklet not supported, using fallback');
}
```
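The fallback path mentioned above can be sketched with a deprecated but widely supported `ScriptProcessorNode`. The per-sample gate below is a simplified stand-in for Ferrite's processing, shown only to illustrate the wiring; `applyGate` and `createFallbackNode` are illustrative helpers, not part of the package.

```javascript
// Simplified noise gate: zero out samples below the threshold.
function applyGate(samples, thresholdDb) {
  const threshold = Math.pow(10, thresholdDb / 20); // dB -> linear amplitude
  const out = new Float32Array(samples.length);
  for (let i = 0; i < samples.length; i++) {
    out[i] = Math.abs(samples[i]) >= threshold ? samples[i] : 0;
  }
  return out;
}

// Wire the gate into a ScriptProcessorNode as a stand-in for the worklet.
function createFallbackNode(context, thresholdDb = -40) {
  const node = context.createScriptProcessor(4096, 1, 1);
  node.onaudioprocess = (event) => {
    const input = event.inputBuffer.getChannelData(0);
    event.outputBuffer.getChannelData(0).set(applyGate(input, thresholdDb));
  };
  return node;
}
```

Note that ScriptProcessor runs on the main thread, so expect higher latency and occasional glitches compared to the AudioWorklet path.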
## TypeScript Support

Full TypeScript definitions are included:

```typescript
import {
  FerriteWorklet,
  WorkletConfig,
  WorkletStats
} from '@ferrite-audio/worklet';

const config: WorkletConfig = {
  sampleRate: 48000,
  enableGate: true,
  gateThreshold: -40,
  enableSpectral: true,
  reductionAmount: 0.6
};

const worklet: FerriteWorklet = new FerriteWorklet(context, config);
const stats: WorkletStats = await worklet.getStats();
```
## Troubleshooting

**AudioWorklet not available:**

```javascript
// Check for HTTPS
if (location.protocol !== 'https:' && location.hostname !== 'localhost') {
  console.error('AudioWorklet requires HTTPS');
}
```

**WASM loading fails:**

```javascript
// Check CORS headers
const response = await fetch(wasmUrl);
if (!response.headers.get('Content-Type')?.includes('wasm')) {
  console.error('Incorrect MIME type for WASM');
}
```

**High latency:**

```javascript
// Reduce buffer size
await worklet.updateConfig({ bufferSize: 128 });

// Disable expensive processing
await worklet.updateConfig({
  enableSpectral: false,
  wienerFilter: false
});
```
## Performance

| Configuration | Latency | CPU Usage | Quality |
|--------------|---------|-----------|---------|
| Minimal (gate only) | 2.7ms | 1-2% | Good |
| Balanced | 5.3ms | 3-5% | Better |
| Maximum | 10.7ms | 5-8% | Best |

*Measured on Intel i7-10700K @ 3.8GHz, Chrome 120*
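The tiers in the table map roughly onto the buffer sizes from the tuning examples (128/256/512 samples at 48kHz). A hypothetical helper (`pickConfigForLatency` is not part of the package) might choose a tier from a latency budget:

```javascript
// Map a latency budget in milliseconds to a config tier from the table.
// Buffer sizes and feature sets are illustrative assumptions.
function pickConfigForLatency(budgetMs) {
  if (budgetMs < 5) {
    // ~2.7ms: minimal, gate only
    return { bufferSize: 128, enableGate: true, enableSpectral: false };
  }
  if (budgetMs < 10) {
    // ~5.3ms: balanced
    return { bufferSize: 256, enableGate: true, enableSpectral: true };
  }
  // ~10.7ms: maximum quality
  return { bufferSize: 512, enableGate: true, enableSpectral: true, wienerFilter: true };
}
```

The returned object could then be passed straight to `createFerriteWorklet`.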
## Best Practices

1. **Initialize early** - Create the worklet during app startup
2. **Reuse instances** - Don't create new worklets unnecessarily
3. **Monitor performance** - Check stats regularly
4. **Handle errors** - Implement fallbacks for unsupported browsers
5. **Clean up** - Always dispose of worklets when done
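The "reuse instances" and "clean up" practices can be combined in a small cache that keeps one worklet per AudioContext. `WorkletCache` is an illustrative sketch, not part of the package; the factory is injected so it could be `createFerriteWorklet`.

```javascript
// Illustrative cache: one worklet per AudioContext, created via an
// injected factory (e.g. createFerriteWorklet).
class WorkletCache {
  constructor(factory) {
    this.factory = factory;  // called as factory(context, config)
    this.cache = new Map();  // context -> Promise resolving to a worklet
  }

  get(context, config) {
    // Reuse the in-flight or completed creation for this context.
    if (!this.cache.has(context)) {
      this.cache.set(context, Promise.resolve(this.factory(context, config)));
    }
    return this.cache.get(context);
  }

  async dispose() {
    // Dispose every cached worklet, then forget them all.
    for (const pending of this.cache.values()) {
      const worklet = await pending;
      if (typeof worklet.dispose === 'function') worklet.dispose();
    }
    this.cache.clear();
  }
}
```

Because the cache stores the creation promise rather than the worklet itself, two concurrent `get` calls for the same context never trigger a duplicate creation.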
## Contributing

See our Contributing Guide for details.

## License

MIT © Ferrite Audio

## Support

- 📧 Email: support@ferrite.audio
- 🐛 Issues: GitHub Issues
- 💬 Discord: Join our community

## Related Packages

- @ferrite-audio/core - Core WASM processing engine
- @ferrite-audio/web-utils - Web Audio utilities
- @ferrite-audio/webrtc - WebRTC integration