This is a library for declarative use of Web Audio API with Angular
```
npm install @ng-web-apis/audio
```


This is a library for declarative use of the Web Audio API
with Angular 7+. It is a complete conversion to declarative Angular directives; if you find any inconsistencies or
errors, please file an issue. Watch out for the 💡 emoji in this
README for additional features and special use cases.
> After you install the package, you can add @ng-web-apis/audio/polyfill to your polyfills.ts. It helps to
> normalize things like webkitAudioContext, otherwise your code might fail.
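For example, in polyfills.ts this is a single import (the same entry point used in the SSR section below):

```ts
// polyfills.ts
import '@ng-web-apis/audio/polyfill';
```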
You can build an audio graph with directives. For example, here's a typical echo feedback loop:
```html
<audio
    src="/demo.wav"
    waMediaElementAudioSourceNode
>
    <ng-container
        #feedback="AudioNode"
        waDelayNode
        [delayTime]="delayTime"
    >
        <ng-container
            waGainNode
            [gain]="gain"
        >
            <ng-container [waOutput]="feedback"></ng-container>
            <ng-container waAudioDestinationNode></ng-container>
        </ng-container>
    </ng-container>
    <ng-container waAudioDestinationNode></ng-container>
</audio>
```
This library has a WaAudioBufferService with a fetch method returning a
Promise, which allows you to
easily turn your hosted audio file into an AudioBuffer
through a GET request. The result is stored in the service's cache so the same file is not requested again while the application is
running.
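A minimal sketch of using the service directly, assuming WaAudioBufferService is exported from the package root and that fetch takes the URL string as described above:

```ts
import {Injectable} from '@angular/core';
import {WaAudioBufferService} from '@ng-web-apis/audio';

@Injectable()
export class SamplerService {
    constructor(private readonly audioBuffers: WaAudioBufferService) {}

    // Issues a GET request (cached by the service) and resolves to an AudioBuffer
    load(url: string): Promise<AudioBuffer> {
        return this.audioBuffers.fetch(url);
    }
}
```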
This service is also used within directives that have
AudioBuffer inputs (such as
AudioBufferSourceNode or
ConvolverNode), so you can pass a string URL as
well as an actual AudioBuffer. For example:
```html
<button
    #source="AudioNode"
    buffer="/demo.wav"
    waAudioBufferSourceNode
    (click)="source.start()"
>
    Play
</button>
```
You can use the following audio nodes through directives of the same name (prefixed with wa, standing for Web API):
- AudioContext
  💡 Not required if you only need one, a global context will be created when needed
  💡 Also gives you access to AudioListener parameters such as positionX
- OfflineAudioContext
  💡 Additionally supports an empty autoplay attribute, similar to the audio tag, so it would start rendering immediately
  💡 Also gives you access to AudioListener parameters such as positionX
- AudioDestinationNode
  💡 Use it to terminate a branch of your graph
  💡 Can be used multiple times inside a single BaseAudioContext, referencing the same BaseAudioContext.destination
  💡 Has a (quiet) output to watch for a particular graph branch going _almost_ silent for 5 seconds straight, so you can remove the branch after all effects have played out to silence to free up resources
- MediaStreamAudioDestinationNode
- AudioBufferSourceNode
  💡 Additionally supports setting a URL to a media file as buffer so it will be fetched and turned into an AudioBuffer
  💡 Additionally supports an empty autoplay attribute, similar to the audio tag, so it would start playing immediately
- ConstantSourceNode
  💡 Additionally supports an empty autoplay attribute, similar to the audio tag, so it would start playing immediately
- MediaElementAudioSourceNode
- MediaStreamAudioSourceNode
- OscillatorNode
  💡 Additionally supports an empty autoplay attribute, similar to the audio tag, so it would start playing immediately (see the sketch after this list)
- BiquadFilterNode
- ChannelMergerNode
  💡 Use the waChannel directive to merge channels, see example in the Special cases section
- ChannelSplitterNode
- ConvolverNode
  💡 Additionally supports setting a URL to a media file as buffer so it will be fetched and turned into an AudioBuffer
- DelayNode
- GainNode
- IIRFilterNode
- PannerNode
- ScriptProcessorNode
- StereoPannerNode
- WaveShaperNode
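For instance, a sketch of the autoplay behaviour noted above, assuming an oscillator directive named waOscillatorNode per the wa naming convention and a frequency input for its AudioParam:

```html
<!-- Starts playing a 440 Hz tone immediately thanks to the empty autoplay attribute -->
<ng-container
    waOscillatorNode
    autoplay
    frequency="440"
>
    <ng-container waAudioDestinationNode></ng-container>
</ng-container>
```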
You can use AudioWorkletNode in supporting
browsers. To register your
AudioWorkletProcessors in the global default
AudioContext, you can use tokens:
```ts
@NgModule({
    bootstrap: [AppComponent],
    declarations: [AppComponent],
    providers: [
        {
            provide: WA_AUDIO_WORKLET_PROCESSORS,
            useValue: 'assets/my-processor.js',
            multi: true,
        },
    ],
})
export class AppModule {}
```
```ts
@Component({
    selector: 'app',
    templateUrl: './app.component.html',
})
export class App {
    constructor(
        @Inject(WA_AUDIO_WORKLET_PROCESSORS_READY)
        readonly processorsReady: Promise<unknown>,
    ) {}

    // ...
}
```
You can then instantiate your AudioWorkletNode:

```html
<ng-container
    waAudioWorkletNode
    name="my-processor"
></ng-container>
```
If you need to create your own node with a custom
AudioParam and control it declaratively, you can extend the
WaWorklet class and use audioParam in setters for your new component's inputs:
```ts
@Directive({
    selector: '[my-worklet-node]',
    inputs: ['paramSetter: param'],
    exportAs: 'AudioNode',
    providers: [asAudioNode(MyWorklet)],
})
export class MyWorklet extends WaWorklet {
    set paramSetter(value: AudioParamInput) {
        audioParam(this.param, value, this.context.currentTime);
    }
}
```
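A hypothetical template usage of such a directive could then look like this (value is an illustrative component field, and the name attribute is assumed to come from the base worklet directive):

```html
<ng-container
    my-worklet-node
    name="my-processor"
    [param]="value"
></ng-container>
```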
Since work with AudioParam is imperative by nature,
there are differences from the native API when working with declarative inputs and directives.

> NOTE: You can always access directives through
> template reference variables /
> @ViewChild and, since they extend native nodes, work with
> AudioParam in traditional Web Audio fashion
AudioParam inputs for directives accept the following
arguments:

- number to set it instantly, equivalent to setting
  AudioParam.value
- AudioParamCurve to set an array of values over a given duration, equivalent to
  AudioParam.setValueCurveAtTime
  called with AudioContext.currentTime (see the sketch after this list)

  ```ts
  export type AudioParamCurve = {
      readonly value: number[];
      readonly duration: number;
  };
  ```
- AudioParamAutomation to linearly or exponentially ramp to the given value, starting from
  AudioContext.currentTime

  ```ts
  export type AudioParamAutomation = {
      readonly value: number;
      readonly duration: number;
      readonly mode: 'instant' | 'linear' | 'exponential';
  };
  ```
- AudioParamAutomation[] to schedule multiple changes in value, stacking one after another
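For example, a sketch of an AudioParamCurve swept over two seconds, assuming the type is exported from the package root; you could then bind it to a gain input with [gain]="curve":

```ts
import {Component} from '@angular/core';
import {AudioParamCurve} from '@ng-web-apis/audio';

@Component({
    selector: 'my-sweep',
    templateUrl: './sweep.template.html',
})
export class SweepComponent {
    // Ramps through these values over 2 seconds,
    // equivalent to AudioParam.setValueCurveAtTime
    readonly curve: AudioParamCurve = {
        value: [0, 1, 0.5, 0],
        duration: 2,
    };
}
```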
You can use the waAudioParam pipe to turn your number values into AudioParamAutomation (default mode is exponential,
so the last argument can be omitted) or number arrays into AudioParamCurve (the second argument, duration, is in seconds):
```html
<ng-container
    waGainNode
    gain="0"
    [gain]="gain | waAudioParam : 0.1 : 'linear'"
></ng-container>
```
This way values change smoothly rather than abruptly, which would cause audio artifacts.

> NOTE: You can set the initial value for an AudioParam
> through attribute binding combined with dynamic property binding, as seen above.
To schedule a typical ADSR audio envelope (attack, decay, sustain, release), you would need to pass the following array of AudioParamAutomation items:
```ts
const envelope: AudioParamAutomation[] = [
    {
        value: 0,
        duration: 0,
        mode: 'instant',
    },
    {
        value: 1,
        duration: ATTACK_TIME,
        mode: 'linear',
    },
    {
        value: SUS,
        duration: DECAY_TIME,
        mode: 'linear',
    },
    {
        value: SUS,
        duration: SUSTAIN_TIME,
        mode: 'instant',
    },
    {
        value: 0,
        duration: RELEASE_TIME,
        mode: 'exponential',
    },
];
```
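Binding the envelope could then look like this (a sketch, assuming envelope is a component field and the target is a gain node):

```html
<ng-container
    waGainNode
    gain="0"
    [gain]="envelope"
>
    ...
</ng-container>
```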
- Use the waOutput directive when you need a non-linear graph (see feedback loop example above) or to manually connect
  an AudioNode to another
  AudioNode or
  AudioParam
- Use the waPeriodicWave pipe to create a PeriodicWave for
  OscillatorNode
- All node directives are exported as AudioNode so you can use them with
  template reference variables (see feedback loop example above)
- Use the waChannel directive within
  ChannelMergerNode and direct the waOutput
  directive to it in order to perform channel merging:
```html
<audio
    src="/demo.wav"
    waMediaElementAudioSourceNode
>
    <ng-container waChannelMergerNode>
        <ng-container
            #left="AudioNode"
            waChannel
        ></ng-container>
        <ng-container
            #right="AudioNode"
            waChannel
        ></ng-container>
    </ng-container>
    <ng-container [waOutput]="left"></ng-container>
</audio>
```
- You can check Web Audio API support in the current
  browser by injecting the WA_WEB_AUDIO_SUPPORT token
- You can inject BaseAudioContext through the
  WA_AUDIO_CONTEXT token
  - AudioContext is created by default with default
    options when the token is requested
  - You can also provide a custom BaseAudioContext
    through that token
- Provide WA_FEEDBACK_COEFFICIENTS and WA_FEEDFORWARD_COEFFICIENTS tokens to be able to create
  IIRFilterNode (see the sketch after this list)
- Provide the WA_MEDIA_STREAM token to be able to create
  MediaStreamAudioSourceNode
- All node directives provide the underlying AudioNode as the
  WA_AUDIO_NODE token
- Use the WA_AUDIO_WORKLET_PROCESSORS token to declare an array of
  AudioWorkletProcessors to be added to the
  default AudioContext
- Inject the WA_AUDIO_WORKLET_PROCESSORS_READY token to initialize loading of the provided
  AudioWorkletProcessors and watch for the
  Promise resolution before
  instantiating dependent AudioWorkletNodes
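For example, a sketch of providing coefficients for an IIRFilterNode; the coefficient values, the import path, and the assumption that the tokens hold plain number arrays are illustrative:

```ts
import {NgModule} from '@angular/core';
import {WA_FEEDBACK_COEFFICIENTS, WA_FEEDFORWARD_COEFFICIENTS} from '@ng-web-apis/audio';

@NgModule({
    providers: [
        // Coefficients passed to the underlying IIRFilterNode
        {provide: WA_FEEDFORWARD_COEFFICIENTS, useValue: [0.1, 0.2, 0.1]},
        {provide: WA_FEEDBACK_COEFFICIENTS, useValue: [1, -1.8, 0.82]},
    ],
})
export class AppModule {}
```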
| Edge | Firefox | Chrome | Safari |
| :--: | :-----: | :----: | :----: |
| 79+  |   76+   |  66+   |  14+   |
> Note that some features (AudioWorklet etc.) were
> added later and are supported only by more recent versions
_IMPORTANT: You must add @ng-web-apis/audio/polyfill to your polyfills.ts, otherwise you will get ReferenceError: X is not defined in browsers for entities they do not support_
💡 StereoPannerNode is emulated with
PannerNode in browsers that do not support it yet
💡 positionX
(orientationX) and other similar
properties of AudioListener and
PannerNode fall back to the
setPosition
(setOrientation) method if the browser does
not support them
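For example, a sketch assuming the PannerNode directive exposes positionX as an AudioParam input:

```html
<ng-container
    waPannerNode
    [positionX]="x | waAudioParam : 0.2"
>
    ...
</ng-container>
```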
If you want to use this package with SSR, you need to mock native Web Audio API classes on the server:
```ts
import '@ng-web-apis/audio/polyfill';
```
> It is recommended to keep the import statement at the top of your `server.ts` or `main.server.ts` file.
You can try the online demo here
Other Web APIs for Angular by
@ng-web-apis