# api.video media stream composer

```sh
npm install @api.video/media-stream-composer
```

api.video is the video infrastructure for product builders. Lightning fast video APIs for integrating, scaling, and managing on-demand & low latency live streaming features in your app.
Methods:
- `addStream(mediaStream: MediaStream | HTMLImageElement, options: StreamOptions): string`
- `updateStream(streamId: string, options: StreamOptions): void`
- `removeStream(id: string): void`
- `getStreams(): StreamDetails[]`
- `getStream(id: string): StreamDetails`
- `addAudioSource(mediaStream: MediaStream): string`
- `removeAudioSource(id: string): void`
- `getAudioSources(): AudioSourceDetails[]`
- `getAudioSource(audioSourceId: string): AudioSourceDetails`
- `moveUp(streamId: string): void`
- `moveDown(streamId: string): void`
- `startRecording(options: RecordingOptions): void`
- `stopRecording(): Promise`
- `getCanvas(): HTMLCanvasElement | undefined`
- `appendCanvasTo(containerQuerySelector: string): void`
- `setMouseTool(tool: "draw" | "move-resize"): void`
- `setDrawingSettings(settings: Partial<DrawingSettings>): void`
- `clearDrawing(): void`
- `addEventListener(event: string, listener: Function)`
- `destroy()`

This library lets you easily record & upload videos to api.video from a composition of several media streams. The position and size of each stream can be set in a flexible and easy way.
This allows for example, with only a few lines of code, to create a video composed of:
"entire screen capture in the left half, window #1 capture in the right half, and the webcam in a circular shape in the bottom left of the video".
The code of a small Next.js application demonstrating the different features offered by the library is available in the examples/record.a.video folder. If you want to try it live, go here: https://record.a.video.
#### Installation method #1: requirejs
If you use requirejs you can add the library as a dependency to your project with
```sh
$ npm install --save @api.video/media-stream-composer
```
You can then use the library in your script:
```javascript
var { MediaStreamComposer } = require('@api.video/media-stream-composer');

var composer = new MediaStreamComposer({
    resolution: {
        width: 1280,
        height: 720
    }
});
```
#### Installation method #2: ES6 modules
You can add the library as a dependency to your project with
```sh
$ npm install --save @api.video/media-stream-composer
```
You can then use the library in your script:
```javascript
import { MediaStreamComposer } from '@api.video/media-stream-composer'

const composer = new MediaStreamComposer({
    resolution: {
        width: 1280,
        height: 720
    }
});
```
#### Simple include in a javascript project
Include the library in your HTML file like so:
```html
...
```

Then you can instantiate the composer using `new MediaStreamComposer()`:

```html
...
```
#### Options
The media stream composer is instantiated with an options object. At the moment, it contains only one option: `resolution`. If provided, this option must contain a `width` and a `height` property. This resolution is used to create the canvas element on which the streams are drawn, and it also sets the resolution of the video when it is uploaded.
If the `resolution` option is not provided, the canvas is created with a default resolution of 1280x720.
#### addStream(mediaStream: MediaStream | HTMLImageElement, options: StreamOptions): string
The addStream() method adds a stream to the composition. A stream can be either a MediaStream (for example, the webcam, the screen, or a window capture) or an HTMLImageElement (for example, a logo).
It takes a MediaStream | HTMLImageElement and a StreamOptions parameter.
Note regarding image origin
When you load an image onto the composition, the origin of the image must be the same as the origin of the webpage in order for the image to be displayed correctly. This means that the image must be served from the same domain, or the server hosting the image must include the appropriate CORS (Cross-Origin Resource Sharing) headers to allow the image to be drawn on the canvas from a different origin. More details here: https://developer.mozilla.org/en-US/docs/Web/HTML/CORS_enabled_image.
##### Options
| Option name | Type | Default value | Description |
| ----------: | ---- | ------------- | ----------- |
| name | string | undefined | An optional name for the stream |
| position | "contain" \| "cover" \| "fixed" | "contain" | contain: the stream is scaled to fit entirely within the canvas; cover: the stream is scaled to cover the whole canvas; fixed: the stream has a fixed position and size |
| x | number | undefined | x position of the stream (in pixels, only if position = "fixed") |
| y | number | undefined | y position of the stream (in pixels, only if position = "fixed") |
| width | number | undefined | width of the stream (in pixels, only if position = "fixed") |
| height | number | undefined | height of the stream (in pixels, only if position = "fixed") |
| draggable | boolean | false | Whether the stream can be moved by dragging it (see mouse interactions below) |
| resizable | boolean | false | Whether the stream can be resized by dragging its borders (see mouse interactions below) |
| mask | "none" \| "circle" | "none" | Whether the stream should be masked with a circle |
| mute | boolean | false | Whether the stream should be muted |
| hidden | boolean | false | Whether the stream should be hidden |
| opacity | number | 100 | Opacity of the stream (from 0 to 100) |
| onClick | (streamId: string, event: { x: number; y: number; }) => void | undefined | A callback function called when the stream is clicked |
Example (screen capture)
```javascript
navigator.mediaDevices.getDisplayMedia({ video: true, audio: false }).then((stream) => {
    const streamId = composer.addStream(stream, {
        position: "fixed",
        x: 100,
        y: 100,
        width: 300,
        draggable: true,
        resizable: true,
        mask: "circle",
        opacity: 90,
        onClick: (streamId, event) => {
            console.log(streamId, event.x, event.y);
        }
    });
    // ...
});
```
Example (image)
```javascript
const image = new Image();
image.crossOrigin = 'anonymous';
image.src = "./my-logo.jpg";

const streamId = composer.addStream(image, {
    position: "fixed",
    x: 100,
    y: 100,
    width: 300,
    draggable: true,
    resizable: true,
    mask: "none"
});
```
#### updateStream(streamId: string, options: StreamOptions): void
Update the options of an existing stream. It takes the id of the stream (the one returned by the addStream() method) and a StreamOptions parameter (same as for the addStream() method).
Example
```javascript
composer.updateStream(streamId, {
    hidden: true,
});
```
#### removeStream(id: string): void
Remove a stream from the composition. The id is the same as the one returned by the addStream() method.
Example
```javascript
composer.removeStream(streamId);
```
#### getStreams(): StreamDetails[]
Returns an array of objects containing the details of all the streams in the composition.
Example
```javascript
const streams = composer.getStreams();
/*
streams: [{
    "id": "0",
    "options": {
        "position": "fixed",
        "height": 192,
        "x": 0,
        "y": 0,
        "resizable": true,
        "draggable": true,
        "mask": "circle",
        "name": "#0 webcam"
    },
    "displaySettings": {
        "radius": 96,
        "displayResolution": {
            "width": 192,
            "height": 192
        },
        "position": {
            "x": 0,
            "y": 0
        },
        "streamResolution": {
            "width": 640,
            "height": 480
        },
        "index": 1
    },
    "stream": {}
}]
*/
```
#### getStream(id: string): StreamDetails
Get the details of a stream. It takes the id of the stream. The id is the same as the one returned by the addStream() method.
Example
```javascript
const stream = composer.getStream(streamId);
```
#### addAudioSource(mediaStream: MediaStream): string
The addAudioSource() method adds a stream as an audio source to the composition. It takes a MediaStream parameter. The display won't be impacted by the stream. Only the audio will be mixed.
Example
```javascript
// deviceId selects a microphone, so getUserMedia() is used here
// (the original snippet was missing a closing brace)
navigator.mediaDevices.getUserMedia({ audio: { deviceId: selectedAudioSource } }).then((stream) => {
    const audioSourceId = composer.addAudioSource(stream);
});
```
#### removeAudioSource(id: string): void
Remove an audio source from the composition. The id is the same as the one returned by the addAudioSource() method.
Example
```javascript
composer.removeAudioSource(audioSourceId);
```
#### getAudioSources(): AudioSourceDetails[]
Returns an array of objects containing the details of all the audio sources in the composition.
Example
```javascript
const audioSources = composer.getAudioSources();
/*
audioSources: [{
    "id": "audio_0",
    "stream": {}
}]
*/
```
#### getAudioSource(audioSourceId: string): AudioSourceDetails
Get the details of an audio source. It takes the id of the audio source. The id is the same as the one returned by the addAudioSource() method.
Example
```javascript
const audioSource = composer.getAudioSource(audioSourceId);
```
#### moveUp(streamId: string): void
Move a stream up in the composition (ie. move it above the stream that was above it). The id is the same as the one returned by the addStream() method.
Example
```javascript
composer.moveUp(streamId);
```
#### moveDown(streamId: string): void
Move a stream down in the composition (ie. move it below the stream that was below it). The id is the same as the one returned by the addStream() method.
Example
```javascript
composer.moveDown(streamId);
```
#### startRecording(options: RecordingOptions): void
Start recording the composition & upload it to your api.video account. It takes a RecordingOptions parameter.
##### Options
The options to provide depend on how you want to authenticate to the api.video API: either using a delegated upload token (recommended), or using a regular access token.
###### Using a delegated upload token (recommended):
Using delegated upload tokens for authentication is the best option when uploading from the client side. To know more about delegated upload tokens, read the dedicated article on api.video's blog: Delegated Uploads.
| Option name | Mandatory | Type | Description |
| ----------------------------: | --------- | ------ | ----------------------- |
| uploadToken | yes | string | your upload token |
| videoId | no | string | id of an existing video |
| _common options (see below)_ | | | |
###### Using an access token (discouraged):
Warning: be aware that exposing your access token client-side can lead to huge security issues. Use this method only if you know what you're doing :).
| Option name | Mandatory | Type | Description |
| ----------------------------: | --------- | ------ | ----------------------- |
| accessToken | yes | string | your access token |
| videoId | yes | string | id of an existing video |
| _common options (see below)_ | | | |
###### Common options
| Option name | Mandatory | Type | Description |
| ----------: | ------------------ | ------ | ------------------------------------------------------------------- |
| videoName | no | string | the name of your recorded video (overrides the default "file" name) |
| apiHost | no | string | api.video host (default: ws.api.video) |
| retries | no | number | number of retries when an API call fails (default: 5) |
| timeslice | no | number | The number of milliseconds to record into each Blob (default: 5000) |
Example
```javascript
composer.startRecording({
    uploadToken: "YOUR_DELEGATED_TOKEN",
    retries: 10,
});
```
#### stopRecording(): Promise
The stopRecording() method stops the recording of the composition. It takes no parameter. It returns a Promise that resolves with the newly created video.
Example
```javascript
composer.stopRecording().then(e => console.log(`player url: ${e.assets.player}`));
```
#### getCanvas(): HTMLCanvasElement | undefined
Returns the canvas used to draw the composition. It takes no parameter.
Example
```javascript
const canvas = composer.getCanvas();
```
#### appendCanvasTo(containerQuerySelector: string): void
Append the canvas used to draw the composition to an HTML container. It takes a string containing the query selector of the container.
That's useful if you want to display the composition in a web page.
Additionally, it's mandatory if you want to use mouse-based features like dragging, resizing, and drawing.
Example
```html
<!-- illustrative container element; use any selector matching your markup -->
<div id="canvas-container"></div>

<script>
    composer.appendCanvasTo("#canvas-container");
</script>
```
#### setMouseTool(tool: "draw" | "move-resize"): void
Define the kind of action performed when the user interacts with the canvas using the mouse. It takes a string containing the name of the tool.
Tools:
- move-resize: move and resize a stream
- draw: draw on a stream (drawing settings can be defined using the setDrawingSettings() method)
Example
```javascript
composer.setMouseTool("draw");
```
#### setDrawingSettings(settings: Partial<DrawingSettings>): void
Set the drawing settings for the draw tool. It takes a DrawingSettings parameter that contains the following attributes:
- color: the color of the drawing
- lineWidth: the width of the drawing
- autoEraseDelay: the delay before the drawing is erased (in seconds - 0 means no delay)
Example
```javascript
composer.setDrawingSettings({
    color: "#FF0000",
    lineWidth: 5,
    autoEraseDelay: 3,
});
```
#### clearDrawing(): void
Clear all the drawings on the canvas. It takes no parameter.
Example
```javascript
composer.clearDrawing();
```
#### addEventListener(event: string, listener: Function)
Define an event listener for the media recorder. The following events are available:
- "error": when an error occurs
- "recordingStopped": when the recording is stopped
Example
```javascript
composer.addEventListener("error", (event) => {
    console.log(event.data);
});
```
#### destroy()
Destroys all streams and releases all resources in use.
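As a minimal sketch of when to call it, here is a hypothetical `teardown` helper (this helper is not part of the library API; it assumes `composer` is an existing MediaStreamComposer instance):

```javascript
// Hypothetical cleanup helper: stop any in-progress recording,
// then release all streams and canvas resources.
function teardown(composer) {
    return Promise.resolve()
        .then(() => composer.stopRecording()) // resolves with the created video
        .catch(() => null)                    // ignore errors if nothing was recording
        .finally(() => composer.destroy());   // always release resources
}
```

Call it when the user leaves the page or your component unmounts.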
#### Full example
This sample shows how to use the composer to create a screencast with a webcam stream in the corner.
It has the following features:
- the screencast stream is added in the background and its size is automatically adjusted to the size of the composition (contain dimensions)
- the webcam stream is added in front of the screencast stream, in the bottom-left corner, and it's displayed in a circle - the stream is draggable and resizable
- switching between the "drawing" and "moving/resizing" tools is possible using radio buttons
```html
...
```
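The composition logic described above could be wired up roughly as follows. This is a sketch, not the full example from the repository: the canvas size, overlay size, `webcamPlacement` helper, `start()` function, and `#container` element id are all illustrative assumptions.

```javascript
const WIDTH = 1280, HEIGHT = 720;

// Pure helper (an assumption, not a library API): compute the fixed-position
// options for a square overlay of `size` px in the bottom-left corner.
function webcamPlacement(size, margin) {
    return {
        position: "fixed",
        x: margin,
        y: HEIGHT - size - margin, // measured from the top-left of the canvas
        width: size,
        height: size,
    };
}

// Browser-only part: must be triggered by a user gesture (e.g. a click),
// since getDisplayMedia() requires one.
async function start() {
    const { MediaStreamComposer } = await import("@api.video/media-stream-composer");
    const composer = new MediaStreamComposer({ resolution: { width: WIDTH, height: HEIGHT } });

    // screencast in the background, scaled to fit the canvas
    const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });
    composer.addStream(screen, { position: "contain", mute: true });

    // circular, draggable & resizable webcam in the bottom-left corner
    const webcam = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
    composer.addStream(webcam, {
        ...webcamPlacement(200, 20),
        mask: "circle",
        draggable: true,
        resizable: true,
    });

    composer.appendCanvasTo("#container"); // assumed element id
}
```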