SDK for Quibble AI
To install the QuibbleAI SDK, run the following command:

```bash
npm install quibbleai
```
Import the necessary modules from the SDK using the following syntax:

```javascript
import { Quib, Transcriber, useVad } from "quibbleai";
```
For a Next.js project, add the required plugins to your next.config.js:
```javascript
/** @type {import('next').NextConfig} */
const CopyPlugin = require("copy-webpack-plugin");

const wasmPaths = [
  "./node_modules/onnxruntime-web/dist/*.wasm",
  "./node_modules/@ricky0123/vad-web/dist/silero_vad.onnx",
  "./node_modules/@ricky0123/vad-web/dist/vad.worklet.bundle.min.js",
];

const nextConfig = {
  webpack(config) {
    config.module.rules.push({
      test: /\.svg$/,
      use: ["@svgr/webpack"],
    });
    config.resolve.alias = {
      ...config.resolve.alias,
      sharp$: false,
      "onnxruntime-node$": false,
    };
    config.plugins.push(
      new CopyPlugin({
        patterns: wasmPaths.map((p) => ({
          from: p,
          to: "static/chunks/app",
        })),
      })
    );
    // for Vercel
    config.plugins.push(
      new CopyPlugin({
        patterns: wasmPaths.map((p) => ({
          from: p,
          to: "static/chunks",
        })),
      })
    );
    return config;
  },
  reactStrictMode: false,
  async headers() {
    return [
      {
        source: "/_next/(.*)",
        headers: [
          {
            key: "Cross-Origin-Opener-Policy",
            value: "same-origin",
          },
          {
            key: "Cross-Origin-Embedder-Policy",
            value: "require-corp",
          },
        ],
      },
    ];
  },
};

module.exports = nextConfig;
```
For a Vite React app, add the required plugins to your vite.config.js:
```javascript
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react-swc";
import { viteStaticCopy } from "vite-plugin-static-copy";

export default defineConfig({
  plugins: [
    react(),
    viteStaticCopy({
      targets: [
        {
          src: "node_modules/@ricky0123/vad-web/dist/vad.worklet.bundle.min.js",
          dest: "./",
        },
        {
          src: "node_modules/@ricky0123/vad-web/dist/silero_vad.onnx",
          dest: "./",
        },
        {
          src: "node_modules/onnxruntime-web/dist/*.wasm",
          dest: "./",
        },
      ],
    }),
  ],
});
```
The Quib class is a middleware that connects to the QuibbleAI websocket to handle various events such as user input, media output, end call, clear, and more.
```javascript
import { Quib } from "quibbleai";
```
To create a Quib object, provide the following parameters:
- protocol: Accepted values are ws for development and wss for production.
- host: The host URL provided by QuibbleAI.
- uid: A unique string value provided by the user for each connection.
```javascript
const quib = new Quib({
  protocol: "wss",
  host: process.env.HOST,
  uid: "unique_connection_ID",
});
```
- connected: Triggered when a connection to QuibbleAI's websocket is established.
```javascript
quib.on("connected", () => {
  console.log("Connection to websocket established");
});
```
- media: Provides mp3 buffer data for audio playback. This event may occur multiple times for a single audio file.
```javascript
quib.on("media", (mediaPayload) => {
  console.log(mediaPayload?.media);
});
```
- mark: Indicates that all media events for the current interaction have been sent.
```javascript
quib.on("mark", () => {
  console.log("Audio data for the current interaction completely received");
});
```
- clear: Indicates that the server has detected an interruption and wants you to clear the currently playing media up to the mark event.
```javascript
quib.on("clear", () => {
  console.log("Clear the current playing audio up to the mark event");
});
```
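Because media events can arrive several times for one interaction, a common pattern is to buffer the incoming chunks and flush the buffer when a clear event arrives. The sketch below shows one way to wire the media, mark, and clear events together; AudioQueue is a hypothetical helper written for this example, not part of the quibbleai SDK.

```javascript
// Minimal sketch of a playback buffer for the media/mark/clear events.
// AudioQueue is a hypothetical helper, not part of the quibbleai SDK.
class AudioQueue {
  constructor() {
    this.chunks = [];      // buffered mp3 chunks for the current interaction
    this.complete = false; // set once the mark event arrives
  }
  push(chunk) {
    this.chunks.push(chunk);
  }
  mark() {
    this.complete = true;
  }
  clear() {
    // drop everything buffered up to the mark event
    this.chunks = [];
    this.complete = false;
  }
}

const queue = new AudioQueue();
// Wiring it to a Quib instance would look like:
// quib.on("media", (mediaPayload) => queue.push(mediaPayload?.media));
// quib.on("mark", () => queue.mark());
// quib.on("clear", () => queue.clear());
queue.push("chunk-1");
queue.push("chunk-2");
queue.mark();
queue.clear();
```

Your playback code would then drain `queue.chunks` into an audio element or Web Audio buffer, and call quib.mark() once playback finishes.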
- userText: Provides the user text input sent to the LLM.
```javascript
quib.on("userText", (data) => {
  console.log(data);
});
```
- assistantText: Provides the text response from the LLM.
```javascript
quib.on("assistantText", (data) => {
  console.log(data);
});
```
- endCall: Indicates the end of the conversation from the agent's side.
```javascript
quib.on("endCall", () => {
  console.log("Conversation end reached from agent side");
});
```
- close: Indicates that the websocket connection has been closed or is in the process of closing.
```javascript
quib.on("close", () => {
  console.log("Websocket has been closed or is in closing state");
});
```
- error: Triggered when there is an error with the websocket connection.
```javascript
quib.on("error", (error) => {
  console.log(error);
});
```
- gpt(text): Sends text input to the LLM.
```javascript
quib.gpt("Hello! How are you?");
```
- stop(): Initiates the closing of the websocket on your end.
```javascript
quib.stop();
```
- close(): Closes the websocket connection.
```javascript
quib.close();
```
- mark(): Indicates that you have finished playing the audio data for this interaction.
```javascript
quib.mark();
```
- interruption(text): Notifies the websocket of an interruption.
```javascript
quib.interruption("Pardon! I can't hear you.");
```
- keepAlive(): Keeps the connection alive during inactivity. Recommended to be sent every 5 seconds.
```javascript
quib.keepAlive();
```
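Rather than calling keepAlive() by hand, you can drive it from a timer. The helper below is a hypothetical sketch (startKeepAlive is not part of the quibbleai SDK); the same pattern works for the Transcriber's keepAlive() at its recommended 9-second interval.

```javascript
// Sketch: send keepAlive() on a fixed interval while a connection is idle.
// startKeepAlive is a hypothetical helper, not part of the quibbleai SDK.
function startKeepAlive(connection, intervalMs = 5000) {
  const timer = setInterval(() => connection.keepAlive(), intervalMs);
  // Return a function that stops the pings, to be called before closing.
  return () => clearInterval(timer);
}

// Usage with a Quib instance (5-second interval, as recommended above):
// const stopKeepAlive = startKeepAlive(quib, 5000);
// ...later, when shutting the connection down:
// stopKeepAlive();
// quib.stop();
```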
The Transcriber class is a middleware that connects to the Deepgram websocket for audio transcription from your microphone.
```javascript
import { Transcriber } from "quibbleai";
```
To create a Transcriber object, provide the following parameter:
- apiKey: Your Deepgram API Key (provided by QuibbleAI).
```javascript
const transcriber = new Transcriber({ apiKey: process.env.DEEPGRAM_API_KEY });
```
- connected: Triggered when a connection to the Deepgram websocket is established.
```javascript
transcriber.on("connected", () => {
  console.log("Connection to Deepgram websocket established");
});
```
- error: Triggered when there is an error with the Deepgram websocket connection.
```javascript
transcriber.on("error", (error) => {
  console.log(error);
});
```
- close: Indicates that the websocket connection has been closed or is in the process of closing.
```javascript
transcriber.on("close", () => {
  console.log("Websocket has been closed or is in closing state");
});
```
- partialTranscription: Provides a partial transcription which may not be the final transcription.
```javascript
transcriber.on("partialTranscription", (text) => {
  console.log(text);
});
```
- transcription: Provides the final transcription.
```javascript
transcriber.on("transcription", (text) => {
  console.log(text);
});
```
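Partial transcriptions are superseded by the final transcription, so a simple way to render live captions is to keep the latest partial separate from the committed text. TranscriptView below is a hypothetical helper written for this example, not part of the quibbleai SDK.

```javascript
// Sketch: maintain a live caption string from partial/final transcription
// events. TranscriptView is a hypothetical helper, not part of the SDK.
class TranscriptView {
  constructor() {
    this.committed = ""; // finalized text
    this.partial = "";   // latest partial, replaced on each partial event
  }
  onPartial(text) {
    this.partial = text;
  }
  onFinal(text) {
    // Append the final text and discard the superseded partial.
    this.committed += (this.committed ? " " : "") + text;
    this.partial = "";
  }
  render() {
    return `${this.committed} ${this.partial}`.trim();
  }
}

const view = new TranscriptView();
// Wiring it to a Transcriber instance would look like:
// transcriber.on("partialTranscription", (text) => view.onPartial(text));
// transcriber.on("transcription", (text) => view.onFinal(text));
view.onPartial("hello wor");
view.onFinal("hello world");
```

The final transcription is also a natural point to forward the user's utterance to the agent, e.g. by passing it to quib.gpt().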
- send(): Sends audio blob data to Deepgram.
```javascript
transcriber.send(audioBlob); // audioBlob: a Blob of recorded audio
```
- keepAlive(): Keeps the connection with the Deepgram websocket alive during inactivity. Recommended to be sent every 9 seconds.
```javascript
transcriber.keepAlive();
```
- close(): Closes the connection to Deepgram.
```javascript
transcriber.close();
```
The useVad hook is used for voice activity detection (VAD) with a media device object.
```javascript
import { useVad } from "quibbleai";
```
First, define the mediaDevice:
```javascript
const uMedia = await navigator.mediaDevices.getUserMedia({
  audio: {
    noiseSuppression: true,
    echoCancellation: true,
  },
});
```
Then, define the onSpeechStart and onSpeechEnd functions:
```javascript
const onSpeechStartFunction = () => {
  console.log("VAD detected start of speech");
};

const onSpeechEndFunction = () => {
  console.log("VAD detected end of speech");
};
```
Now, set up the useVad hook:
```javascript
const { start, pause } = useVad({
  userMedia: uMedia,
  onSpeechStart: onSpeechStartFunction,
  onSpeechEnd: onSpeechEndFunction,
});
```
The useVad hook returns two functions: start and pause.
- start(): Starts VAD processing of the audio.
```javascript
start();
```
- pause(): Pauses VAD processing of the audio.
```javascript
pause();
```
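If other parts of your app need to know whether the user is currently speaking (for example, to trigger quib.interruption() when the user talks over playing audio), the VAD callbacks can feed a small state tracker. SpeechState below is a hypothetical helper written for this example, not part of the quibbleai SDK.

```javascript
// Sketch: track speaking state from the useVad callbacks.
// SpeechState is a hypothetical helper, not part of the quibbleai SDK.
class SpeechState {
  constructor() {
    this.speaking = false;
    this.segments = 0; // completed speech segments observed so far
  }
  onSpeechStart() {
    this.speaking = true;
  }
  onSpeechEnd() {
    // Only count a segment if a start was actually seen.
    if (this.speaking) this.segments += 1;
    this.speaking = false;
  }
}

const state = new SpeechState();
// Passing the tracker's callbacks to the hook would look like:
// const { start, pause } = useVad({
//   userMedia: uMedia,
//   onSpeechStart: () => state.onSpeechStart(),
//   onSpeechEnd: () => state.onSpeechEnd(),
// });
state.onSpeechStart();
state.onSpeechEnd();
```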