# @khaveeai/react

🎭 React components and hooks for intelligent VRM AI avatars with advanced animation and lip synchronization.

```bash
npm install @khaveeai/react
```

## Features
- 🤖 Smart VRM Avatars - 3D character rendering with Three.js/R3F
- 🎤 Real-time Voice Chat - OpenAI Realtime API integration with WebRTC
- 👄 Automatic Lip Sync - MFCC-based phoneme detection works automatically with OpenAI Realtime
- 💬 Talking Animations - Automatic gesture animations during AI speech
- 🎬 Animation System - FBX animation loading with automatic Mixamo remapping
- 😊 Facial Expressions - Smooth VRM expression control with transitions
- 👁️ Natural Blinking - Randomized blinking animations for lifelike characters
- 🔊 Audio Lip Sync - File-based lip sync analysis with MFCC and DTW algorithms
- 🛠️ Function Calling - OpenAI tool integration for RAG and custom functions
- ⚡ High Performance - Optimized rendering with automatic cleanup
## Installation

```bash
# Core SDK
npm install @khaveeai/react @khaveeai/core
```

## Quick Start

### Basic Setup
```tsx
import { Canvas } from '@react-three/fiber';
import { KhaveeProvider, VRMAvatar } from '@khaveeai/react';

function App() {
  return (
    <KhaveeProvider>
      <Canvas>
        <VRMAvatar src="/models/character.vrm" />
      </Canvas>
    </KhaveeProvider>
  );
}
```

### Animations
```tsx
import { VRMAvatar, useVRMAnimations } from '@khaveeai/react';

function AnimatedAvatar() {
  const { animate } = useVRMAnimations();

  // Define animations (just provide URLs!)
  const animations = {
    idle: '/animations/idle.fbx', // Auto-plays
    dance: '/animations/dance.fbx',
    wave: '/animations/wave.fbx'
  };

  return (
    <>
      <VRMAvatar
        src="/models/character.vrm"
        animations={animations} // SDK handles everything!
      />
      <button onClick={() => animate('dance')}>Dance</button>
      <button onClick={() => animate('wave')}>Wave</button>
    </>
  );
}
```

### Facial Expressions
```tsx
import { VRMAvatar, useVRMExpressions } from '@khaveeai/react';

function ExpressiveAvatar() {
  const { setExpression } = useVRMExpressions();

  return (
    <>
      <VRMAvatar src="/models/character.vrm" />
      <button onClick={() => setExpression('happy', 1.0)}>Smile</button>
      <button onClick={() => setExpression('surprised', 0.8)}>Surprise</button>
    </>
  );
}
```

### Audio Lip Sync
```tsx
import { useAudioLipSync } from '@khaveeai/react';

function LipSyncDemo() {
  const { analyzeLipSync, stopLipSync, isAnalyzing, currentPhoneme } = useAudioLipSync();

  return (
    <div>
      <button
        onClick={() => analyzeLipSync('/audio/speech.wav', {
          sensitivity: 0.8,
          intensityMultiplier: 3.0
        })}
        disabled={isAnalyzing}
      >
        {isAnalyzing ? 'Analyzing...' : 'Start Lip Sync'}
      </button>
      <button onClick={stopLipSync}>Stop</button>
      {currentPhoneme && (
        <div>
          <p>Phoneme: {currentPhoneme.phoneme}</p>
          <p>Intensity: {(currentPhoneme.intensity * 100).toFixed(1)}%</p>
        </div>
      )}
    </div>
  );
}
```

## API Reference
### Components

#### KhaveeProvider

Root provider that manages VRM state and optional provider configuration.

```tsx
interface KhaveeConfig {
  llm?: LLMProvider;           // Optional: Chat AI provider
  tts?: TTSProvider;           // Optional: Text-to-speech provider
  realtime?: RealtimeProvider; // Optional: Real-time voice chat provider
  tools?: RealtimeTool[];      // Optional: Custom functions
}

<KhaveeProvider config={config}> {/* config is optional */}
  {children}
</KhaveeProvider>
```

#### VRMAvatar
3D VRM character component with automatic animation, lip sync, and talking animations.
```tsx
interface VRMAvatarProps {
  src: string;                        // Path to .vrm file
  position?: [number, number, number];
  rotation?: [number, number, number];
  scale?: [number, number, number];
  animations?: AnimationConfig;       // FBX animation URLs
  enableBlinking?: boolean;           // Enable natural blinking (default: true)
  enableTalkingAnimations?: boolean;  // Enable gestures during AI speech (default: true)
}
```

Animation Config:

```tsx
interface AnimationConfig {
  [name: string]: string; // Animation name -> FBX file URL
}

// Example
const animations = {
  idle: '/animations/breathing.fbx',  // Auto-plays on load
  walk: '/animations/walking.fbx',
  dance: '/animations/dancing.fbx',
  talking: '/animations/talking.fbx', // Played during AI speech
  gesture1: '/animations/gesture.fbx' // Also played during speech
};

// Note: Animations with 'talk', 'gesture', or 'speak' in the name
// are automatically played randomly when chatStatus === 'speaking'
```

### Hooks
#### useRealtime()

Real-time voice chat with OpenAI Realtime API.
```tsx
const {
  // Connection
  isConnected: boolean,
  connect: () => Promise<void>,
  disconnect: () => Promise<void>,

  // Chat state
  chatStatus: 'stopped' | 'ready' | 'listening' | 'speaking' | 'thinking',
  conversation: Conversation[],
  currentVolume: number,
  isThinking: boolean,

  // Lip sync (automatic with VRMAvatar)
  currentPhoneme: PhonemeData | null,
  startAutoLipSync: () => Promise<void>,
  stopAutoLipSync: () => void,

  // Actions
  sendMessage: (text: string) => Promise<void>,
  interrupt: () => void,
  registerFunction: (tool: RealtimeTool) => void
} = useRealtime();
```
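
For instance, a minimal voice-chat control can be built from just the documented return values (an illustrative sketch; assumes it renders inside a `KhaveeProvider` configured with a realtime provider):

```tsx
import { useRealtime } from '@khaveeai/react';

function VoiceChatControls() {
  const { isConnected, connect, disconnect, chatStatus, sendMessage, interrupt } = useRealtime();

  return (
    <div>
      <button onClick={isConnected ? disconnect : connect}>
        {isConnected ? 'Disconnect' : 'Connect'}
      </button>
      <button onClick={() => sendMessage('Hello!')} disabled={!isConnected}>
        Say Hello
      </button>
      {chatStatus === 'speaking' && <button onClick={interrupt}>Interrupt</button>}
      <p>Status: {chatStatus}</p>
    </div>
  );
}
```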
#### useAudioLipSync()

Analyze audio files for phoneme detection and lip sync.
```tsx
const {
  analyzeLipSync: (audioUrl: string, options?: {
    sensitivity?: number;         // 0.1 to 1.0
    smoothing?: number;           // 0.1 to 1.0
    intensityMultiplier?: number; // 1.0 to 5.0
    minIntensity?: number;        // 0.0 to 1.0
    onPhonemeChange?: (phoneme: PhonemeData) => void;
  }) => Promise<void>,
  stopLipSync: () => void,
  isAnalyzing: boolean,
  mouthState: MouthState,         // Current mouth state
  currentPhoneme: PhonemeData | null,
  audioElement: HTMLAudioElement | null
} = useAudioLipSync();
```

#### useVRMExpressions()

Control VRM facial expressions with smooth transitions.
```tsx
const {
  expressions: Record<string, number>,
  setExpression: (name: string, value: number) => void,
  resetExpressions: () => void,
  setMultipleExpressions: (expressions: Record<string, number>) => void
} = useVRMExpressions();
```
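
As a small illustrative sketch, an expression can be applied on mount and cleaned up on unmount ('happy' is a standard VRM preset name; your model's expression names may differ):

```tsx
import { useEffect } from 'react';
import { useVRMExpressions } from '@khaveeai/react';

function SmileOnMount() {
  const { setExpression, resetExpressions } = useVRMExpressions();

  useEffect(() => {
    setExpression('happy', 0.8);     // expression weights are 0-1
    return () => resetExpressions(); // back to neutral on unmount
  }, [setExpression, resetExpressions]);

  return null;
}
```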
#### useVRMAnimations()

Control VRM body animations with smooth transitions.
```tsx
const {
  currentAnimation: string | null,
  animate: (name: string) => void,  // Play animation by name
  stopAnimation: () => void,        // Stop current animation
  availableAnimations: string[]     // List of loaded animations
} = useVRMAnimations();
```

#### useVRM()

Access the raw VRM model instance.
```tsx
const vrm: VRM | null = useVRM();
```
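
The raw instance is useful for direct bone access. A hypothetical sketch, assuming a `@pixiv/three-vrm` model with a normalized humanoid rig (the component must render inside the R3F `<Canvas>`):

```tsx
import { useFrame } from '@react-three/fiber';
import { useVRM } from '@khaveeai/react';

function HeadSway() {
  const vrm = useVRM();

  useFrame(({ clock }) => {
    // Gently sway the head via the humanoid bone map
    const head = vrm?.humanoid?.getNormalizedBoneNode('head');
    if (head) head.rotation.z = Math.sin(clock.elapsedTime) * 0.05;
  });

  return null;
}
```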
#### useKhavee()

Access the complete SDK context (advanced usage).
```tsx
const {
  config,                 // Optional provider config
  vrm,                    // VRM instance
  setVrm,                 // Set VRM instance
  expressions,            // Current expressions
  setExpression,          // Set single expression
  resetExpressions,       // Reset all expressions
  setMultipleExpressions, // Set multiple expressions
  currentAnimation,       // Current animation name
  animate,                // Play animation
  stopAnimation,          // Stop animation
  availableAnimations,    // Available animations
  realtimeProvider        // Realtime provider instance
} = useKhavee();
```

## Advanced Usage

### Automatic Lip Sync with OpenAI Realtime

When using `VRMAvatar` with `OpenAIRealtimeProvider`, lip sync happens automatically! The avatar's mouth movements are synchronized with the AI's speech using MFCC-based phoneme detection:

```tsx
import { Canvas } from '@react-three/fiber';
import { KhaveeProvider, VRMAvatar } from '@khaveeai/react';
import { OpenAIRealtimeProvider } from '@khaveeai/providers-openai-realtime';

const realtime = new OpenAIRealtimeProvider({
  apiKey: process.env.NEXT_PUBLIC_OPENAI_API_KEY!,
  voice: 'coral',
});

function App() {
  return (
    <KhaveeProvider config={{ realtime }}>
      <Canvas>
        <VRMAvatar src="/models/character.vrm" />
      </Canvas>
    </KhaveeProvider>
  );
}
```

The system automatically:
- ✅ Analyzes AI voice audio in real-time
- ✅ Detects phonemes (aa, ee, ih, ou, oh)
- ✅ Applies mouth shapes to VRM expressions
- ✅ Plays talking/gesture animations randomly
- ✅ Resets mouth to neutral when AI stops speaking
### Audio File Lip Sync

For pre-recorded audio files, use the `useAudioLipSync` hook with advanced MFCC (Mel-Frequency Cepstral Coefficients) analysis and Dynamic Time Warping:

```tsx
import { useAudioLipSync } from '@khaveeai/react';

function AdvancedLipSync() {
  const { analyzeLipSync, currentPhoneme, mouthState } = useAudioLipSync();

  const startAnalysis = () => {
    analyzeLipSync('/audio/speech.wav', {
      sensitivity: 0.8,         // Higher = more sensitive
      intensityMultiplier: 3.0, // Boost mouth movement
      minIntensity: 0.3,        // Minimum threshold
      onPhonemeChange: (phoneme) => {
        console.log('Detected:', phoneme.phoneme, phoneme.intensity);
      }
    });
  };

  return (
    <div>
      <button onClick={startAnalysis}>Analyze</button>
      {/* Real-time mouth state display */}
      <h3>Mouth State:</h3>
      {Object.entries(mouthState || {}).map(([viseme, value]) => (
        <div key={viseme}>
          <span>{viseme}: {(value * 100).toFixed(1)}%</span>
          <div style={{
            width: `${value * 100}%`,
            height: '20px',
            backgroundColor: '#3b82f6'
          }} />
        </div>
      ))}
    </div>
  );
}
```

### Expression Presets
```tsx
import { useVRMExpressions } from '@khaveeai/react';

function EmotionalPresets() {
  const { setMultipleExpressions, resetExpressions } = useVRMExpressions();

  const emotions = {
    happy: { happy: 0.9, relaxed: 0.3 },
    sad: { sad: 0.8, relaxed: 0.2 },
    surprised: { surprised: 0.9, aa: 0.4 },
    confused: { confused: 0.7, worried: 0.3 },
    excited: { happy: 0.8, surprised: 0.6 },
  };

  return (
    <div>
      {Object.entries(emotions).map(([name, expression]) => (
        <button
          key={name}
          onClick={() => setMultipleExpressions(expression)}
        >
          {name}
        </button>
      ))}
      <button onClick={resetExpressions}>Reset</button>
    </div>
  );
}
```

### Animation Sequences
```tsx
import { useVRMAnimations } from '@khaveeai/react';

function AnimationSequence() {
  const { animate, currentAnimation } = useVRMAnimations();

  const playSequence = async () => {
    animate('walk');
    await new Promise(resolve => setTimeout(resolve, 3000));
    animate('dance');
    await new Promise(resolve => setTimeout(resolve, 5000));
    animate('idle');
  };

  return (
    <div>
      <button onClick={playSequence}>Play Sequence</button>
      <p>Current: {currentAnimation}</p>
    </div>
  );
}
```

## Providers
### OpenAIRealtimeProvider

```tsx
import { OpenAIRealtimeProvider } from '@khaveeai/providers-openai-realtime';

const provider = new OpenAIRealtimeProvider({
  apiKey: string,              // OpenAI API key (required)
  model?: string,              // Model name (default: 'gpt-4o-realtime-preview-2025-06-03')
  voice?: 'alloy' | 'coral' | 'echo' | 'sage' | 'shimmer', // Voice selection
  instructions?: string,       // System prompt
  temperature?: number,        // Response randomness (0-1)
  tools?: RealtimeTool[],      // Custom functions for RAG, etc.
  language?: string,           // Response language code
  turnServers?: RTCIceServer[] // Custom TURN servers for WebRTC
});

// Lip sync is automatic when used with VRMAvatar!
```
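
Custom functions for RAG can be passed via `tools` (or registered later with `registerFunction` from `useRealtime()`). The exact `RealtimeTool` shape isn't spelled out in this README, so the sketch below assumes an OpenAI-style definition with `name`, `description`, JSON-schema `parameters`, and a `handler` callback; treat the field names as illustrative:

```tsx
import { OpenAIRealtimeProvider } from '@khaveeai/providers-openai-realtime';

// Hypothetical tool definition; field names are assumptions
const searchDocs = {
  name: 'search_docs',
  description: 'Search the product knowledge base',
  parameters: {
    type: 'object',
    properties: {
      query: { type: 'string', description: 'Search query' }
    },
    required: ['query']
  },
  handler: async ({ query }: { query: string }) => {
    // Whatever the handler returns is sent back to the model
    const res = await fetch(`/api/search?q=${encodeURIComponent(query)}`);
    return res.json();
  }
};

const providerWithTools = new OpenAIRealtimeProvider({
  apiKey: process.env.NEXT_PUBLIC_OPENAI_API_KEY!,
  tools: [searchDocs]
});
```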
### Mock Providers

```tsx
import { MockLLM, MockTTS } from '@khaveeai/providers-mock';

const mockConfig = {
  llm: new MockLLM(), // Simulated AI responses
  tts: new MockTTS(), // Simulated voice synthesis
  tools: []
};
```
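
Wiring the mocks up is the same as any real provider config; here is a minimal test-scene sketch using the `mockConfig` defined above:

```tsx
import { Canvas } from '@react-three/fiber';
import { KhaveeProvider, VRMAvatar } from '@khaveeai/react';

function TestScene() {
  return (
    <KhaveeProvider config={mockConfig}>
      <Canvas>
        <VRMAvatar src="/models/character.vrm" />
      </Canvas>
    </KhaveeProvider>
  );
}
```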
## Examples

### Complete Interactive App
```tsx
import React from 'react';
import { Canvas } from '@react-three/fiber';
import { Environment, OrbitControls } from '@react-three/drei';
import {
  KhaveeProvider,
  VRMAvatar,
  useRealtime,
  useAudioLipSync,
  useVRMExpressions,
  useVRMAnimations
} from '@khaveeai/react';
import { OpenAIRealtimeProvider } from '@khaveeai/providers-openai-realtime';

const realtimeProvider = new OpenAIRealtimeProvider({
  apiKey: process.env.REACT_APP_OPENAI_API_KEY,
  voice: 'shimmer',
  instructions: 'You are a friendly AI assistant.',
});

function ControlPanel() {
  const { isConnected, connect, disconnect, conversation } = useRealtime();
  const { analyzeLipSync, stopLipSync, isAnalyzing, currentPhoneme } = useAudioLipSync();
  const { setExpression, resetExpressions } = useVRMExpressions();
  const { animate, currentAnimation } = useVRMAnimations();

  return (
    <div>
      {/* Realtime Voice Chat */}
      <h3>Voice Chat</h3>
      <button onClick={isConnected ? disconnect : connect}>
        {isConnected ? 'Disconnect' : 'Connect'}
      </button>

      {/* Lip Sync Controls */}
      <h3>Lip Sync</h3>
      <button
        onClick={() => analyzeLipSync('/audio/sample.wav')}
        disabled={isAnalyzing}
      >
        {isAnalyzing ? 'Analyzing...' : 'Test Lip Sync'}
      </button>
      <button onClick={stopLipSync}>Stop</button>
      {currentPhoneme && (
        <p>Phoneme: {currentPhoneme.phoneme} ({currentPhoneme.intensity.toFixed(2)})</p>
      )}

      {/* Expression Controls */}
      <h3>Expressions</h3>
      <button onClick={() => setExpression('happy', 1.0)}>Happy</button>
      <button onClick={resetExpressions}>Reset</button>

      {/* Animation Controls */}
      <h3>Animations</h3>
      <button onClick={() => animate('dance')}>Dance</button>
      <p>Current: {currentAnimation}</p>

      {/* Conversation History */}
      <h3>Conversation</h3>
      {conversation.map((msg, i) => (
        <p key={i}>{msg.role}: {msg.text}</p>
      ))}
    </div>
  );
}

function Avatar3D() {
  const animations = {
    idle: '/animations/breathing.fbx',
    walk: '/animations/walking.fbx',
    dance: '/animations/dancing.fbx'
  };

  return (
    <VRMAvatar
      src="/models/character.vrm"
      animations={animations}
    />
  );
}

export default function App() {
  return (
    <KhaveeProvider config={{ realtime: realtimeProvider }}>
      <Canvas>
        <Environment preset="sunset" />
        <Avatar3D />
        <OrbitControls />
      </Canvas>
      <ControlPanel />
    </KhaveeProvider>
  );
}
```
## TypeScript Support
Full TypeScript support with comprehensive type definitions:
```tsx
import type {
  VRMAvatarProps,
  AnimationConfig,
  KhaveeConfig,
  RealtimeProvider,
  RealtimeTool,
  MouthState,
  PhonemeData,
  ChatStatus,
  Conversation
} from '@khaveeai/react';
```

## Performance Tips
- Frustum Culling: Disabled automatically for VRM models
- Vertex Optimization: Uses VRMUtils for performance optimization
- Smooth Transitions: Configurable lerp factors for animations (delta * 8; see the sketch below)
- MFCC Audio Analysis: Optimized real-time frequency analysis for lip sync
- Memory Management: Automatic cleanup of audio contexts and animation mixers
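
As an illustration of the `delta * 8` lerp pattern (a sketch of the general technique, not SDK source):

```tsx
import { useRef } from 'react';
import { useFrame } from '@react-three/fiber';
import { MathUtils } from 'three';

// Frame-rate-independent smoothing toward a target value
function useSmoothedValue(target: number) {
  const value = useRef(0);
  useFrame((_, delta) => {
    // Clamp the factor so a long frame never overshoots the target
    value.current = MathUtils.lerp(value.current, target, Math.min(1, delta * 8));
  });
  return value;
}
```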
## Browser Support
- ✅ Chrome 88+
- ✅ Firefox 85+
- ✅ Safari 14.1+
- ✅ Edge 88+
Requirements:
- WebRTC support (for real-time features)
- Web Audio API (for lip sync analysis)
- WebGL 2.0 (for VRM rendering)
- Meyda library (for MFCC analysis)
## Troubleshooting

### Common Issues
Audio Analysis Not Working:
```tsx
// Check Meyda import and audio context
const { analyzeLipSync, isAnalyzing } = useAudioLipSync();

if (!isAnalyzing) {
  console.log('Make sure Meyda is installed: npm install meyda');
}
```

VRM Model Not Loading:
```tsx
// Check model format and path
<VRMAvatar
  src="/models/character.vrm" // Must be .vrm format
  onLoad={() => console.log('VRM loaded successfully')}
/>
```

Animations Not Playing:
```tsx
// Check FBX file paths and format
const animations = {
  idle: '/animations/idle.fbx', // Must be accessible FBX files
  walk: '/animations/walk.fbx'
};

// Check if animations are loaded
const { availableAnimations } = useVRMAnimations();
console.log('Available animations:', availableAnimations);
```

Expressions Not Working:
```tsx
// Check VRM model has expression support
const { expressions } = useVRMExpressions();
const vrm = useVRM();

if (vrm?.expressionManager) {
  console.log('Expression support:', Object.keys(vrm.expressionManager.expressionMap));
}
```

## Contributing

We welcome contributions! Please see our Contributing Guide.

## License

MIT © KhaveeAI
---
Need help? Check out our examples or open an issue.