# @viji-dev/core

Universal execution engine for Viji Creative scenes
A secure, feature-rich JavaScript/TypeScript library that provides the foundation for creative scene execution across all Viji platform contexts. The core offers identical IFrame + WebWorker execution in every context, with comprehensive parameter management, audio/video analysis, user interaction handling, and performance optimization.
## Installation

```bash
npm install @viji-dev/core
```

## Quick Start

```typescript
import { VijiCore } from '@viji-dev/core';
// Artist scene code
const sceneCode = `
// Define parameters using helper functions
const color = viji.color('#ff6b6b', {
label: 'Shape Color',
description: 'Color of the animated shape',
group: 'appearance'
});
const size = viji.slider(50, {
min: 10,
max: 150,
step: 5,
label: 'Shape Size',
description: 'Size of the animated shape',
group: 'appearance'
});
const speed = viji.slider(1.0, {
min: 0.1,
max: 3.0,
step: 0.1,
label: 'Animation Speed',
description: 'Speed of the animation',
group: 'animation'
});
// Main render function
function render(viji) {
const ctx = viji.useContext('2d');
// Clear canvas
ctx.fillStyle = '#2c3e50';
ctx.fillRect(0, 0, viji.width, viji.height);
// Animated shape
const time = viji.time * speed.value;
const x = viji.width / 2 + Math.sin(time) * 100;
const y = viji.height / 2 + Math.cos(time) * 100;
ctx.fillStyle = color.value;
ctx.beginPath();
ctx.arc(x, y, size.value / 2, 0, Math.PI * 2);
ctx.fill();
}
`;
// Create core instance
const core = new VijiCore({
hostContainer: document.getElementById('scene-container'),
sceneCode: sceneCode,
frameRateMode: 'full',
allowUserInteraction: true
});
// Initialize and start rendering
await core.initialize();
console.log('Scene is running!');
```
The `VijiCoreConfig` interface defines all available configuration options:
```typescript
interface VijiCoreConfig {
// Required configuration
hostContainer: HTMLElement; // Container element for the scene
sceneCode: string; // Artist JavaScript code with render function
// Performance configuration
frameRateMode?: 'full' | 'half'; // 'full' = 60fps, 'half' = 30fps
resolution?: number | { width: number; height: number }; // Fraction of container (0.1-1.0) or explicit pixels
// Input streams
audioStream?: MediaStream; // Audio input for analysis
videoStream?: MediaStream; // Video input for analysis
// Audio analysis configuration
analysisConfig?: {
fftSize?: number; // FFT size for frequency analysis (default: 2048)
smoothing?: number; // Smoothing factor 0-1 (default: 0.8)
frequencyBands?: FrequencyBand[]; // Custom frequency bands
beatDetection?: boolean; // Enable beat detection
onsetDetection?: boolean; // Enable onset detection
};
// Parameter system
parameters?: ParameterGroup[]; // Initial parameter values
// Feature toggles
noInputs?: boolean; // Disable all input processing
allowUserInteraction?: boolean; // Enable mouse/keyboard/touch events
allowDeviceInteraction?: boolean; // Enable device sensors (motion/orientation/geolocation)
}
```
#### Creation and Initialization
```typescript
// Create core instance
const core = new VijiCore({
hostContainer: document.getElementById('scene-container'),
sceneCode: sceneCode,
frameRateMode: 'full',
allowUserInteraction: true
});
// Initialize the core (required before use)
await core.initialize();
// Check if core is ready for operations
if (core.ready) {
console.log('Core is ready for use');
}
// Get current configuration
const config = core.configuration;
console.log('Current frame rate mode:', config.frameRateMode);
```
#### Performance Control
```typescript
// Frame rate control
await core.setFrameRate('full'); // Set to 60fps mode
await core.setFrameRate('half'); // Set to 30fps mode
// Resolution control
await core.setResolution(0.75); // Set to 75% of container size
await core.setResolution(0.5); // Set to 50% for performance
await core.updateResolution(); // Auto-detect container size changes
// Get performance statistics
const stats = core.getStats();
console.log('Current FPS:', stats.frameRate.effectiveRefreshRate);
console.log('Canvas size:', stats.resolution);
console.log('Scale factor:', stats.scale);
console.log('Parameter count:', stats.parameterCount);
```
#### Debug and Development
```typescript
// Enable debug logging
core.setDebugMode(true);
// Check debug mode status
const isDebugEnabled = core.getDebugMode();
// Debug mode provides detailed logging for:
// - Initialization process
// - Communication between components
// - Parameter system operations
// - Audio/video stream processing
// - Performance statistics
```
The parameter system provides a powerful way to create interactive scenes with automatic UI generation.
#### Parameter Definition and Access
```typescript
// Listen for parameter definitions from artist code
core.onParametersDefined((groups) => {
console.log('Parameters available:', groups);
// Each group contains:
// - groupName: string
// - category: 'audio' | 'video' | 'interaction' | 'general'
// - description: string
// - parameters: Record<string, ParameterDefinition> (definitions keyed by parameter name)
// Generate UI based on parameter groups
generateParameterUI(groups);
});
// Set individual parameter values
await core.setParameter('color', '#ff0000');
await core.setParameter('size', 75);
await core.setParameter('enabled', true);
// Set multiple parameters efficiently
await core.setParameters({
'color': '#00ff00',
'size': 100,
'speed': 2.0,
'enabled': false
});
// Get current parameter values
const values = core.getParameterValues();
const color = core.getParameter('color');
// Listen for parameter changes
core.onParameterChange('size', (value) => {
console.log('Size parameter changed to:', value);
});
// Listen for parameter errors
core.onParameterError((error) => {
console.error('Parameter error:', error.message);
console.error('Error code:', error.code);
});
```
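The `generateParameterUI(groups)` call above is host code you supply, not part of the library. A minimal sketch, assuming a `parameter-panel` element, a `core` instance in scope, and the definition fields (`type`, `min`, `max`, `step`) used in this README:

```typescript
import type { ParameterGroup } from '@viji-dev/core'; // type export assumed

function generateParameterUI(groups: ParameterGroup[]): void {
  const panel = document.getElementById('parameter-panel'); // assumed host element
  if (!panel) return;
  panel.innerHTML = '';
  for (const group of groups) {
    const fieldset = document.createElement('fieldset');
    const legend = document.createElement('legend');
    legend.textContent = group.groupName;
    fieldset.appendChild(legend);
    for (const [name, def] of Object.entries<any>(group.parameters)) {
      const input = document.createElement('input');
      // Map parameter types to native input types
      switch (def.type) {
        case 'slider':
        case 'number':
          input.type = def.type === 'slider' ? 'range' : 'number';
          input.min = String(def.min);
          input.max = String(def.max);
          input.step = String(def.step);
          break;
        case 'color':
          input.type = 'color';
          break;
        case 'toggle':
          input.type = 'checkbox';
          break;
        default:
          input.type = 'text';
      }
      // Forward edits to the scene through the unified parameter API
      input.addEventListener('input', () => {
        const value = def.type === 'toggle' ? input.checked : input.value;
        void core.setParameter(name, value);
      });
      fieldset.appendChild(input);
    }
    panel.appendChild(fieldset);
  }
}
```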
#### Capability-Aware Parameters
```typescript
// Get all parameter groups (unfiltered, use for saving scene parameters)
const allGroups = core.getAllParameterGroups();
// Get parameter groups filtered by active capabilities (for UI)
const visibleGroups = core.getVisibleParameterGroups();
// Check current capabilities
const capabilities = core.getCapabilities();
console.log('Audio available:', capabilities.hasAudio);
console.log('Video available:', capabilities.hasVideo);
console.log('Interaction enabled:', capabilities.hasInteraction);
// Check if specific parameter category is active
const isAudioActive = core.isCategoryActive('audio');
const isVideoActive = core.isCategoryActive('video');
// Parameters are automatically categorized:
// - 'audio': Only shown when audio stream is connected
// - 'video': Only shown when video stream is connected
// - 'interaction': Only shown when user interaction is enabled
// - 'general': Always available
```
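A practical pattern is to rebuild the visible UI whenever capabilities change, while persisting the unfiltered set. This sketch reuses the hypothetical `generateParameterUI` helper from above:

```typescript
// Rebuild visible parameter UI when streams connect or disconnect
core.onCapabilitiesChange(() => {
  generateParameterUI(core.getVisibleParameterGroups());
});

// When saving a scene preset, store the unfiltered groups so that audio/video
// parameters survive even while those streams are disconnected
const preset = {
  groups: core.getAllParameterGroups(),
  values: core.getParameterValues()
};
```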
#### Audio Stream Management
```typescript
// Set audio stream for analysis
const audioStream = await navigator.mediaDevices.getUserMedia({
audio: {
echoCancellation: false,
noiseSuppression: false,
autoGainControl: false
}
});
await core.setAudioStream(audioStream);
// Configure audio analysis with the new namespace API
// Basic control
core.audio.setSensitivity(1.5); // Increase sensitivity (0.5-2.0)
// Tap tempo for manual BPM control
core.audio.beat.tap(); // Tap to set tempo
core.audio.beat.clearTaps(); // Clear tap history
const tapCount = core.audio.beat.getTapCount();
// Beat mode control
core.audio.beat.setMode('auto'); // Automatic beat detection (default)
core.audio.beat.setMode('manual'); // Manual BPM control
core.audio.beat.setBPM(120); // Set manual BPM (60-240)
const currentBPM = core.audio.beat.getBPM();
// Beat phase control for fine-tuning
core.audio.beat.nudge(0.1); // Nudge phase forward
core.audio.beat.resetPhase(); // Reset phase to zero
// Advanced configuration (optional)
core.audio.advanced.setFFTSize(2048); // FFT resolution (higher = more accurate)
core.audio.advanced.setSmoothing(0.8); // Smoothing factor (0-1)
core.audio.advanced.setAutoGain(true); // Auto-gain normalization
core.audio.advanced.setBeatDetection(true); // Enable beat detection
core.audio.advanced.setOnsetDetection(true); // Enable onset detection
// Get current audio state
const state = core.audio.getState();
console.log('BPM:', state.currentBPM);
console.log('Confidence:', state.confidence);
console.log('Mode:', state.mode);
console.log('Is Locked:', state.isLocked);
// Get current audio stream
const currentStream = core.getAudioStream();
// Disconnect audio
await core.setAudioStream(null);
```
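As a usage sketch, the tap-tempo API maps naturally onto host UI buttons; the element ids here are assumptions:

```typescript
// Tap along with the music to set tempo
document.getElementById('tap-button')?.addEventListener('click', () => {
  core.audio.beat.tap();
  console.log('Taps:', core.audio.beat.getTapCount(), 'BPM:', core.audio.beat.getBPM());
});

// Clear taps and re-align the beat phase
document.getElementById('reset-button')?.addEventListener('click', () => {
  core.audio.beat.clearTaps();
  core.audio.beat.resetPhase();
});
```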
#### Video Stream Management
```typescript
// Set video stream for analysis
const videoStream = await navigator.mediaDevices.getUserMedia({
video: {
width: { ideal: 640 },
height: { ideal: 480 },
frameRate: { ideal: 30 }
}
});
await core.setVideoStream(videoStream);
// Video analysis includes:
// - Real-time frame processing
// - Frame data access for custom analysis
// - Brightness and motion detection
// - Custom computer vision processing
// Disconnect video
await core.setVideoStream(null);
```
#### Interaction Management
```typescript
// Enable or disable user interactions at runtime
await core.setInteractionEnabled(true); // Enable mouse, keyboard, and touch
await core.setInteractionEnabled(false); // Disable all interactions
// Get current interaction state
const isInteractionEnabled = core.getInteractionEnabled();
// Interaction state affects:
// - Mouse, keyboard, and touch event processing
// - Parameter visibility (interaction category parameters)
// - Scene behavior that depends on user input
// Note: Interaction state is separate from initialization config
// You can toggle interactions regardless of initial allowUserInteraction value
// The interaction system is always available for runtime control
```
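For example, the runtime toggle can be bound to a host checkbox (the `interaction-toggle` element id is an assumption):

```typescript
const toggle = document.getElementById('interaction-toggle') as HTMLInputElement;
toggle.checked = core.getInteractionEnabled(); // reflect current state
toggle.addEventListener('change', () => {
  void core.setInteractionEnabled(toggle.checked);
});
```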
#### Capability Change Monitoring
```typescript
// Listen for capability changes
core.onCapabilitiesChange((capabilities) => {
console.log('Capabilities updated:', capabilities);
// Update UI based on new capabilities
if (capabilities.hasAudio) {
showAudioControls();
} else {
hideAudioControls();
}
if (capabilities.hasVideo) {
showVideoControls();
} else {
hideVideoControls();
}
if (capabilities.hasInteraction) {
showInteractionControls();
} else {
hideInteractionControls();
}
});
```

#### Core Lifecycle Events
```typescript
// Core is ready for operations
if (core.ready) {
// All systems initialized and running
console.log('Core is fully operational');
}
// Check if parameters are initialized
if (core.parametersReady) {
// Parameter system is ready
console.log('Parameters are available');
}
```
#### Cleanup and Resource Management
```typescript
// Destroy instance and clean up all resources
await core.destroy();
// This automatically:
// - Stops all rendering loops
// - Disconnects audio/video streams
// - Cleans up WebWorker and IFrame
// - Releases all event listeners
// - Clears parameter system
// - Frees memory resources
```
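A minimal host lifecycle sketch that guarantees cleanup; the `container` element and `sceneCode` string are assumed:

```typescript
const core = new VijiCore({ hostContainer: container, sceneCode });
await core.initialize();

// Release worker, iframe, and streams when the page goes away
window.addEventListener('beforeunload', () => {
  void core.destroy();
});
```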
The artist API provides a comprehensive set of tools for creating interactive, audio-reactive, and video-responsive scenes.
```typescript
function render(viji) {
// Get canvas contexts
const ctx = viji.useContext('2d'); // 2D rendering context
const gl = viji.useContext('webgl'); // WebGL rendering context
// Canvas properties
viji.canvas; // OffscreenCanvas object
viji.width; // Canvas width in pixels
viji.height; // Canvas height in pixels
viji.pixelRatio; // Device pixel ratio for crisp rendering
// Example: Draw a responsive circle
const centerX = viji.width / 2;
const centerY = viji.height / 2;
const radius = Math.min(viji.width, viji.height) * 0.1;
ctx.fillStyle = '#ff6b6b';
ctx.beginPath();
ctx.arc(centerX, centerY, radius, 0, Math.PI * 2);
ctx.fill();
}
```

The timing system provides FPS-independent timing data for smooth animations:
```typescript
function render(viji) {
const ctx = viji.useContext('2d');
// Timing data (FPS independent)
viji.time; // Elapsed time in seconds since scene start
viji.deltaTime; // Time since last frame in seconds
viji.frameCount; // Total number of frames rendered
viji.fps; // Current frames per second
// Example: Smooth animation regardless of frame rate
const animationSpeed = 2.0; // rotations per second
const rotation = (viji.time * animationSpeed * Math.PI * 2) % (Math.PI * 2);
ctx.save();
ctx.translate(viji.width / 2, viji.height / 2);
ctx.rotate(rotation);
ctx.fillRect(-25, -25, 50, 50);
ctx.restore();
}
```

The parameter system allows artists to define interactive parameters that automatically generate UI controls.
#### Parameter Definition
```typescript
// Define parameters (call once outside render loop)
const color = viji.color('#ff6b6b', {
label: 'Primary Color',
description: 'Main color for shapes',
group: 'appearance',
category: 'general'
});
const size = viji.slider(50, {
min: 10,
max: 150,
step: 5,
label: 'Shape Size',
description: 'Size of shapes in pixels',
group: 'appearance',
category: 'general'
});
const speed = viji.slider(1.0, {
min: 0.1,
max: 3.0,
step: 0.1,
label: 'Animation Speed',
description: 'Speed of animation in rotations per second',
group: 'animation',
category: 'general'
});
const useAudio = viji.toggle(false, {
label: 'Audio Reactive',
description: 'Make shapes react to audio input',
group: 'audio',
category: 'audio'
});
const shapeType = viji.select('circle', {
options: ['circle', 'square', 'triangle', 'star'],
label: 'Shape Type',
description: 'Type of shape to draw',
group: 'appearance',
category: 'general'
});
const title = viji.text('My Scene', {
label: 'Scene Title',
description: 'Title displayed in the scene',
group: 'text',
category: 'general',
maxLength: 50
});
const particleCount = viji.number(5, {
min: 1,
max: 20,
step: 1,
label: 'Particle Count',
description: 'Number of particles to render',
group: 'animation',
category: 'general'
});
```
#### Parameter Usage in Render Loop
```typescript
function render(viji) {
const ctx = viji.useContext('2d');
// Fast parameter access (proxy-based)
ctx.fillStyle = color.value; // Get current color value
const radius = size.value / 2; // Get current size value
const animationSpeed = speed.value; // Get current speed value
// Clear canvas
ctx.fillStyle = '#2c3e50';
ctx.fillRect(0, 0, viji.width, viji.height);
// Draw title
ctx.fillStyle = 'white';
ctx.font = '20px Arial';
ctx.textAlign = 'center';
ctx.fillText(title.value, viji.width / 2, 30);
// Draw particles
for (let i = 0; i < particleCount.value; i++) {
const angle = (i / particleCount.value) * Math.PI * 2 + (viji.time * animationSpeed);
const x = viji.width / 2 + Math.cos(angle) * 100;
const y = viji.height / 2 + Math.sin(angle) * 100;
ctx.fillStyle = color.value;
ctx.beginPath();
switch (shapeType.value) {
case 'circle':
ctx.arc(x, y, radius, 0, Math.PI * 2);
break;
case 'square':
ctx.rect(x - radius, y - radius, radius * 2, radius * 2);
break;
case 'triangle':
ctx.moveTo(x, y - radius);
ctx.lineTo(x - radius, y + radius);
ctx.lineTo(x + radius, y + radius);
ctx.closePath();
break;
}
ctx.fill();
}
}
```

The audio system provides real-time analysis of audio input with comprehensive frequency and volume data.
#### Audio API Overview
```typescript
function render(viji) {
const ctx = viji.useContext('2d');
const audio = viji.audio;
if (audio.isConnected) {
// Volume analysis with smooth values
const volume = audio.volume.current; // 0-1 current volume level (RMS-based)
const peak = audio.volume.peak; // 0-1 peak amplitude (instant)
const smooth = audio.volume.smoothed; // 0-1 smoothed volume (200ms decay)
// Frequency bands (0-1 values) with instant and smooth versions
const low = audio.bands.low; // 20-150 Hz (bass/kick range, instant)
const lowSmoothed = audio.bands.lowSmoothed; // Smooth low frequency energy
const lowMid = audio.bands.lowMid; // 150-400 Hz
const mid = audio.bands.mid; // 400-2500 Hz (vocals, instruments)
const highMid = audio.bands.highMid; // 2500-8000 Hz (cymbals, hi-hats)
const high = audio.bands.high; // 8000-20000 Hz (air, brilliance)
// Automatic beat detection with BPM tracking
const beat = audio.beat;
// Fast energy curves (300ms decay - primary for most visuals)
const kickEnergy = beat.kick; // 0-1 kick drum energy
const snareEnergy = beat.snare; // 0-1 snare energy
const hatEnergy = beat.hat; // 0-1 hi-hat energy
const anyEnergy = beat.any; // 0-1 any beat type energy
// Smoothed energy curves (500ms decay - for slower animations)
const kickSmoothed = beat.kickSmoothed;
const snareSmoothed = beat.snareSmoothed;
const anySmoothed = beat.anySmoothed;
// Instant triggers (for advanced use cases)
if (beat.triggers.kick) {
// Kick drum detected this frame
spawnParticle('kick');
}
// BPM and tempo information
const bpm = beat.bpm; // Detected BPM (60-180)
const phase = beat.phase; // Beat phase 0-1 within current beat
const bar = beat.bar; // Current bar number (0-3 for 4/4)
const confidence = beat.confidence; // Detection confidence 0-1
const isLocked = beat.isLocked; // True when beat is locked
// Spectral features for advanced audio-reactive effects
const brightness = audio.spectral.brightness; // 0-1 spectral centroid
const flatness = audio.spectral.flatness; // 0-1 spectral flatness (noisiness)
const flux = audio.spectral.flux; // 0-1 spectral flux (change)
// Raw frequency data (0-255 values)
const frequencyData = audio.getFrequencyData();
// Example 1: Smooth beat-reactive animation (primary pattern)
const scale = 1 + kickEnergy * 0.5; // Smooth pulsing with kick
const hue = lowSmoothed * 180; // Smooth color based on low frequencies
ctx.save();
ctx.translate(viji.width / 2, viji.height / 2);
ctx.scale(scale, scale);
ctx.fillStyle = `hsl(${hue}, 70%, 60%)`;
ctx.fillRect(-25, -25, 50, 50);
ctx.restore();
// Example 2: Phase-synced rotation
const rotation = phase * Math.PI * 2; // Rotate with beat phase
ctx.rotate(rotation);
}
}
```

#### Audio-Reactive Scene Example
```typescript
// Define audio-reactive parameters
const audioReactive = viji.toggle(true, {
label: 'Audio Reactive',
description: 'Make shapes react to audio',
group: 'audio',
category: 'audio'
});
const volumeSensitivity = viji.slider(1.0, {
min: 0.1,
max: 5.0,
step: 0.1,
label: 'Volume Sensitivity',
description: 'How sensitive shapes are to volume',
group: 'audio',
category: 'audio'
});
const bassReactivity = viji.slider(1.0, {
min: 0,
max: 3.0,
step: 0.1,
label: 'Bass Reactivity',
description: 'How much shapes react to bass',
group: 'audio',
category: 'audio'
});
function render(viji) {
const ctx = viji.useContext('2d');
const audio = viji.audio;
// Clear canvas
ctx.fillStyle = '#2c3e50';
ctx.fillRect(0, 0, viji.width, viji.height);
if (audioReactive.value && audio.isConnected) {
// Audio-reactive animation
const volume = audio.volume.current * volumeSensitivity.value;
const bass = audio.bands.low * bassReactivity.value;
// Scale based on volume
const scale = 1 + volume;
// Color based on bass
const hue = 200 + (bass * 160); // Blue to purple range
// Position based on frequency distribution
const x = viji.width * (audio.bands.mid + audio.bands.high) / 2;
const y = viji.height * (1 - audio.bands.low);
ctx.save();
ctx.translate(x, y);
ctx.scale(scale, scale);
ctx.fillStyle = `hsl(${hue}, 80%, 60%)`;
ctx.beginPath();
ctx.arc(0, 0, 30, 0, Math.PI * 2);
ctx.fill();
ctx.restore();
}
}
```

The video system provides real-time video frame analysis with frame data access for custom processing.
#### Video API Overview
```typescript
function render(viji) {
const ctx = viji.useContext('2d');
const video = viji.video;
if (video.isConnected) {
// Video properties
const frameWidth = video.frameWidth;
const frameHeight = video.frameHeight;
const frameRate = video.frameRate;
// Current video frame (OffscreenCanvas)
if (video.currentFrame) {
// Draw video frame as background
ctx.globalAlpha = 0.3;
ctx.drawImage(video.currentFrame, 0, 0, viji.width, viji.height);
ctx.globalAlpha = 1.0;
}
// Frame data for custom analysis
const frameData = video.getFrameData();
// Example: Custom video analysis
if (frameData) {
// Access raw pixel data for custom processing
const imageData = frameData.data;
const width = frameData.width;
const height = frameData.height;
// Example: Calculate average brightness
let totalBrightness = 0;
for (let i = 0; i < imageData.length; i += 4) {
const r = imageData[i];
const g = imageData[i + 1];
const b = imageData[i + 2];
totalBrightness += (r + g + b) / 3;
}
const averageBrightness = totalBrightness / (imageData.length / 4);
// Use brightness for effects
const brightness = averageBrightness / 255; // Normalize to 0-1
// Create brightness-reactive animation
ctx.fillStyle = `rgba(255, 255, 255, ${brightness * 0.5})`;
ctx.fillRect(0, 0, viji.width, viji.height);
}
}
}
```

#### Video-Reactive Scene Example
```typescript
// Define video-reactive parameters
const videoReactive = viji.toggle(true, {
label: 'Video Reactive',
description: 'Make shapes react to video',
group: 'video',
category: 'video'
});
const motionSensitivity = viji.slider(1.0, {
min: 0.1,
max: 3.0,
step: 0.1,
label: 'Motion Sensitivity',
description: 'How sensitive shapes are to video changes',
group: 'video',
category: 'video'
});
function render(viji) {
const ctx = viji.useContext('2d');
const video = viji.video;
if (videoReactive.value && video.isConnected) {
// Video-reactive animation using frame data
const frameData = video.getFrameData();
if (frameData) {
// Simple motion detection (compare with previous frame)
// This is a basic example - you can implement more sophisticated analysis
const imageData = frameData.data;
let motionEnergy = 0;
// Calculate motion energy (simplified)
for (let i = 0; i < imageData.length; i += 4) {
const brightness = (imageData[i] + imageData[i + 1] + imageData[i + 2]) / 3;
motionEnergy += brightness;
}
const normalizedMotion = (motionEnergy / (imageData.length / 4)) / 255;
const scale = 1 + (normalizedMotion * motionSensitivity.value);
// Create motion-reactive shapes
ctx.save();
ctx.translate(viji.width / 2, viji.height / 2);
ctx.scale(scale, scale);
ctx.fillStyle = `hsl(${normalizedMotion * 360}, 70%, 60%)`;
ctx.beginPath();
ctx.arc(0, 0, 30, 0, Math.PI * 2);
ctx.fill();
ctx.restore();
}
}
}
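// --- Sketch: the loop above reacts to overall brightness rather than to
// change between frames. A closer approximation of motion is frame
// differencing: keep the previous frame's pixels and sum the per-pixel
// difference. Assumes getFrameData() returns an ImageData-like object, as above.
let previousPixels = null; // persists across frames (defined outside render)

function renderWithFrameDiff(viji) {
const ctx = viji.useContext('2d');
const frameData = viji.video.isConnected ? viji.video.getFrameData() : null;
let motion = 0;
if (frameData) {
const pixels = frameData.data;
if (previousPixels && previousPixels.length === pixels.length) {
let diff = 0;
// Sample every 16th pixel (stride of 64 bytes) to keep the loop cheap
for (let i = 0; i < pixels.length; i += 64) {
diff += Math.abs(pixels[i] - previousPixels[i]);
}
motion = Math.min(1, diff / ((pixels.length / 64) * 255)); // normalize to 0-1
}
previousPixels = pixels.slice(); // copy, since the buffer may be reused
}
ctx.fillStyle = '#2c3e50';
ctx.fillRect(0, 0, viji.width, viji.height);
ctx.fillStyle = `hsl(${motion * 360}, 70%, 60%)`;
ctx.beginPath();
ctx.arc(viji.width / 2, viji.height / 2, 30 + motion * 100, 0, Math.PI * 2);
ctx.fill();
}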
```

The interaction system provides comprehensive support for mouse, keyboard, and touch input.
#### Mouse Interaction
```typescript
function render(viji) {
const ctx = viji.useContext('2d');
const mouse = viji.mouse;
// Mouse position (canvas coordinates)
if (mouse.isInCanvas) {
const x = mouse.x; // Current X coordinate
const y = mouse.y; // Current Y coordinate
// Mouse movement
const deltaX = mouse.deltaX; // X movement since last frame
const deltaY = mouse.deltaY; // Y movement since last frame
const velocity = mouse.velocity; // Smoothed velocity { x, y }
// Mouse buttons
const isPressed = mouse.isPressed; // Any button currently pressed
const leftButton = mouse.leftButton; // Left button state
const rightButton = mouse.rightButton; // Right button state
const middleButton = mouse.middleButton; // Middle button state
// Frame-based events
const wasPressed = mouse.wasPressed; // Button was pressed this frame
const wasReleased = mouse.wasReleased; // Button was released this frame
const wasMoved = mouse.wasMoved; // Mouse moved this frame
// Scroll wheel
const wheelDelta = mouse.wheelDelta; // Combined wheel delta
const wheelX = mouse.wheelX; // Horizontal wheel delta
const wheelY = mouse.wheelY; // Vertical wheel delta
// Example: Mouse-reactive animation
ctx.fillStyle = leftButton ? 'red' : 'blue';
ctx.beginPath();
ctx.arc(x, y, 20 + Math.abs(velocity.x + velocity.y), 0, Math.PI * 2);
ctx.fill();
}
}
```

#### Keyboard Interaction
```typescript
function render(viji) {
const ctx = viji.useContext('2d');
const keyboard = viji.keyboard;
// Key state queries
if (keyboard.isPressed('w')) {
// W key is currently pressed
console.log('W key is held down');
}
if (keyboard.wasPressed('space')) {
// Space was pressed this frame
console.log('Space was pressed!');
}
if (keyboard.wasReleased('escape')) {
// Escape was released this frame
console.log('Escape was released!');
}
// Active key tracking
const activeKeys = keyboard.activeKeys; // Set of currently pressed keys
const pressedThisFrame = keyboard.pressedThisFrame; // Set of keys pressed this frame
const releasedThisFrame = keyboard.releasedThisFrame; // Set of keys released this frame
// Modifier keys
const shift = keyboard.shift; // Shift key is held
const ctrl = keyboard.ctrl; // Ctrl key is held
const alt = keyboard.alt; // Alt key is held
const meta = keyboard.meta; // Meta/Cmd key is held
// Recent activity
const lastKeyPressed = keyboard.lastKeyPressed; // Last key that was pressed
const lastKeyReleased = keyboard.lastKeyReleased; // Last key that was released
// Example: Keyboard-controlled movement
let moveX = 0;
let moveY = 0;
if (keyboard.isPressed('w') || keyboard.isPressed('W')) moveY -= 5;
if (keyboard.isPressed('s') || keyboard.isPressed('S')) moveY += 5;
if (keyboard.isPressed('a') || keyboard.isPressed('A')) moveX -= 5;
if (keyboard.isPressed('d') || keyboard.isPressed('D')) moveX += 5;
// Apply movement
ctx.save();
ctx.translate(moveX, moveY);
ctx.fillStyle = 'green';
ctx.fillRect(0, 0, 50, 50);
ctx.restore();
}
```

#### Touch Interaction
```typescript
function render(viji) {
const ctx = viji.useContext('2d');
const touches = viji.touches;
// Touch points
for (const touch of touches.points) {
const x = touch.x; // Touch X coordinate
const y = touch.y; // Touch Y coordinate
const pressure = touch.pressure; // Pressure (0-1)
const radius = touch.radius; // Touch radius
const id = touch.id; // Unique touch ID
// Movement
const deltaX = touch.deltaX; // X movement since last frame
const deltaY = touch.deltaY; // Y movement since last frame
const velocity = touch.velocity; // Movement velocity { x, y }
// Lifecycle
const isNew = touch.isNew; // Touch started this frame
const isActive = touch.isActive; // Touch is currently active
const isEnding = touch.isEnding; // Touch ending this frame
// Draw touch point
ctx.fillStyle = isNew ? 'red' : isEnding ? 'yellow' : 'blue';
ctx.beginPath();
ctx.arc(x, y, radius * 2, 0, Math.PI * 2);
ctx.fill();
}
// Touch events
const started = touches.started; // Touches that started this frame
const moved = touches.moved; // Touches that moved this frame
const ended = touches.ended; // Touches that ended this frame
// Primary touch (first touch point)
const primary = touches.primary; // Primary touch point or null
// Touch gestures
const gestures = touches.gestures;
if (gestures.isPinching) {
const scale = gestures.pinchScale; // Current pinch scale
const delta = gestures.pinchDelta; // Scale change since last frame
// React to pinch gesture
ctx.save();
ctx.scale(scale, scale);
ctx.fillStyle = 'purple';
ctx.fillRect(0, 0, 100, 100);
ctx.restore();
}
if (gestures.isRotating) {
const angle = gestures.rotationAngle; // Current rotation angle
const delta = gestures.rotationDelta; // Rotation change since last frame
// React to rotation gesture
ctx.save();
ctx.rotate(angle);
ctx.fillStyle = 'orange';
ctx.fillRect(-25, -25, 50, 50);
ctx.restore();
}
if (gestures.isPanning) {
const panDelta = gestures.panDelta; // Pan movement { x, y }
// React to pan gesture
ctx.save();
ctx.translate(panDelta.x, panDelta.y);
ctx.fillStyle = 'cyan';
ctx.fillRect(0, 0, 50, 50);
ctx.restore();
}
if (gestures.isTapping) {
const tapCount = gestures.tapCount; // Number of taps
const tapPosition = gestures.tapPosition; // { x, y } tap position
// React to tap gesture
if (tapPosition) {
ctx.fillStyle = 'lime';
ctx.beginPath();
ctx.arc(tapPosition.x, tapPosition.y, 30, 0, Math.PI * 2);
ctx.fill();
}
}
}
```

#### Device Sensors (Motion, Orientation, Geolocation)
```typescript
function render(viji) {
const ctx = viji.useContext('2d');
// Internal device (current device running the scene)
const device = viji.device;
// Check if device motion is available
if (device.motion?.acceleration) {
const accelX = device.motion.acceleration.x; // m/s² (without gravity)
// Example: Shake detection
const magnitude = Math.sqrt(
accelX ** 2 + device.motion.acceleration.y ** 2 + device.motion.acceleration.z ** 2
);
if (magnitude > 15) {
console.log('Device shaken!');
}
}
// Check if device orientation is available
if (device.orientation.gamma !== null) {
const tiltLR = device.orientation.gamma; // -90 to 90 (left/right tilt)
const tiltFB = device.orientation.beta; // -180 to 180 (front/back tilt)
// Example: Tilt-based control
const moveX = (tiltLR / 90) * 5;
ctx.save();
ctx.translate(moveX, 0);
ctx.fillStyle = 'red';
ctx.fillRect(viji.width/2 - 25, viji.height/2 - 25, 50, 50);
ctx.restore();
}
// Check if geolocation is available
if (device.geolocation.latitude !== null) {
const lat = device.geolocation.latitude;
const lon = device.geolocation.longitude;
ctx.fillStyle = 'white';
ctx.fillText(`Location: ${lat.toFixed(4)}, ${lon.toFixed(4)}`, 10, 30);
}
// External connected devices (WebRTC/Sockets)
viji.devices.forEach((device, index) => {
if (device.orientation.gamma !== null) {
// Multi-device control example
const x = (device.orientation.gamma / 90 + 1) * viji.width / 2;
ctx.fillStyle = `hsl(${index * 60}, 70%, 60%)`;
ctx.beginPath();
ctx.arc(x, 100 + index * 60, 25, 0, Math.PI * 2);
ctx.fill();
}
});
}
```

Device Sensor Features:
- Motion Sensors: Accelerometer and gyroscope data for shake detection and motion tracking
- Orientation Sensors: Device tilt for tilt-based controls and compass heading
- Geolocation: GPS coordinates for location-based content
- Multi-Device Support: Connect multiple external devices via WebRTC for installations
- Graceful Degradation: Automatic handling of missing sensors or permissions
See docs/20-device-sensor-system.md for complete device sensor documentation including WebRTC integration and multi-device installations.
Viji Core supports P5.js as an optional rendering library. P5.js provides familiar creative coding APIs while maintaining all Viji features including audio reactivity, video processing, and parameter management.
Add a single comment at the top of your scene code:
```javascript
// @renderer p5
function setup(viji, p5) {
p5.colorMode(p5.HSB);
}
function render(viji, p5) {
p5.background(220);
p5.fill(255, 0, 0);
p5.ellipse(viji.mouse.x, viji.mouse.y, 50, 50);
}
```
Supported:

- ✅ All P5.js drawing functions (shapes, colors, transforms, typography)
- ✅ P5.js math utilities (`noise()`, `random()`, `map()`, `lerp()`)
- ✅ P5.js vectors (`p5.Vector` class)
- ✅ WebGL mode (`p5.WEBGL`)
- ✅ Full Viji integration (audio, video, parameters, interaction)

Not supported:

- ❌ p5.dom (use Viji parameters instead)
- ❌ p5.sound (use `viji.audio` instead)
- ❌ P5.js events (use `viji.mouse`/`keyboard`/`touches` instead)
- ❌ Direct image loading (use Viji image parameters instead)
```javascript
// @renderer p5
const audioReactive = viji.toggle(true, {
label: 'Audio Reactive',
category: 'audio'
});
const bassReactivity = viji.slider(1.0, {
min: 0,
max: 3.0,
label: 'Bass Reactivity',
category: 'audio'
});
function render(viji, p5) {
p5.background(0);
if (audioReactive.value && viji.audio.isConnected) {
const bass = viji.audio.bands.low;
const volume = viji.audio.volume.current;
const hue = bass * 360 * bassReactivity.value;
const size = 100 + volume * 200;
p5.colorMode(p5.HSB);
p5.fill(hue, 80, 100);
p5.ellipse(p5.width / 2, p5.height / 2, size, size);
}
}
```
```javascript
// @renderer p5
const bgImage = viji.image(null, {
label: 'Background Image',
group: 'media',
accept: 'image/*'
});
function render(viji, p5) {
p5.background(220);
if (bgImage.value) {
p5.image(bgImage.value, 0, 0, p5.width, p5.height);
}
// Draw on top of image
p5.fill(255, 0, 0, 128);
p5.ellipse(viji.mouse.x, viji.mouse.y, 50, 50);
}
```
```typescript
const core = new VijiCore({
hostContainer: container,
sceneCode: sceneCode
});
await core.initialize();
// Check which renderer is being used (from stats, like FPS/resolution)
const stats = core.getStats();
const rendererType = stats.rendererType; // 'native' | 'p5'
```
```typescript
core.onParametersDefined((groups) => {
groups.forEach(group => {
Object.entries(group.parameters).forEach(([name, def]) => {
if (def.type === 'image') {
// Create file picker
const input = createFileInput(name, def);
input.addEventListener('change', async (e) => {
const file = e.target.files[0];
if (file) {
await core.setParameter(name, file); // Unified API handles images automatically
}
});
}
});
});
});
```

See docs/16-p5js-integration.md for comprehensive documentation including migration guides, troubleshooting, and advanced examples.
The core implements a comprehensive security model to ensure safe execution of artist code:
- IFrame Isolation: Complete separation from host environment with sandboxed execution
- WebWorker Sandboxing: Artist code runs with controlled API access only
- Blob URL Creation: Secure worker and IFrame creation from blob URLs (see the sketch after this list)
- Resource Protection: Memory leaks and errors cannot affect main application
- Controlled Communication: Optimized message passing with validation
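For illustration, here is the general blob-URL pattern the list above refers to. This is a conceptual sketch of the standard browser technique, not the library's actual internals:

```typescript
// Worker source is assembled as a string and loaded from an object URL,
// so no network-served script is needed and the host page never
// evaluates artist code directly.
const workerSource = `
  self.onmessage = (event) => {
    // evaluate artist code against a controlled API surface only
  };
`;
const blob = new Blob([workerSource], { type: 'application/javascript' });
const blobUrl = URL.createObjectURL(blob);
const worker = new Worker(blobUrl);
URL.revokeObjectURL(blobUrl); // safe once the worker has been created
```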
The core provides extensive performance optimization capabilities:
- Configurable Frame Rates: Full (60fps) or half (30fps) modes for performance tuning
- Resolution Scaling: Fractional (0.1-1.0) or explicit canvas dimensions (see the snippet after this list)
- Adaptive Optimization: Automatic performance tuning based on hardware capabilities
- Efficient Communication: Optimized message passing between components
- Memory Management: Automatic resource cleanup and memory leak prevention
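Combining these knobs, a host can trade fidelity for performance at startup and at runtime. A sketch, assuming a `container` element and `sceneCode` string; `resolution` accepts either form shown in the multi-instance example below:

```typescript
// Startup: half resolution, 30fps for a low-power device
const core = new VijiCore({
  hostContainer: container,
  sceneCode,
  resolution: 0.5, // or explicit: { width: 1280, height: 720 }
  frameRateMode: 'half'
});
await core.initialize();

// Runtime: restore quality when there is headroom
await core.setResolution(1.0);
await core.setFrameRate('full');
```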
The core supports multiple concurrent instances for complex applications:
```typescript
// Main scene with full features
const mainCore = new VijiCore({
hostContainer: document.getElementById('main-scene'),
sceneCode: sceneCode,
resolution: { width: 1920, height: 1080 },
frameRateMode: 'full',
allowUserInteraction: true,
audioStream: sharedAudioStream,
videoStream: sharedVideoStream
});
// Preview instance with reduced features
const previewCore = new VijiCore({
hostContainer: document.getElementById('preview'),
sceneCode: sceneCode,
resolution: 0.25, // 25% resolution for performance
frameRateMode: 'half',
noInputs: true,
allowUserInteraction: false,
audioStream: sharedAudioStream // Shared efficiently across instances
});
// Thumbnail instance for gallery view
const thumbnailCore = new VijiCore({
hostContainer: document.getElementById('thumbnail'),
sceneCode: sceneCode,
resolution: 0.1, // 10% resolution
frameRateMode: 'half',
noInputs: true,
allowUserInteraction: false
});
// To change scenes, create a new instance and destroy the old one
const newCore = new VijiCore({
hostContainer: document.getElementById('scene-host'),
sceneCode: newSceneCode,
audioStream: sharedAudioStream,
videoStream: sharedVideoStream
});
// Automatic comprehensive cleanup when destroyed
await oldCore.destroy();
```
The core provides comprehensive error handling with detailed error information:
```typescript
import { VijiCoreError } from '@viji-dev/core';
try {
const core = new VijiCore(config);
await core.initialize();
} catch (error) {
if (error instanceof VijiCoreError) {
console.error(`Core error [${error.code}]:`, error.message);
console.error('Error context:', error.context);
// Handle specific error types
switch (error.code) {
case 'INVALID_CONFIG':
console.error('Configuration is invalid:', error.context);
break;
case 'INITIALIZATION_ERROR':
console.error('Failed to initialize core:', error.context);
break;
case 'CORE_NOT_READY':
console.error('Core is not ready for operations');
break;
case 'INSTANCE_DESTROYED':
console.error('Core instance has been destroyed');
break;
case 'PARAMETERS_NOT_INITIALIZED':
console.error('Parameter system not yet initialized');
break;
case 'UNKNOWN_PARAMETER':
console.error('Parameter not found:', error.context);
break;
}
} else {
console.error('Unexpected error:', error);
}
}
```

Common Error Codes:
- `INVALID_CONFIG` - Configuration validation failed
- `INITIALIZATION_ERROR` - Failed to initialize core components
- `CORE_NOT_READY` - Operation attempted before ready
- `INSTANCE_DESTROYED` - Operation attempted after destroy
- `PARAMETERS_NOT_INITIALIZED` - Parameters not yet available
- `UNKNOWN_PARAMETER` - Parameter not found
- `CONCURRENT_INITIALIZATION` - Multiple initialization attempts
- `MANAGER_NOT_READY` - Internal component not available
## Development

```bash
# Install dependencies
npm install
```

## Examples

The package includes comprehensive examples in the `/example` directory:

- Basic Scene Creation: Simple animated shapes with parameters
- Audio-Reactive Scenes: Scenes that respond to audio input
- Video-Reactive Scenes: Scenes that respond to video analysis
- Interactive Scenes: Mouse, keyboard, and touch interaction
- Parameter System: Complete parameter definition and UI generation
- Multi-Instance: Multiple concurrent scene instances
Copyright (c) 2025 Artem Verkhovskiy and Dmitry Manoilenko.
All rights reserved - see the LICENSE file for details.
By contributing, you agree to the CLA.
Please also confirm your agreement by filling out this short form.
---
Viji Core - Universal execution engine for creative scenes across all contexts.