# react-native-camera-vision-pixel-colors

Vision Camera Frame Processor plugin for real-time pixel color analysis - extract dominant colors, brightness, ROI analysis, and motion detection.

```bash
npm install react-native-camera-vision-pixel-colors
```

High-performance Vision Camera Frame Processor for React Native (Expo compatible) that analyzes pixel colors in real time.
This plugin extracts:
- Up to 10 most frequent colors (RGB, optional HSV; count configurable)
- Up to 10 brightest colors (RGB, optional HSV; count configurable)
- Total number of unique colors
- ROI analysis (configurable region)
- Motion detection (frame diff)
- HSV color space conversion (optional)
- Pixel threshold filtering (filter out colors below % threshold)
It is implemented using Nitro Modules and runs synchronously on the native thread for use as a Vision Camera frame processor, while also exposing an async Nitro API for offline image analysis.
---
> This plugin is built to be used as a Vision Camera frame processor. It does NOT process images from gallery or URLs via the frame-processor path — use the async Nitro API for that.
---
## Installation

```bash
npx expo install react-native-vision-camera
npm install react-native-camera-vision-pixel-colors
```

Add to `app.json` (or `app.config.js`) under `plugins`:

```json
{
  "expo": {
    "plugins": [
      "react-native-vision-camera",
      "react-native-camera-vision-pixel-colors"
    ]
  }
}
```

Then:

```bash
npx expo prebuild
eas build -p all
```

---
## Usage

### Basic frame processor

```ts
import { useFrameProcessor } from 'react-native-vision-camera';
import { analyzePixelColors, type PixelColorsResult } from 'react-native-camera-vision-pixel-colors';

const frameProcessor = useFrameProcessor((frame) => {
  'worklet';
  const result: PixelColorsResult = analyzePixelColors(frame);
  // result => { uniqueColorCount, topColors: [{ r, g, b }], brightestColors: [{ r, g, b }] }
  console.log(result);
}, []);
```

Attach it to the `<Camera />` component as the `frameProcessor` prop, as in the sketch below.
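A minimal end-to-end sketch, assuming Vision Camera v3+ APIs such as `useCameraDevice` (adapt to your setup):

```tsx
import { StyleSheet } from 'react-native';
import { Camera, useCameraDevice, useFrameProcessor } from 'react-native-vision-camera';
import { analyzePixelColors } from 'react-native-camera-vision-pixel-colors';

export function ColorAnalyzingCamera() {
  const device = useCameraDevice('back');

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    const { topColors } = analyzePixelColors(frame);
    console.log('Dominant color:', topColors[0]);
  }, []);

  if (device == null) return null;

  return (
    <Camera
      style={StyleSheet.absoluteFill}
      device={device}
      isActive={true}
      frameProcessor={frameProcessor}
    />
  );
}
```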
### Analysis options

Pass optional analysis options to enable additional features:
```ts
import { useFrameProcessor } from 'react-native-vision-camera';
import { analyzePixelColors, type AnalysisOptions } from 'react-native-camera-vision-pixel-colors';

const frameProcessor = useFrameProcessor((frame) => {
  'worklet';
  const options: AnalysisOptions = {
    // Analyze only the center 20% of the frame
    roi: { x: 0.4, y: 0.4, width: 0.2, height: 0.2 },
    // Enable motion detection
    enableMotionDetection: true,
    motionThreshold: 0.1, // 0-1, default: 0.1
    // Configure color counts (1-10, default: 3)
    maxTopColors: 5,
    maxBrightestColors: 5,
    // Enable HSV color space conversion
    enableHsvAnalysis: true,
    // Filter out colors below 0.2% of total pixels
    minPixelThreshold: 0.002,
  };

  const result = analyzePixelColors(frame, options);

  if (result.motion?.hasMotion) {
    console.log('Motion detected!', result.motion.score);
  }

  // Access HSV values (when enableHsvAnalysis = true)
  const topColor = result.topColors[0];
  if (topColor?.hsv) {
    console.log(`Hue: ${topColor.hsv.h}, Sat: ${topColor.hsv.s}, Val: ${topColor.hsv.v}`);
  }

  // Access pixel percentage (when minPixelThreshold is set)
  if (topColor?.pixelPercentage) {
    console.log(`This color represents ${topColor.pixelPercentage * 100}% of pixels`);
  }
}, []);
```
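For reference, the returned HSV values match the standard RGB-to-HSV conversion with the documented ranges (h: 0-360, s and v: 0-100). A TypeScript sketch of the equivalent math, not the plugin's actual native code:

```ts
// Standard RGB (0-255) to HSV conversion with h in 0-360 and s, v in 0-100.
function rgbToHsv(r: number, g: number, b: number): { h: number; s: number; v: number } {
  const rn = r / 255, gn = g / 255, bn = b / 255;
  const max = Math.max(rn, gn, bn);
  const min = Math.min(rn, gn, bn);
  const d = max - min;

  // Hue: which channel dominates determines the 60-degree sector
  let h = 0;
  if (d !== 0) {
    if (max === rn) h = 60 * (((gn - bn) / d) % 6);
    else if (max === gn) h = 60 * ((bn - rn) / d + 2);
    else h = 60 * ((rn - gn) / d + 4);
  }
  if (h < 0) h += 360;

  const s = max === 0 ? 0 : (d / max) * 100;
  const v = max * 100;
  return { h, s, v };
}

// rgbToHsv(255, 0, 0) => { h: 0, s: 100, v: 100 }
```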
### Async image analysis (Nitro API)

```ts
import { CameraVisionPixelColors, type ImageData } from 'react-native-camera-vision-pixel-colors';

const imageData: ImageData = { width, height, data: arrayBuffer }; // data: ArrayBuffer (RGBA)
const result = await CameraVisionPixelColors.analyzeImageAsync(imageData);
```
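A self-contained smoke test that synthesizes RGBA pixels instead of decoding a real image (illustrative only):

```ts
import { CameraVisionPixelColors, type ImageData } from 'react-native-camera-vision-pixel-colors';

async function smokeTest() {
  // Build a synthetic 2x2 solid-red RGBA image.
  const width = 2;
  const height = 2;
  const pixels = new Uint8Array(width * height * 4);
  for (let i = 0; i < pixels.length; i += 4) {
    pixels[i] = 255;     // R
    pixels[i + 1] = 0;   // G
    pixels[i + 2] = 0;   // B
    pixels[i + 3] = 255; // A
  }

  const image: ImageData = { width, height, data: pixels.buffer };
  const result = await CameraVisionPixelColors.analyzeImageAsync(image);
  console.log(result.topColors[0]); // expected: a single red color, { r: 255, g: 0, b: 0, ... }
}
```

---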
## Output format

All types are exported from the library:

```ts
import {
  type RGBColor,
  type HSVColor,
  type ColorInfo,
  type PixelColorsResult,
  type ImageData,
  type AnalysisOptions,
  type ROIConfig,
  type MotionResult,
} from 'react-native-camera-vision-pixel-colors';
```

```ts
type RGBColor = { r: number; g: number; b: number };

type HSVColor = {
  h: number; // 0-360 (hue)
  s: number; // 0-100 (saturation)
  v: number; // 0-100 (value/brightness)
};

type ColorInfo = {
  r: number;
  g: number;
  b: number;
  hsv?: HSVColor; // present when enableHsvAnalysis = true
  pixelPercentage?: number; // 0-1, present when minPixelThreshold is set
};

type ROIConfig = {
  x: number; // 0-1 normalized
  y: number; // 0-1 normalized
  width: number; // 0-1 normalized
  height: number; // 0-1 normalized
};

type AnalysisOptions = {
  enableMotionDetection?: boolean; // default: false
  motionThreshold?: number; // default: 0.1
  roi?: ROIConfig; // if provided, analyze only this region
  maxTopColors?: number; // default: 3, range: 1-10
  maxBrightestColors?: number; // default: 3, range: 1-10
  enableHsvAnalysis?: boolean; // default: false
  minPixelThreshold?: number; // 0-1, e.g., 0.002 = 0.2%
};

type MotionResult = {
  score: number; // 0-1
  hasMotion: boolean; // score > threshold
};

type PixelColorsResult = {
  uniqueColorCount: number;
  topColors: ColorInfo[]; // extends RGBColor with optional hsv/pixelPercentage
  brightestColors: ColorInfo[]; // extends RGBColor with optional hsv/pixelPercentage
  motion?: MotionResult; // always present if enableMotionDetection = true
  roiApplied?: boolean; // true if an ROI config was provided
  totalPixelsAnalyzed?: number; // present when HSV or threshold analysis is enabled
};

type ImageData = {
  width: number;
  height: number;
  data: ArrayBuffer; // RGBA pixel data
};
```
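For display purposes, a `ColorInfo` maps naturally onto a CSS hex string. A hypothetical helper (`toHex` is not part of the library):

```ts
import type { ColorInfo } from 'react-native-camera-vision-pixel-colors';

// Format a ColorInfo as a CSS hex string, e.g. for a swatch view.
export function toHex(color: ColorInfo): string {
  const hex = (n: number) => n.toString(16).padStart(2, '0');
  return `#${hex(color.r)}${hex(color.g)}${hex(color.b)}`;
}

// toHex({ r: 255, g: 128, b: 0 }) === '#ff8000'
```

---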
## Architecture summary
- Frame Processor path: synchronous, returns the latest cached result (0–1 frame latency).
- Async Nitro API: full GPU/CPU pipeline, returns an up-to-date result (Promise-based).
- Shared native engine (iOS/Android) exposes `analyzeAsync(...)` and `analyzeSync()`, which the frame-processor path uses to read cached results.

### Implementation notes
- Motion detection: uses grayscale frame comparison with a configurable threshold (see the sketch below)
- ROI: crops before analysis, improving performance on smaller regions
- First frame motion: returns `{ score: 0, hasMotion: false }` (not `null`)
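For intuition, a plausible sketch of grayscale frame-diff scoring; this is an assumption about the approach, and the actual native implementation may differ:

```ts
// Hypothetical reference implementation of motion scoring between two RGBA frames.
// The per-pixel change threshold (25) is an illustrative assumption.
function motionScore(prev: Uint8Array, curr: Uint8Array, width: number, height: number): number {
  const totalPixels = width * height;
  let changed = 0;
  for (let i = 0; i < totalPixels; i++) {
    const p = i * 4;
    // Rec. 601 luma approximation for the grayscale comparison
    const lumaPrev = 0.299 * prev[p] + 0.587 * prev[p + 1] + 0.114 * prev[p + 2];
    const lumaCurr = 0.299 * curr[p] + 0.587 * curr[p + 1] + 0.114 * curr[p + 2];
    if (Math.abs(lumaCurr - lumaPrev) > 25) changed++;
  }
  return changed / totalPixels; // score: 0-1; hasMotion = score > motionThreshold
}
```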
---