React Native MLKit wrapper
Lightweight React Native wrapper for Google MLKit Face Detection.
- 🔍 Face Detection - Detect faces in images with comprehensive data
- 📱 Cross Platform - iOS and Android support
- ⚡ Expo Module - Built using Expo Modules API with new architecture support
- 🎯 MLKit Powered - Uses Google's on-device ML models
- 🌐 Flexible Input - Support for both local files and remote URLs
```bash
npm install react-native-mlkit-light
# or
yarn add react-native-mlkit-light
# or
bun add react-native-mlkit-light
```

For iOS, run `npx pod-install` after installation.
By default, the library is enabled on all platforms. If you need to conditionally disable iOS support (e.g., for arm64 simulators, where MLKit is not supported; note that iOS 26 simulators run exclusively on arm64), add the plugin to your `app.config.js`:
```javascript
export default {
  // ... other config
  plugins: [
    [
      "react-native-mlkit-light",
      {
        // Set to false to disable iOS MLKit and use the stub implementation
        // (useful for arm64 simulators).
        enableIOS: false
      }
    ]
  ]
};
```
Plugin Options:
- `enableIOS` (boolean, default: `true`) - Whether to enable MLKit Face Detection on iOS. When disabled, the library uses a stub implementation that throws an error explaining that MLKit is not available. This is useful for arm64 simulators, where MLKit is not supported. A runtime guard for the disabled case is sketched below.
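When iOS support is disabled, calls into the module reject at runtime. Here is a minimal sketch of guarding against that case; the helper name and the `{ faces: [] }` fallback shape are assumptions for illustration, not part of the library's API:

```typescript
import { detectFaces, DetectionResult } from 'react-native-mlkit-light';

// Hypothetical helper: returns an empty result instead of throwing when the
// stub implementation (enableIOS: false) rejects at runtime.
export async function detectFacesSafely(imageUri: string): Promise<DetectionResult> {
  try {
    return await detectFaces(imageUri);
  } catch (error) {
    console.warn('MLKit face detection unavailable on this platform:', error);
    // Assumed fallback shape; DetectionResult exposes a `faces` array (see usage below).
    return { faces: [] } as DetectionResult;
  }
}
```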
```typescript
import { detectFaces, FaceDetectionOptions, DetectionResult } from 'react-native-mlkit-light';

const handleFaceDetection = async () => {
  try {
    const options: FaceDetectionOptions = {
      performanceMode: 'accurate', // 'fast' | 'accurate'
      landmarkMode: 'all',         // 'none' | 'all'
      classificationMode: 'all',   // 'none' | 'all'
      minFaceSize: 0.1,            // minimum face size (0.0 - 1.0)
      trackingEnabled: false       // enable face tracking
    };

    const result: DetectionResult = await detectFaces(
      'https://example.com/image.jpg', // local file or URL
      options
    );

    console.log(`Found ${result.faces.length} faces`);

    result.faces.forEach((face, index) => {
      console.log(`Face ${index + 1}:`, {
        bounds: face.bounds,
        smiling: face.smilingProbability,
        leftEyeOpen: face.leftEyeOpenProbability,
        rightEyeOpen: face.rightEyeOpenProbability,
        landmarks: face.landmarks?.length || 0
      });
    });
  } catch (error) {
    console.error('Face detection failed:', error);
  }
};
```
```typescript
// Import specific helper functions
import { detectFacesFromUrl, detectFacesFromFile } from 'react-native-mlkit-light';
// Detect from URL
const result1 = await detectFacesFromUrl('https://example.com/photo.jpg');
// Detect from local file
const result2 = await detectFacesFromFile('file:///path/to/image.jpg');
// Import native module directly (legacy)
import ReactNativeMlkitLight from 'react-native-mlkit-light';
const result3 = await ReactNativeMlkitLight.detectFaces(imageUri);
```
Parameters:
- `imageUri` - Local file URI or HTTP/HTTPS URL to an image
- `options` - Face detection configuration (optional)
Returns:
- `DetectionResult` - Object containing the detected faces data (sketched below)
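The exact shape of `DetectionResult` is not reproduced in this section; based on the usage above (`result.faces`), a minimal sketch looks like this. Any fields beyond `faces` are not documented here:

```typescript
// Sketch inferred from the usage example above; the actual exported type
// may carry additional fields.
interface DetectionResult {
  faces: Face[]; // one entry per detected face
}
```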
```typescript
interface Face {
  bounds: FaceBounds;               // Face bounding box
  rollAngle?: number;               // Head rotation (Z-axis)
  pitchAngle?: number;              // Head rotation (X-axis)
  yawAngle?: number;                // Head rotation (Y-axis)
  leftEyeOpenProbability?: number;  // 0.0-1.0
  rightEyeOpenProbability?: number; // 0.0-1.0
  smilingProbability?: number;      // 0.0-1.0
  landmarks?: FaceLandmark[];       // Facial landmarks
}
```
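`FaceBounds` and `FaceLandmark` are referenced above but not shown in this section. A plausible sketch of their shapes, based on the data MLKit's face detector reports; the actual exported field names may differ:

```typescript
// Hypothetical shapes for the referenced types; verify against the
// library's exported definitions before relying on them.
interface FaceBounds {
  x: number;      // bounding-box origin X, in image pixels
  y: number;      // bounding-box origin Y, in image pixels
  width: number;  // bounding-box width, in image pixels
  height: number; // bounding-box height, in image pixels
}

interface FaceLandmark {
  type: string;   // e.g. 'leftEye', 'rightEye', 'noseBase'
  x: number;      // landmark X position, in image pixels
  y: number;      // landmark Y position, in image pixels
}
```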
```typescript
interface FaceDetectionOptions {
  performanceMode?: 'fast' | 'accurate'; // Detection speed vs accuracy
  landmarkMode?: 'none' | 'all';         // Detect facial landmarks
  classificationMode?: 'none' | 'all';   // Eye/smile classification
  minFaceSize?: number;                  // Min face size (0.0-1.0)
  trackingEnabled?: boolean;             // Enable face tracking IDs
}
```

Platform Support
- ✅ iOS - Requires iOS 15.5+
- ✅ Android - Requires API level 21+
- ❌ Web - Not supported
Dependencies
Uses Google MLKit Vision APIs:
- iOS: `GoogleMLKit/FaceDetection`
- Android: `com.google.mlkit:face-detection`

License: MIT