Offline TFLite cattle breed detection for React Native
Uses expo-image-manipulator and jpeg-js for reliable image preprocessing.

Installation

```bash
npm install rn-breed-detector react-native-fast-tflite expo-image-manipulator
```

Peer Dependencies
This package requires:
- react-native-fast-tflite (v0.2.0+) — TensorFlow Lite inference engine
- expo-image-manipulator (v11.0.0+ or v12.0.0+) — Image resizing and JPEG encoding
- jpeg-js (v0.4.4+) — JPEG decoding (automatically installed)
```bash
npm install react-native-fast-tflite expo-image-manipulator
```
---
Setup
Metro Configuration
TensorFlow Lite models use the .tflite extension. You must configure Metro bundler to recognize this asset type.
Create or edit metro.config.js in your project root:
```js
const { getDefaultConfig } = require("metro-config");

module.exports = (async () => {
  const config = await getDefaultConfig();

  // Add .tflite as a recognized asset extension
  config.resolver.assetExts.push("tflite");

  return config;
})();
```
Why is this needed?
Metro doesn't bundle .tflite files by default. This configuration tells Metro to treat .tflite files as assets that can be required/imported.
iOS Setup
```bash
cd ios
pod install
cd ..
```
Expo Compatibility
⚠️ Expo Managed Workflow is NOT supported because this package requires native modules (react-native-fast-tflite).
Supported Workflows:
- ✅ React Native CLI
- ✅ Expo Bare Workflow (after running expo prebuild)
---
Usage
Basic Example
```javascript
import { detectBreed } from "rn-breed-detector";
import { launchImageLibrary } from "react-native-image-picker";

async function pickAndDetect() {
  // Pick an image
  const result = await launchImageLibrary({
    mediaType: "photo",
    quality: 1,
  });

  if (result.assets && result.assets[0]) {
    const imageUri = result.assets[0].uri;

    // Detect breed
    const prediction = await detectBreed(imageUri);

    console.log("Breed:", prediction.breed);
    console.log("Confidence:", prediction.confidence);
    console.log("All scores:", prediction.scores);
  }
}
```
Full Component Example
```javascript
import React, { useState } from "react";
import { View, Button, Text, Image, ActivityIndicator } from "react-native";
import { detectBreed } from "rn-breed-detector";
import { launchImageLibrary } from "react-native-image-picker";

export default function BreedDetector() {
  const [imageUri, setImageUri] = useState(null);
  const [result, setResult] = useState(null);
  const [loading, setLoading] = useState(false);

  const handlePickImage = async () => {
    const pickerResult = await launchImageLibrary({
      mediaType: "photo",
      quality: 1,
    });

    if (pickerResult.assets && pickerResult.assets[0]) {
      const uri = pickerResult.assets[0].uri;
      setImageUri(uri);

      // Run detection
      setLoading(true);
      try {
        const prediction = await detectBreed(uri);
        setResult(prediction);
      } catch (error) {
        console.error("Detection failed:", error);
      } finally {
        setLoading(false);
      }
    }
  };

  return (
    <View style={{ flex: 1, alignItems: "center", padding: 16 }}>
      <Button title="Pick an Image" onPress={handlePickImage} />

      {imageUri && (
        <Image
          source={{ uri: imageUri }}
          style={{ width: 300, height: 300, marginTop: 20 }}
        />
      )}

      {loading && <ActivityIndicator size="large" style={{ marginTop: 20 }} />}

      {result && (
        <View style={{ marginTop: 20 }}>
          <Text>Breed: {result.breed}</Text>
          <Text>Confidence: {(result.confidence * 100).toFixed(2)}%</Text>
          <Text>All Predictions:</Text>
          {result.scores.slice(0, 3).map((item, idx) => (
            <Text key={idx}>
              {item.breed}: {(item.score * 100).toFixed(2)}%
            </Text>
          ))}
        </View>
      )}
    </View>
  );
}
```
---
API Reference
detectBreed(imageUri)
Performs breed detection on an image.
Parameters:
- imageUri (string, required) — Local file URI (e.g., file:///path/to/image.jpg)
Returns: Promise<DetectionResult>
```typescript
interface DetectionResult {
  breed: string;        // The predicted breed label
  confidence: number;   // Confidence score [0, 1] for the top prediction
  scores: Array<{       // All breeds sorted by score (descending)
    breed: string;
    score: number;      // Probability [0, 1]
  }>;
}
```
Example Output:
```json
{
  "breed": "Sahiwal",
  "confidence": 0.89,
  "scores": [
    { "breed": "Sahiwal", "score": 0.89 },
    { "breed": "Gir", "score": 0.06 },
    { "breed": "Tharparkar", "score": 0.03 },
    ...
  ]
}
```
unloadModel()
Unloads the TFLite model from memory (optional cleanup).
```javascript
import { unloadModel } from "rn-breed-detector";

// When you're done with predictions
unloadModel();
```
---
Supported Breeds
The default model recognizes these 10 cattle breeds:
1. Sahiwal
2. Gir
3. Tharparkar
4. Red Sindhi
5. Rathi
6. Ongole
7. Kankrej
8. Hariana
9. Deoni
10. Kangayam
You can customize the model and labels (see Customization below).
---
How It Works (Internals)
Detection Pipeline
The .tflite model file is bundled inside the NPM package at src/model/model.tflite.
When you call detectBreed(), the package:
1. Loads the model using react-native-fast-tflite (cached after first load)
2. Preprocesses the image — resizes to 300×300, converts to RGB Float32
3. Runs inference on-device
4. Applies softmax to convert logits to probabilities
5. Returns results with breed labels and confidence scores
Why the Metro Configuration Is Needed
Metro bundler doesn't recognize .tflite files by default. The metro.config.js modification allows:
- Bundling .tflite files as static assets
- Using require() to reference the model path
- Proper packaging for iOS/Android
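As an example, once the asset extension is registered, a bundled model can be referenced with require() and handed to react-native-fast-tflite's loadTensorflowModel. The path below is illustrative; the package resolves its own bundled model internally:

```javascript
import { loadTensorflowModel } from "react-native-fast-tflite";

// require() only resolves because metro.config.js lists "tflite" in assetExts.
// The path is illustrative; the package bundles its own model.
const modelAsset = require("./model/model.tflite");

loadTensorflowModel(modelAsset).then(() => {
  console.log("TFLite model loaded");
});
```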
Model Caching
The model is loaded once on the first call to detectBreed() and cached in memory. Subsequent calls reuse the loaded interpreter, providing fast inference times.
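A minimal sketch of what such a caching layer can look like, assuming react-native-fast-tflite's loadTensorflowModel and an illustrative asset path (not the package's exact code):

```javascript
import { loadTensorflowModel } from "react-native-fast-tflite";

let cachedModel = null;

// Load the interpreter on the first call and reuse it afterwards.
async function getModel() {
  if (!cachedModel) {
    cachedModel = await loadTensorflowModel(require("./model/model.tflite"));
  }
  return cachedModel;
}

// Drop the cached reference so the interpreter can be reclaimed.
function unloadCachedModel() {
  cachedModel = null;
}
```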
Image Preprocessing
The preprocessing pipeline transforms camera images into tensors using a reliable, tested approach:
Step 1: Resize to 300×300
- Uses expo-image-manipulator to resize and encode as JPEG
- Returns Base64-encoded string
Step 2: Base64 to Binary
- Decodes Base64 string using native atob()
- Converts to Uint8Array
Step 3: JPEG Decoding
- Uses jpeg-js library to decode JPEG binary
- Produces RGBA pixel data (Uint8Array)
Step 4: RGB Extraction
- Iterates through RGBA data with stride of 4
- Extracts R, G, B channels (drops Alpha)
- Creates Float32Array of size 300×300×3 = 270,000
Step 5: Tensor Format
- Output: Float32Array with pixel values 0-255 (NOT normalized)
- Shape: [1, 300, 300, 3]
- Channel order: RGB (not BGR)
- Memory layout: Row-major, flat buffer [R, G, B, R, G, B, ...]
CRITICAL: Pixel values are kept in the range 0-255, not normalized to [0, 1]. Ensure your model expects this input format.
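Putting the five steps together, a simplified version of the pipeline might look like the following sketch. It assumes the libraries named above (expo-image-manipulator's manipulateAsync, jpeg-js's decode, a global atob) and a hypothetical imageToTensor helper; it is not the package's exact implementation:

```javascript
import * as ImageManipulator from "expo-image-manipulator";
import jpeg from "jpeg-js";

async function imageToTensor(imageUri) {
  // Steps 1-2: resize to 300x300, re-encode as JPEG, return Base64.
  const manipulated = await ImageManipulator.manipulateAsync(
    imageUri,
    [{ resize: { width: 300, height: 300 } }],
    { base64: true, compress: 1, format: ImageManipulator.SaveFormat.JPEG }
  );

  // Base64 -> Uint8Array (atob is assumed to be available in the RN runtime).
  const binary = atob(manipulated.base64);
  const jpegBytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    jpegBytes[i] = binary.charCodeAt(i);
  }

  // Step 3: decode JPEG into RGBA pixel data.
  const { data, width, height } = jpeg.decode(jpegBytes, { useTArray: true });

  // Steps 4-5: drop the alpha channel and keep raw 0-255 values as Float32.
  const tensor = new Float32Array(width * height * 3);
  let offset = 0;
  for (let i = 0; i < data.length; i += 4) {
    tensor[offset++] = data[i];     // R
    tensor[offset++] = data[i + 1]; // G
    tensor[offset++] = data[i + 2]; // B
  }
  return tensor; // flat [1, 300, 300, 3] buffer, row-major RGB
}
```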
---
Customization
Using Your Own Model
To use your own trained model:
1. Train your model (TensorFlow/Keras, PyTorch → ONNX → TFLite, etc.)
2. Convert to TensorFlow Lite format
3. Replace src/model/model.tflite with your .tflite file
4. Update src/labels.json with your class labels
5. Rebuild the package
Model Requirements:
- Input shape: [1, 300, 300, 3]
- Input type: Float32
- Input value range: 0-255 (NOT normalized to [0, 1])
- Output shape: [1, num_classes] where num_classes matches labels.json length
- Output type: Float32 (logits, softmax applied internally)
Updating Labels
Edit src/labels.json:
```json
[
  "YourBreed1",
  "YourBreed2",
  "YourBreed3"
]
```
Important: The order must match your model's output classes.
Converting a Keras Model to TFLite
```python
import tensorflow as tf

# Load your trained model
model = tf.keras.models.load_model('cattle_classifier.h5')

# Convert to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Optional: Apply optimizations
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Convert
tflite_model = converter.convert()

# Save
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
```
---
Building & Publishing
Build
```bash
npm run build
```
This transpiles src/ to dist/ using Babel and copies asset files.
Publish
```bash
npm publish --access public
```
Updating the Model
When you update the model or labels:
1. Increment the package version in package.json (e.g., 1.0.0 → 1.1.0)
2. Rebuild: npm run build
3. Publish: npm publish
Users can then upgrade to get the new model:
```bash
npm install rn-breed-detector@latest
```
---
Troubleshooting
Metro cannot resolve the .tflite model
Cause: Metro config not set up correctly.
Solution:
1. Ensure metro.config.js includes .tflite in assetExts
2. Restart Metro bundler: npx react-native start --reset-cache
expo-image-manipulator not found
Cause: Missing peer dependency.
Solution:
```bash
npm install expo-image-manipulator
```
iOS build fails
Cause: Pods not installed or linked.
Solution:
```bash
cd ios
pod install
cd ..

npx react-native run-ios
```
Detection fails or throws an error
Cause: Invalid image URI or unsupported image format.
Solution:
- Ensure the URI is a valid local file path (starts with file://)
- Use JPEG or PNG images
- Check that the image file exists and is readable
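For example, a small hypothetical helper can normalize a picker result before calling detectBreed():

```javascript
import { detectBreed } from "rn-breed-detector";

// Hypothetical helper: ensure the URI uses the file:// scheme
// before handing it to detectBreed().
const toFileUri = (uri) => (uri.startsWith("file://") ? uri : `file://${uri}`);

async function detectFromPicker(pickedUri) {
  return detectBreed(toFileUri(pickedUri));
}
```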
The first prediction is slow
Cause: Model loading takes time on first call.
Solution: This is expected. The model is cached after the first load, so subsequent predictions are fast. You can preload the model on app startup:
```javascript
import { useEffect } from "react";
import { detectBreed } from "rn-breed-detector";

// Inside your root component: preload the model on app start.
// The dummy call fails during preprocessing but leaves the model cached.
useEffect(() => {
  detectBreed("dummy_uri").catch(() => {
    // Ignore the error — the model is now cached
  });
}, []);
```