Expo module for Apple's Foundation Models (on-device LLM with Apple Intelligence) and CoreML integration.


expo-foundation-models provides React Native/Expo bindings for Apple's on-device language model, enabling private, fast AI text generation without sending data to external servers. It also includes CoreML support for running custom machine learning models.

| Feature | Description |
|---------|-------------|
| On-device LLM | Private text generation using Apple Intelligence |
| Structured Output | Generate JSON conforming to schemas |
| Tool Calling | Let the model call your functions |
| Streaming | Token-by-token response streaming |
| Session Management | Conversation history, prewarm, resume |
| Adapters | Load and use fine-tuned models |
| CoreML | Run any CoreML model |

Platform support:

| Feature | iOS | Android |
|---------|-----|---------|
| Foundation Models | iOS 26.0+ | Not supported |
| CoreML | iOS 16.0+ | Not supported |
> Note: Foundation Models requires an Apple Silicon device with Apple Intelligence enabled in Settings.

> iOS 26 Beta Notice: The Foundation Models framework is in beta with rapidly changing APIs. Some features use workarounds:
> - Structured Output: Uses prompt-based JSON generation instead of DynamicGenerationSchema. The schema is included in the prompt and the model's JSON response is parsed. Results may vary.
> - Tool Calling: Tool result submission API has changed and is stubbed.
> - Session Transcript: Some transcript features use workarounds for API compatibility.
>
> See GitHub Issue #1 for details.
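The prompt-based structured output workaround mentioned above can be sketched as follows. This is an illustrative pattern, not part of the package API: `extractJson` and `respondWithSchema` are hypothetical helpers, and the respond function is injected so the same pattern can wrap `FoundationModels.respond(sessionId, ...)`.

```typescript
// Sketch of the beta workaround: the JSON Schema is appended to the prompt
// and the model's reply is parsed as JSON. Both helpers are hypothetical.

type RespondFn = (prompt: string) => Promise<string>;

function extractJson(text: string): unknown {
  // The model may wrap the JSON in prose or markdown fences, so take the
  // outermost {...} span and parse only that.
  const start = text.indexOf('{');
  const end = text.lastIndexOf('}');
  if (start === -1 || end <= start) {
    throw new Error('No JSON object found in model response');
  }
  return JSON.parse(text.slice(start, end + 1));
}

async function respondWithSchema(
  respond: RespondFn,
  prompt: string,
  schema: object,
): Promise<unknown> {
  const fullPrompt =
    prompt +
    '\n\nRespond ONLY with a JSON object matching this schema:\n' +
    JSON.stringify(schema, null, 2);
  return extractJson(await respond(fullPrompt));
}
```

As the beta notice says, results may vary, so validating the parsed object against the schema before use is advisable.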
```bash
npx expo install expo-foundation-models
```

For bare React Native projects:

```bash
npm install expo-foundation-models
npx pod-install
```
```typescript
import { FoundationModels } from 'expo-foundation-models';

// Check availability
if (FoundationModels.isAvailable()) {
  // Create a session
  const sessionId = await FoundationModels.createSession('You are a helpful assistant.');

  // Generate a response
  const response = await FoundationModels.respond(sessionId, 'Hello!');
  console.log(response);

  // Clean up
  await FoundationModels.closeSession(sessionId);
}
```
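Since `createSession` allocates a native session that must be paired with `closeSession`, a small try/finally wrapper keeps cleanup from being skipped when a call throws. This is a sketch under assumptions: `withSession` is not part of the package API, and the session functions are injected so the helper can wrap `FoundationModels` directly.

```typescript
// Sketch: guarantee closeSession runs even if the callback throws.
// `withSession` is a hypothetical helper; pass FoundationModels as `api`.

interface SessionApi {
  createSession: (instructions: string) => Promise<string>;
  closeSession: (sessionId: string) => Promise<void>;
}

async function withSession<T>(
  api: SessionApi,
  instructions: string,
  fn: (sessionId: string) => Promise<T>,
): Promise<T> {
  const sessionId = await api.createSession(instructions);
  try {
    return await fn(sessionId);
  } finally {
    // Runs on both success and failure, so native sessions never leak.
    await api.closeSession(sessionId);
  }
}
```
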
| Guide | Description |
|-------|-------------|
| Getting Started | Installation, setup, and basic usage |
| Foundation Models | Text generation, streaming, and options |
| Structured Output | JSON Schema generation and choices |
| Tool Calling | Function calling and tool execution |
| Session Management | Transcripts, prewarm, and history |
| Adapters | Training and loading fine-tuned models |
| Feedback | Response quality logging |
| CoreML | Custom ML model integration |
| Error Handling | Error types and handling patterns |
Check out the example app for a complete demo showcasing all features.
```bash
cd example
npx expo run:ios
```
- Expo SDK 54+
- iOS 26.0+ for Foundation Models
- iOS 16.0+ for CoreML only
- Apple Intelligence enabled in device Settings
- Device with Apple Silicon (A17+ for Foundation Models)
Want to create custom adapters for specialized tasks? Apple provides a Python toolkit for training adapters using LoRA (Low-Rank Adaptation).

| Task | Tool |
|------|------|
| Train adapters | Apple's Python Toolkit |
| Use adapters | This package |
- Mac with Apple Silicon + 32GB RAM, or Linux GPU
- Python 3.11+
- 100-5,000+ training samples (prompt/response pairs)
1. Download the toolkit from Apple Developer
2. Set up the environment:

```bash
conda create -n adapter-training python=3.11
pip install -r requirements.txt
```
Then load the adapter in your app:

```typescript
const adapter = await FoundationModels.loadAdapterFromFile('/path/to/my_adapter.fmadapter');
const sessionId = await FoundationModels.createSession({
  adapterId: adapter.id,
  instructions: 'You are a specialized assistant.',
});
```

See the Adapters documentation for complete details.
Licensed under the MIT License.
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.