# llm-flow-router

Tiny Express-like multiplexer for routing prompts to multiple AI providers with middleware. Route prompts to /gemini, /grok, /openai, and more.

## Installation

```bash
npm install llm-flow-router
```
Author: Divesh Sarkar
## Environment Variables

Create a `.env` file in your project and set the provider keys:

```bash
# .env
OPENAI_API_KEY=...
GROK_API_KEY=...
ANTHROPIC_API_KEY=...
GEMINI_API_KEY=...
```
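Optionally, verify at startup that each key is actually set. This sketch uses only Node's `process.env` and `dotenv` and assumes nothing about the router's API:

```javascript
import 'dotenv/config';

// Warn early if a provider key is missing, so the matching route fails loudly.
const requiredKeys = ['OPENAI_API_KEY', 'GROK_API_KEY', 'ANTHROPIC_API_KEY', 'GEMINI_API_KEY'];
for (const key of requiredKeys) {
  if (!process.env[key]) {
    console.warn(`Missing ${key}: the matching route will not work.`);
  }
}
```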
## Usage
```javascript
import { mux } from 'llm-flow-router';
import 'dotenv/config'; // if not already loaded

const messages = [
  { role: 'user', content: "Explain quantum computing like I'm 12." }
];

async function demo() {
  // Fast & cheap: Gemini 2.5 Flash
  const gemini = await mux.handle({ path: '/gemini', messages });
  console.log('Gemini:', gemini);

  // Reliable fallback
  const safe = await mux.handle({ path: '/safe', messages });
  console.log('Safe fallback:', safe);
}

demo();
```
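Provider calls can fail (missing key, rate limit, network error). A minimal defensive wrapper, assuming `mux.handle` rejects with a standard `Error` on failure; the `ask` helper is hypothetical:

```javascript
import { mux } from 'llm-flow-router';
import 'dotenv/config';

// Hypothetical helper: wraps mux.handle and returns null instead of throwing.
async function ask(path, messages) {
  try {
    return await mux.handle({ path, messages });
  } catch (err) {
    console.error(`Request to ${path} failed:`, err.message);
    return null; // caller decides how to degrade
  }
}

const reply = await ask('/openai', [{ role: 'user', content: 'Hello!' }]);
console.log(reply);
```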
## Routes
```text
/gemini  → Gemini 2.5 Flash (fast, current stable)
/grok    → Grok (xAI)
/openai  → OpenAI
/claude  → Anthropic Claude
/safe    → Fallback: Gemini → Grok → OpenAI
/default → Gemini
```
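Every route is assumed here to accept the same `{ path, messages }` shape shown in Usage, so you can compare providers on one prompt:

```javascript
import { mux } from 'llm-flow-router';
import 'dotenv/config';

const messages = [{ role: 'user', content: 'Summarize the CAP theorem in one sentence.' }];

// Assumption: all routes share the { path, messages } call shape from the Usage example.
for (const path of ['/claude', '/grok']) {
  const answer = await mux.handle({ path, messages });
  console.log(`${path}:`, answer);
}
```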
## Middleware
```javascript
import { mux } from 'llm-flow-router';

mux.use(async (prompt, next) => {
  console.log('Custom middleware running');
  return next();
});
```
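Middleware can also wrap the downstream call. A timing sketch, assuming `next()` resolves to the provider response for the routed request:

```javascript
import { mux } from 'llm-flow-router';

// Assumption: next() resolves to the provider response; returning it passes it on unchanged.
mux.use(async (prompt, next) => {
  const started = Date.now();
  const response = await next();
  console.log(`Handled in ${Date.now() - started} ms`);
  return response;
});
```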