A provider-agnostic LLM core for Node.js, inspired by ruby-llm.

The production-grade LLM engine for Node.js. Provider-agnostic by design.

@node-llm/core provides a single, unified API for interacting with 540+ models across all major providers. It is built for developers who need stable infrastructure, standard streaming, and automated tool execution without vendor lock-in.
---
- Unified API: One interface for OpenAI, Anthropic, Gemini, DeepSeek, OpenRouter, and Ollama.
- Automated Tool Loops: Recursive tool execution handled automatically—no manual loops required.
- Streaming + Tools: Seamlessly execute tools and continue the stream with the final response.
- Structured Output: Native Zod support for rigorous schema validation (.withSchema()).
- Multimodal engine: Built-in handling for Vision, Audio (Whisper), and Video (Gemini).
- Security-First: Integrated circuit breakers for timeouts, max tokens, and infinite tool loops.
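
The automated tool loop means you describe a tool once and the engine runs the call → execute → respond cycle for you. The sketch below shows what such a tool definition could look like; the object shape and the `withTool()` registration method are assumptions for illustration, not the library's confirmed API — check the docs for the exact names.

```ts
// A tool pairs a JSON-schema description (what the model sees)
// with an execute function (what the engine runs on a tool call).
// `weatherData` is a stand-in for a real data source.
const weatherData: Record<string, number> = { Paris: 18, Tokyo: 24 };

const weatherTool = {
  name: "get_weather",
  description: "Return the current temperature (°C) for a city",
  parameters: {
    type: "object",
    properties: { city: { type: "string" } },
    required: ["city"]
  },
  // Invoked automatically when the model emits a matching tool call.
  execute: async ({ city }: { city: string }) => ({
    city,
    tempC: weatherData[city] ?? null
  })
};

// Hypothetical registration -- the actual method name may differ:
// const chat = llm.chat("gpt-4o").withTool(weatherTool);
// const res = await chat.ask("Is it warmer in Paris or Tokyo?");
```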
---
| Provider | Supported Features |
| :----------------------------------------------------------------------------------------------------------------------------------- | :----------------------------------------------------------------------------------------- |
| OpenAI | Chat, Streaming, Tools, Vision, Audio, Images, Transcription, Reasoning |
| Anthropic | Chat, Streaming, Tools, Vision, PDF, Structured Output, Extended Thinking (Claude 3.7) |
| Gemini | Chat, Streaming, Tools, Vision, Audio, Video, Embeddings |
| DeepSeek | Chat (V3), Extended Thinking (R1), Streaming, Tools |
| Bedrock | Chat, Streaming, Tools, Image Gen (Titan/SD), Embeddings, Prompt Caching |
| OpenRouter | 540+ models, Chat, Streaming, Tools, Vision, Embeddings, Reasoning |
| Ollama | Local Inference, Chat, Streaming, Tools, Vision, Embeddings |
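
Because the chat surface is identical across backends, switching providers is a one-line change to `createLLM`. A minimal sketch — the environment-variable names and model id in the comments are illustrative assumptions, not guaranteed by the library:

```ts
import { createLLM } from "@node-llm/core";

// Same API, different backends. Keys are read from environment
// variables (names assumed here, e.g. ANTHROPIC_API_KEY, GEMINI_API_KEY).
const claude = createLLM({ provider: "anthropic" });
const gemini = createLLM({ provider: "gemini" });
const local  = createLLM({ provider: "ollama" }); // local inference, no key required

// Usage is then identical to the OpenAI examples below, e.g.:
// const res = await claude.chat("claude-3-7-sonnet-latest").ask("Hello");
```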
---
```bash
npm install @node-llm/core
```

NodeLLM automatically reads your API keys from environment variables (e.g., `OPENAI_API_KEY`).

```ts
import { createLLM } from "@node-llm/core";

const llm = createLLM({ provider: "openai" });

// 1. Standard Request
const res = await llm.chat("gpt-4o").ask("What is the speed of light?");
console.log(res.content);

// 2. Real-time Streaming
for await (const chunk of llm.chat().stream("Tell me a long story")) {
  process.stdout.write(chunk.content);
}
```
Stop parsing markdown. Get typed objects directly.
```ts
import { z } from "@node-llm/core";

const PlayerSchema = z.object({
  name: z.string(),
  powerLevel: z.number(),
  abilities: z.array(z.string())
});

const chat = llm.chat("gpt-4o-mini").withSchema(PlayerSchema);
const response = await chat.ask("Generate a random RPG character");
console.log(response.parsed.name); // Fully typed!
```
---
NodeLLM protects your production environment with four built-in safety pillars:
```ts
const llm = createLLM({
  requestTimeout: 15000, // 15s DoS protection
  maxTokens: 4096,       // Cost protection
  maxRetries: 3,         // Retry-storm protection
  maxToolCalls: 5        // Infinite-loop protection
});
```
---
NodeLLM 1.9.0 introduces a powerful lifecycle hook system for audit, security, and observability.
```ts
import { createLLM, PIIMaskMiddleware, UsageLoggerMiddleware } from "@node-llm/core";

const llm = createLLM({
  provider: "openai",
  middlewares: [
    new PIIMaskMiddleware(),    // Redact emails/phone numbers automatically
    new UsageLoggerMiddleware() // Log structured token usage & costs
  ]
});

// All chats created from this instance inherit these middlewares
const chat = llm.chat("gpt-4o");
```
Middlewares can control the engine's recovery strategy during tool failures.
```ts
const safetyMiddleware = {
  name: "Audit",
  onToolCallError: async (ctx, tool, error) => {
    if (tool.function.name === "delete_user") return "STOP"; // Kill the loop
    return "RETRY"; // Attempt recovery
  }
};
```
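
Because hooks are plain functions, a recovery policy like this can be unit-tested without any provider round-trip. A self-contained restatement (the `ctx` and tool-call shapes are simplified assumptions; the real types live in @node-llm/core and may differ):

```ts
// Standalone, testable version of the audit policy above.
interface ToolCall { function: { name: string } }

const safetyPolicy = {
  name: "Audit",
  onToolCallError: async (_ctx: unknown, tool: ToolCall, _error: Error) =>
    tool.function.name === "delete_user" ? "STOP" : "RETRY"
};

// Exercise the policy directly -- no LLM call needed:
const verdict = await safetyPolicy.onToolCallError(
  null, { function: { name: "delete_user" } }, new Error("boom")
);
// verdict === "STOP"

// Registration mirrors the middlewares array shown earlier:
// const llm = createLLM({ provider: "openai", middlewares: [safetyPolicy] });
```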
---
Looking for persistence? Use @node-llm/orm.
- Automatically saves chat history to PostgreSQL/MySQL/SQLite via Prisma.
- Tracks tool execution results and API metrics (latency, cost, tokens).
---
Visit node-llm.eshaiju.com for:
- Deep Dive into Tool Calling
- Multi-modal Vision & Audio Guide
- Custom Provider Plugin System
---
MIT © NodeLLM Contributors