TypeScript LLM client with streaming tool execution. Tools fire mid-stream. Built-in function calling works with any model—no structured outputs or native tool support required.
Streaming-first, multi-provider LLM client for TypeScript with its own home-grown tool calling.
llmist implements its own tool-calling syntax called "gadgets": tools execute the moment their block is parsed, not after the response completes. It works with any model that can follow instructions.
```bash
npm install llmist
```

```typescript
import { Gadget, LLMist, z } from 'llmist';

// Define a gadget (tool) with a Zod schema
class Calculator extends Gadget({
  description: 'Performs arithmetic operations',
  schema: z.object({
    operation: z.enum(['add', 'subtract', 'multiply', 'divide']),
    a: z.number(),
    b: z.number(),
  }),
}) {
  execute(params: this['params']): string {
    const { operation, a, b } = params;
    switch (operation) {
      case 'add': return String(a + b);
      case 'subtract': return String(a - b);
      case 'multiply': return String(a * b);
      case 'divide': return String(a / b);
    }
  }
}

// Run the agent
const answer = await LLMist.createAgent()
  .withModel('sonnet')
  .withGadgets(Calculator)
  .askAndCollect('What is 15 times 23?');

console.log(answer);
```
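`askAndCollect()` waits for the full agent run and returns the final text. To watch gadgets fire mid-stream, you can consume the run as a stream instead. The sketch below (reusing the `Calculator` gadget above) is illustrative only: the `ask()` method returning an async iterable of events and the event field names are assumptions, not the documented llmist API; check the streaming docs for the real interface.

```typescript
// Illustrative sketch, not the documented API: assumes `ask()` yields typed
// events as an async iterable. The point it demonstrates is streaming-first
// execution: gadget results arrive in the middle of the token stream.
const agent = LLMist.createAgent()
  .withModel('sonnet')
  .withGadgets(Calculator);

for await (const event of agent.ask('What is 15 times 23?')) {
  if (event.type === 'text') {
    process.stdout.write(event.text);                    // model tokens as they arrive
  } else if (event.type === 'gadget_result') {
    console.log(`\n[gadget returned: ${event.result}]`); // Calculator already ran mid-stream
  }
}
```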
- Streaming-first - Tools execute mid-stream, not after response completes
- Multi-provider - OpenAI, Anthropic, Gemini, HuggingFace with unified API
- Type-safe - Full TypeScript inference from Zod schemas
- Flexible hooks - Observers, interceptors, and controllers for deep integration (see the sketch after this list)
- Built-in cost tracking - Real-time token counting and cost estimation
- Multimodal - Vision and audio input support
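The hooks and cost-tracking features plug into the same builder chain. The sketch below is a rough illustration of the pattern only: `withObserver()`, `onGadgetCall`, `onUsage`, and the usage fields are hypothetical names, not the actual llmist API; see the Hooks System docs for the real signatures.

```typescript
// Hypothetical observer hook: `withObserver`, `onGadgetCall`, `onUsage`, and
// the usage fields are illustrative names, not the documented llmist API.
const answer = await LLMist.createAgent()
  .withModel('gpt4o')
  .withGadgets(Calculator)
  .withObserver({
    onGadgetCall: (name: string, params: unknown) =>
      console.log(`gadget invoked: ${name}`, params),
    onUsage: (usage: { totalTokens: number; costUSD: number }) =>
      console.log(`tokens: ${usage.totalTokens}, estimated cost: $${usage.costUSD}`),
  })
  .askAndCollect('What is 15 times 23?');
```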
Set one of these environment variables:
```bash
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GEMINI_API_KEY="..."
export HF_TOKEN="hf_..."
```
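Setting any one of them is enough. If you want a run to fail fast with a clear message when no key is set, a plain Node.js guard (independent of llmist) works:

```typescript
// Plain Node.js check, independent of llmist: fail fast if no provider key is set.
const providerKeys = ['OPENAI_API_KEY', 'ANTHROPIC_API_KEY', 'GEMINI_API_KEY', 'HF_TOKEN'];
if (!providerKeys.some((key) => process.env[key])) {
  throw new Error(`Set at least one of: ${providerKeys.join(', ')}`);
}
```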
Use model aliases for convenience:
```typescript
.withModel('sonnet') // Claude 3.5 Sonnet
.withModel('opus')   // Claude Opus 4
.withModel('gpt4o')  // GPT-4o
.withModel('flash')  // Gemini 2.0 Flash
```
Full documentation at llmist.dev
- Getting Started
- Creating Gadgets
- Hooks System
- Provider Configuration
See the examples directory for runnable examples covering all features.
- `@llmist/cli` - Command-line interface
- `@llmist/testing` - Testing utilities and mocks
MIT