Mojentic is a modern LLM integration framework for TypeScript with tool support, structured output generation, and full feature parity across the Python, Elixir, and Rust implementations. It is well suited to building VS Code extensions, Obsidian plugins, and Node.js applications.
- Multi-Provider Support: OpenAI and Ollama gateways
- Agent System: Complete event-driven agent framework with ReAct pattern
- Tool System: Extensible function calling with automatic recursive execution
- Structured Output: Type-safe response parsing with JSON schemas
- Streaming: Real-time streaming with full tool calling support
- Tracer System: Complete observability for debugging and monitoring
- Type-Safe: Full TypeScript support with comprehensive type definitions
- Result Type Pattern: Rust-inspired error handling for robust code
- 24 Examples: Comprehensive examples demonstrating all features
```bash
npm install mojentic
# or
yarn add mojentic
# or
pnpm add mojentic
```
To use Mojentic with local models, you need Ollama installed and running:
1. Install Ollama from ollama.ai
2. Pull a model: `ollama pull qwen3:32b`
3. Verify it's running: `ollama list`
```typescript
import { LlmBroker, OllamaGateway, Message } from 'mojentic';
import { isOk } from 'mojentic';

const gateway = new OllamaGateway();
const broker = new LlmBroker('qwen3:32b', gateway);

const messages = [Message.user('What is TypeScript?')];
const result = await broker.generate(messages);

if (isOk(result)) {
  console.log(result.value);
} else {
  console.error(result.error);
}
```
```typescript
import { LlmBroker, OllamaGateway, Message } from 'mojentic';
import { isOk } from 'mojentic';

interface SentimentAnalysis {
  sentiment: string;
  confidence: number;
  reasoning: string;
}

const gateway = new OllamaGateway();
const broker = new LlmBroker('qwen3:32b', gateway);

const schema = {
  type: 'object',
  properties: {
    sentiment: { type: 'string', enum: ['positive', 'negative', 'neutral'] },
    confidence: { type: 'number', minimum: 0, maximum: 1 },
    reasoning: { type: 'string' },
  },
  required: ['sentiment', 'confidence', 'reasoning'],
};

const messages = [Message.user('I love this new framework!')];
const result = await broker.generateObject<SentimentAnalysis>(messages, schema);

if (isOk(result)) {
  console.log(`Sentiment: ${result.value.sentiment}`);
  console.log(`Confidence: ${(result.value.confidence * 100).toFixed(1)}%`);
}
```
```typescript
import { LlmBroker, OllamaGateway, Message, DateResolverTool } from 'mojentic';
import { isOk } from 'mojentic';

const gateway = new OllamaGateway();
const broker = new LlmBroker('qwen3:32b', gateway);
const tools = [new DateResolverTool()];

const messages = [
  Message.system('You are a helpful assistant with access to tools.'),
  Message.user('What day of the week is next Friday?'),
];

// The broker automatically handles tool calls
const result = await broker.generate(messages, tools);
if (isOk(result)) {
  console.log(result.value);
}
```
```typescript
import { LlmBroker, OllamaGateway, Message } from 'mojentic';
import { isOk } from 'mojentic';

const gateway = new OllamaGateway();
const broker = new LlmBroker('qwen3:32b', gateway);

const messages = [Message.user('Write a short poem about TypeScript')];

for await (const chunk of broker.generateStream(messages)) {
  if (isOk(chunk)) {
    process.stdout.write(chunk.value);
  }
}
```
Monitor and debug your LLM applications:
```typescript
import { LlmBroker, OllamaGateway, Message, TracerSystem } from 'mojentic';
import { DateResolverTool } from 'mojentic';
import { isOk } from 'mojentic';

// Create a tracer system
const tracer = new TracerSystem();
const gateway = new OllamaGateway();
const broker = new LlmBroker('qwen3:32b', gateway, tracer);
const tools = [new DateResolverTool()];

// Generate a unique correlation ID for tracing related events
const correlationId = crypto.randomUUID();

const messages = [Message.user('What day is next Friday?')];
const result = await broker.generate(messages, tools, {}, 10, correlationId);

// Query tracer events
const allEvents = tracer.getEvents();
console.log(`Recorded ${allEvents.length} events`);

// Filter by correlation ID
const relatedEvents = tracer.getEvents({
  filterFunc: (e) => e.correlationId === correlationId,
});

// Print event summaries
relatedEvents.forEach((event) => {
  console.log(event.printableSummary());
});
```
See the Tracer documentation for a comprehensive usage guide.
Mojentic is structured in three layers:
- LlmBroker - Main interface for LLM interactions
- LlmGateway interface - Abstract interface for LLM providers
- OllamaGateway / OpenAiGateway - Provider implementations
- ChatSession - Conversational session management
- TokenizerGateway - Token counting with tiktoken
- EmbeddingsGateway - Vector embeddings
- Tool System - Extensible function calling with 10+ built-in tools
- TracerSystem - Complete event recording for observability
- EventStore - Flexible event storage and querying
- NullTracer - Zero-overhead when tracing is disabled
- Correlation ID tracking across requests
- LLM call, response, and tool events
- AsyncDispatcher - Async event processing
- Router - Event-to-agent routing
- AsyncLlmAgent - LLM-powered async agents
- AsyncAggregatorAgent - Multi-event aggregation
- IterativeProblemSolver - Multi-step reasoning
- SimpleRecursiveAgent - Self-recursive processing
- SharedWorkingMemory - Agent context sharing
- ReAct pattern implementation
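To illustrate the gateway abstraction in the first layer, here is a conceptual sketch of how a provider interface keeps a broker provider-agnostic. The names mirror the docs above, but this is not Mojentic's actual source; the real interfaces may differ.

```typescript
// Conceptual sketch of the gateway abstraction; not Mojentic's real API.
interface LlmGateway {
  complete(model: string, prompt: string): Promise<string>;
}

// A fake provider used purely for illustration.
class FakeOllamaGateway implements LlmGateway {
  async complete(model: string, prompt: string): Promise<string> {
    return `[${model} via Ollama] echo: ${prompt}`;
  }
}

// A broker-like class depends only on the interface, not a concrete provider,
// so OpenAI, Ollama, or a test double can be swapped in freely.
class TinyBroker {
  constructor(private model: string, private gateway: LlmGateway) {}

  generate(prompt: string): Promise<string> {
    return this.gateway.complete(this.model, prompt);
  }
}

const broker = new TinyBroker('qwen3:32b', new FakeOllamaGateway());
broker.generate('hi').then((out) => console.log(out));
```

This is the same dependency-inversion shape that lets `LlmBroker` accept either `OllamaGateway` or `OpenAiGateway`.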
Create a custom tool by extending BaseTool, which implements the LlmTool interface:
```typescript
import { BaseTool, ToolArgs, ToolDescriptor, ToolResult } from 'mojentic';
import { Ok } from 'mojentic';

export class WeatherTool extends BaseTool {
  async run(args: ToolArgs): Promise<ToolResult> {
    const location = args.location as string;
    // Fetch weather data...
    return Ok({
      location,
      temperature: 22,
      condition: 'sunny',
    });
  }

  descriptor(): ToolDescriptor {
    return {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get current weather for a location',
        parameters: {
          type: 'object',
          properties: {
            location: {
              type: 'string',
              description: 'City name',
            },
          },
          required: ['location'],
        },
      },
    };
  }
}
```
Mojentic uses a Result type pattern inspired by Rust:
```typescript
import { Result, Ok, Err, isOk, isErr, unwrap, unwrapOr, mapResult } from 'mojentic';

const result = await broker.generate(messages);

// Pattern 1: Check and narrow
if (isOk(result)) {
  console.log(result.value); // Type: string
} else {
  console.error(result.error); // Type: Error
}

// Pattern 2: Unwrap (throws on error)
const value = unwrap(result);

// Pattern 3: Unwrap with a default
const valueOrDefault = unwrapOr(result, 'default value');

// Pattern 4: Map and transform
const mapped = mapResult(result, (text) => text.toUpperCase());
```
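For readers unfamiliar with the pattern, a Result type can be built from a discriminated union in a few lines. This is a conceptual sketch, not Mojentic's actual implementation:

```typescript
// Conceptual sketch of a Result type; Mojentic's real implementation may differ.
type Result<T, E = Error> =
  | { ok: true; value: T }
  | { ok: false; error: E };

const Ok = <T>(value: T): Result<T, never> => ({ ok: true, value });
const Err = <E>(error: E): Result<never, E> => ({ ok: false, error });

// A type predicate lets TypeScript narrow the union at the call site.
const isOk = <T, E>(r: Result<T, E>): r is { ok: true; value: T } => r.ok;

// unwrapOr returns the value, or a fallback when the Result is an error.
const unwrapOr = <T, E>(r: Result<T, E>, fallback: T): T =>
  isOk(r) ? r.value : fallback;

const good: Result<number> = Ok(42);
const bad: Result<number> = Err(new Error('boom'));

console.log(unwrapOr(good, 0)); // 42
console.log(unwrapOr(bad, 0)); // 0
```

Because errors travel as values rather than thrown exceptions, the compiler forces every call site to decide how failures are handled.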
```typescript
import {
  MojenticError,   // Base error
  GatewayError,    // API/network errors
  ToolError,       // Tool execution errors
  ValidationError, // Input validation errors
  ParseError,      // JSON parsing errors
  TimeoutError,    // Timeout errors
} from 'mojentic';
```
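Because these errors form a class hierarchy rooted at the base error, failures can be handled by narrowing with `instanceof`. The sketch below uses stand-in classes to illustrate the pattern; in real code you would import the Mojentic classes instead:

```typescript
// Stand-in error hierarchy for illustration; import the real classes from 'mojentic'.
class MojenticError extends Error {}
class GatewayError extends MojenticError {}
class TimeoutError extends MojenticError {}

// Check the most specific classes first, then fall through to the base.
function describe(error: MojenticError): string {
  if (error instanceof TimeoutError) return 'timed out; consider retrying';
  if (error instanceof GatewayError) return 'provider or network problem';
  return 'unexpected error';
}

console.log(describe(new TimeoutError('slow model'))); // timed out; consider retrying
console.log(describe(new MojenticError('oops')));      // unexpected error
```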
Main interface for LLM interactions:
```typescript
class LlmBroker {
  constructor(model: string, gateway: LlmGateway, tracer?: TracerSystem);

  // Generate text completion
  generate(
    messages: LlmMessage[],
    tools?: LlmTool[],
    config?: CompletionConfig,
    maxToolIterations?: number,
    correlationId?: string
  ): Promise<Result<string>>;

  // Generate structured object
  generateObject<T>(
    messages: LlmMessage[],
    schema: Record<string, unknown>,
    config?: CompletionConfig
  ): Promise<Result<T>>;

  // Generate streaming completion
  generateStream(
    messages: LlmMessage[],
    config?: CompletionConfig,
    tools?: LlmTool[]
  ): AsyncGenerator<Result<string>>;

  // List available models
  listModels(): Promise<Result<string[]>>;

  // Get current model
  getModel(): string;
}
```
```typescript
class Message {
  static system(content: string): LlmMessage;
  static user(content: string): LlmMessage;
  static assistant(content: string, toolCalls?: ToolCall[]): LlmMessage;
  static tool(content: string, toolCallId: string, name: string): LlmMessage;
}
```
```typescript
interface CompletionConfig {
  temperature?: number;
  maxTokens?: number;
  topP?: number;
  frequencyPenalty?: number;
  presencePenalty?: number;
  stop?: string[];
  stream?: boolean;
  responseFormat?: {
    type: 'json_object' | 'text';
    schema?: Record<string, unknown>;
  };
}
```
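As a usage sketch, a config for terse, low-temperature JSON output might look like the following. The field values are illustrative choices, not recommended defaults, and the interface is redeclared so the snippet stands alone:

```typescript
// Redeclared locally so this snippet is self-contained; import from 'mojentic' in real code.
interface CompletionConfig {
  temperature?: number;
  maxTokens?: number;
  stop?: string[];
  responseFormat?: { type: 'json_object' | 'text'; schema?: Record<string, unknown> };
}

const config: CompletionConfig = {
  temperature: 0.2,                       // low randomness for more repeatable output
  maxTokens: 256,                         // cap response length
  stop: ['\n\n'],                         // stop at the first blank line
  responseFormat: { type: 'json_object' } // ask the provider for JSON
};

console.log(config.responseFormat?.type); // json_object
```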
Run any of the 24 included examples:
```bash
# Install dependencies
npm install
```
Development
```bash
# Install dependencies
npm install

# Build
npm run build

# Run tests
npm test

# Run tests with coverage
npm run test:coverage

# Lint (zero warnings enforced)
npm run lint

# Format
npm run format

# Full quality check
npm run quality
```

Contributions are welcome! This is part of the Mojentic family of implementations:
- mojentic-py - Python implementation (reference)
- mojentic-ex - Elixir implementation
- mojentic-ru - Rust implementation
- mojentic-ts - TypeScript implementation (this repository)
MIT License - see LICENSE for details
Mojentic is a Mojility product by Stacey Vetzal.