The simplest way to create AI conversational flows.
Define your agent and conversation flows declaratively. Let the LLM handle the complexity.
- Simple fluent API for agents and flows
- LLM-powered extraction (no regex)
- Multi-language support
- Provider agnostic (Ollama, OpenAI, OpenRouter, etc.)
- Flexible and strict modes
- Tool calling
- Streaming support (where the provider supports it)
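"LLM-powered extraction" means the model, not a regex, maps a free-form user reply onto a constrained value. A minimal self-contained sketch of what a `oneOf`-style extractor guarantees (an illustration of the concept only, not FlowKit's actual implementation, which delegates classification to the LLM):

```typescript
// Conceptual sketch: normalize a raw reply to exactly one allowed value, or null.
// A real LLM-backed extractor would ask the model to classify; we approximate
// with a containment check purely for illustration.
function normalizeOneOf(raw: string, allowed: string[]): string | null {
  const cleaned = raw.trim().toLowerCase();
  for (const value of allowed) {
    if (cleaned === value || cleaned.includes(value)) return value;
  }
  return null;
}

console.log(normalizeOneOf("It's a billing question, I think", ["billing", "technical", "other"]));
// "billing"
```

The point of pushing this to an LLM is that replies like "my invoice is wrong" still resolve to `billing` without any pattern maintenance.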
```bash
npm install @andresaya/flowkit
# or
pnpm add @andresaya/flowkit
```

```typescript
import {
agent, flow, FlowEngine, MemoryStorage, OllamaAdapter,
name, yesNo, oneOf,
} from "@andresaya/flowkit";
const assistant = agent("Alex")
.company("ACME Corp")
.personality("friendly, helpful")
.language("en")
.build();
const supportFlow = flow("support", assistant)
.ask("greeting", "Hi! I'm Alex. What's your name?", name(), "customer_name")
.then("ask_type")
.ask(
"ask_type",
"Nice to meet you, {{customer_name}}! How can I help?",
oneOf(["billing", "technical", "other"]),
"issue_type"
)
.when({ billing: "billing_help", technical: "tech_help", other: "general_help" })
.say("billing_help", "I'll transfer you to our billing team, {{customer_name}}.")
.done()
.say("tech_help", "Let me connect you with technical support.")
.done()
.say("general_help", "How can I assist you today?")
.done()
.build();
const engine = new FlowEngine(supportFlow, {
llm: new OllamaAdapter({ model: "llama3.2" }),
storage: new MemoryStorage(),
});
const result = await engine.start("session-1");
console.log(result.message);
const response = await engine.handle("session-1", "I'm John");
console.log(response.message);
```
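Prompts like `"Nice to meet you, {{customer_name}}!"` use double-brace placeholders filled from previously extracted slots (here, the `customer_name` captured by the `greeting` step). A minimal sketch of that interpolation step (assumed behavior for illustration; the library's own renderer may differ, e.g. in how it treats missing slots):

```typescript
// Fill {{slot}} placeholders from a map of extracted values.
// Unknown slots are left untouched so missing data stays visible.
function renderTemplate(template: string, slots: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match: string, key: string) =>
    key in slots ? slots[key] : match
  );
}

console.log(renderTemplate("Nice to meet you, {{customer_name}}!", { customer_name: "John" }));
// "Nice to meet you, John!"
```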
See the documentation site or the docs/ folder:
- Getting Started: docs/guide/quick-start.md
- Agents: docs/guide/agents.md
- Flows: docs/guide/flows.md
- Extractors: docs/guide/extractors.md
- Providers: docs/providers/
- Storage: docs/guide/storage.md
- Tools: docs/guide/tools.md
- API Reference: docs/api/
```bash
pnpm run docs:dev
pnpm run docs:build
pnpm run docs:preview
```

```typescript
// Ollama (Local)
import { OllamaAdapter } from "@andresaya/flowkit";
const llm = new OllamaAdapter({ model: "llama3.2" });
// OpenAI
import { OpenAIAdapter } from "@andresaya/flowkit";
const llm = new OpenAIAdapter({ apiKey: "...", model: "gpt-4o-mini" });
// OpenRouter (100+ models)
import { OpenRouterAdapter } from "@andresaya/flowkit";
const llm = new OpenRouterAdapter({ apiKey: "...", model: "anthropic/claude-3-5-sonnet" });
```
Working examples in examples/:
- examples/01-simple-chat.ts - Basic flexible mode example
- examples/02-strict-flow.ts - Strict mode customer service flow
- examples/feature-tools.ts - Tool calling
- examples/feature-memory.ts - Persistence with SQLite
- examples/feature-handoff.ts - Handoff + timeouts
- examples/provider-openai.ts - OpenAI adapter setup
- examples/provider-openrouter.ts - OpenRouter adapter setup
See examples/README.md for the full list.
```bash
pnpm run dev
pnpm run dev:simple
pnpm run dev:openai
```
Read CONTRIBUTING.md for details.
MIT - see LICENSE.