Embeddable AI chatbot widget for BAWANA learning solutions - supports React, Vue, and vanilla JavaScript
Embeddable AI chatbot widgets for the BAWANA learning ecosystem. Ship the same conversational experience across vanilla JavaScript, React, and Vue with a single, framework-agnostic core.
- Shared core (ChatbotCore) with adapter wrappers for Vanilla, React, and Vue
- Ready-to-ship UI with widget or full-page layouts
- OpenAI integration (chat completions + function calling) with configurable model, base URL, and sampling settings
- One-line theming: tweak the primary color and the widget, launcher, and animations all stay in sync
- Inline Markdown styling (bold/italic/code) with automatic sanitization
- Lightweight topic classifier keeps prompts on scope (company/course vs. cooking/travel)
- TypeScript definitions for all public APIs
- Packaged CSS bundle (style.css) for consistent styling
- ChatView: layout-agnostic core that renders messages, composer, and suggestions; emits events; never manages wrappers or ARIA.
- Layout shells: PageLayout, WidgetLayout, DropdownLayout own their DOM, toggles, ARIA, and open/close state while mounting ChatView.
- Dropdown interactions: isolated outside-click, ESC, and ARIA syncing for the dropdown shell.
- DOM builders: per-layout structure builders with shared header/message/composer partials.
Playground to sanity-check all shells: examples/layouts/index.html (uses the source modules so you can tweak layouts rapidly).
- Layout playground: npm run dev -- --host --port 4173 then open http://localhost:4173/examples/layouts/ to try page/widget/dropdown shells.
- Vanilla demo: http://localhost:4173/examples/vanilla/
- React demo: http://localhost:4173/examples/react/
- Vue demo: http://localhost:4173/examples/vue/
If you prefer using the built bundle, swap imports in an example from ../../src/... to ../../dist/index.mjs and ../../dist/style.css.
```bash
npm install @rizal_ncc/bawanachat
# or
yarn add @rizal_ncc/bawanachat
```
You must provide your own OpenAI API key at runtime.
Every adapter shares the same options. Import the CSS bundle once per app (@rizal_ncc/bawanachat/style.css) so the widget, dropdown, and launcher stay styled.
#### Floating widget launcher (default)
A minimal sketch; the bare module specifier assumes a bundler that resolves npm packages (swap in the dist/index.mjs bundle otherwise):

```html
<script type="module">
  import { initChatbot } from "@rizal_ncc/bawanachat";
  import "@rizal_ncc/bawanachat/style.css";

  // The floating launcher is the default layout; no container is required.
  initChatbot({
    apiKey: "sk-...",
    headerTitle: "BAWANA Assistant",
  });
</script>
```
#### Inline dropdown panel
A sketch using the same module setup:

```html
<section id="course-assistant">
  <!-- Surrounding course content -->
</section>
<script type="module">
  import { initChatbot } from "@rizal_ncc/bawanachat";
  import "@rizal_ncc/bawanachat/style.css";

  initChatbot({
    apiKey: "sk-...",
    layout: "dropdown",
    headerTitle: "Course Mentor",
    headerDescription: "Online • siap bantu",
  });
</script>
```
```bash
npm install @rizal_ncc/bawanachat react react-dom
```
#### Floating widget (launcher + modal)
```tsx
import { BawanaChatbot } from "@rizal_ncc/bawanachat/react";
import "@rizal_ncc/bawanachat/style.css";

export function App() {
  return (
    <BawanaChatbot
      apiKey={import.meta.env.VITE_OPENAI_API_KEY}
      headerTitle="BAWANA Assistant"
      headerDescription="Tanyakan apa saja tentang solusi digital Netpolitan."
      suggestedMessages={[
        "Apa saja komponen BAWANA 3-in-1?",
        "Bagaimana LXP BAWANA memanfaatkan AI?",
      ]}
      primaryColor="#2563eb"
      primaryForeground="#f8fafc"
    />
  );
}
```
#### Inline widget/dropdown
```tsx
import { BawanaChatbot } from "@rizal_ncc/bawanachat/react";

export function CoursePage() {
  return (
    <div>
      {/* ...course content... */}
      <BawanaChatbot
        floating={false}
        layout="dropdown"
        className="mt-6"
        apiKey={import.meta.env.VITE_OPENAI_API_KEY}
        headerTitle="Course Mentor"
        headerDescription="Online • siap bantu"
      />
    </div>
  );
}
```
Setting floating={false} tells the adapter to render inside the JSX tree instead of creating a global launcher. You can still choose between layout="widget", "dropdown", or "page" to control how the inline version looks.
```bash
npm install @rizal_ncc/bawanachat vue
```
```vue
<script setup>
import { BawanaChatbot } from "@rizal_ncc/bawanachat/vue";
import "@rizal_ncc/bawanachat/style.css";

const apiKey = import.meta.env.VITE_OPENAI_API_KEY;
</script>

<template>
  <BawanaChatbot
    :api-key="apiKey"
    layout="dropdown"
    header-title="Course Mentor"
    header-description="Online • siap bantu"
    primary-color="#9333ea"
    primary-foreground="#faf5ff"
  />
</template>
```
| Option | Type | Default | Description |
| --- | --- | --- | --- |
| apiKey | string | — | OpenAI API key (required) |
| model | string | gpt-4o-mini | Model to use for responses |
| baseUrl | string | https://api.openai.com/v1 | Override the OpenAI endpoint |
| chatApiUrl | string | — | OpenAI-compatible chat endpoint (use your backend proxy). Accepts a base URL or a full /chat/completions path. |
| layout | "widget" \| "page" \| "dropdown" | widget | widget creates a floating bubble, page keeps the chat always visible inline, dropdown renders a collapsible course-friendly panel |
| headerTitle | string | — | Title displayed in the chat header |
| headerDescription | string | — | Subtitle in the header |
| placeholder | string | — | Input placeholder text |
| suggestedMessages | string[] | [] | Prompt suggestions displayed before the first message |
| userInitials | string | "YOU" | Avatar initials for user bubbles |
| assistantInitials | string | "AI" | Avatar initials for assistant bubbles |
| contextMessage | boolean \| string | false | Enable the bundled BAWANA company context (true) or supply a custom string |
| primaryColor | string | #1168bb | Primary accent used for the header, launcher, and motion effects |
| primaryForeground | string | auto | Foreground color on top of the primary accent (auto-calculated for contrast) |
| temperature | number | 0.6 | Sampling temperature |
| maxOutputTokens | number | 512 | Maximum tokens in the model response |
| requestMode | "auto" \| "proxy" \| "sdk" | auto | auto uses the proxy when chatApiUrl is set; proxy forces backend calls; sdk forces direct SDK usage (server-side only) |
| recommendationsApiUrl | string | — | Optional HTTP endpoint for course recommendations. When set, the model uses function calling to query this endpoint. Expects repeatable ?q= search params (e.g. http://ncc-stg.api.bawana:8000/api/v2/course-recommendations/) so it can send a main query plus related keywords like leadership + management. |
| recommendationsHeaders | Record<string, string> | — | Extra headers for the recommendation API (e.g. { Authorization: 'Bearer …' }) |
| recommendationsAuthToken | string | — | Convenience token that auto-sets Authorization: Bearer for the recommendation API |
| generateResponse | (request) => Promise<{ content: string }> | built-in callOpenAI wrapper | Override the network transport (e.g. route through your own API or mock in tests) |
| contextProfile | string \| false | "company-intro" | Select a predefined context/policy bundle (set to false to disable) |
| systemPrompt | string | — | Short system instruction prepended to the conversation |
| systemMessages | { content: string }[] | [] | Additional system messages placed before user history |
| onMessage | (event) => void | — | Fired after the assistant replies |
| onError | (error) => void | — | Fired when the OpenAI call fails |
| onOpen | () => void | — | Triggered when the widget opens |
| onClose | () => void | — | Triggered when the widget closes |
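To make the recommendations contract concrete, here is a sketch of the query string the function call produces. The endpoint below is a placeholder, and the parameter-building code is an illustration of the repeatable ?q= contract, not the library's internals:

```js
// Hypothetical illustration: the model searches a main query plus related
// keywords, one q param per keyword.
const base = "https://example.com/api/v2/course-recommendations/"; // placeholder endpoint
const params = new URLSearchParams();
["leadership", "management"].forEach((keyword) => params.append("q", keyword));

const url = `${base}?${params}`;
console.log(url);
// → https://example.com/api/v2/course-recommendations/?q=leadership&q=management
```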
> React only: pass floating={false} to render the chat inline instead of the default floating widget.
> Tip: If you need to customise the copy, COMPANY_CONTEXT is still exported so you can compose your own systemMessages.
generateResponse receives an object that includes messages, the active AbortSignal, and a copy of the resolved config so you can talk to any LLM provider without touching the UI.
Need an inline course assistant? Pass layout="dropdown" (plus floating={false} in React) and the widget renders as a collapsible panel (closed by default). The outer title/description become the trigger label, while the inner chat header stays hidden so the panel doesn't feel double-stacked.
Context profiles bundle prompts, company context, history limits, and light policy checks so each chatbot instance can focus on a specific surface (e.g., company introduction vs. course pages).
- company-intro (default): answers general questions about Netpolitan Group/BAWANA.
- course-page: keeps the conversation centred on one course/module and gently redirects off-topic queries.
Each profile can enforce topic allow-lists and custom error messages. For example, the default profile refuses unrelated requests ("please provide a noodle cooking tutorial") so the assistant stays focused on BAWANA, while short greetings or exploratory prompts are still allowed. Prompts run through a lightweight classifier that tags them as company, course, cooking, travel, etc., so the chatbot can decline out-of-scope topics before hitting OpenAI.
```js
import { classifyPrompt } from "@rizal_ncc/bawanachat";

console.log(classifyPrompt("Bagaimana cara membuat bandeng goreng?"));
// ["cooking"]
```
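A sketch of how such labels could gate a prompt before it ever reaches the model. The allow-list and the unlabeled-prompt fallback below are assumptions for illustration, not the shipped profiles' exact policy:

```js
// Hypothetical allow-list gate; the bundled profiles implement their own rules.
const ALLOWED_TOPICS = new Set(["company", "course"]);

function isOnTopic(labels) {
  // Short greetings or exploratory prompts may come back unlabeled; allow them.
  if (labels.length === 0) return true;
  return labels.some((label) => ALLOWED_TOPICS.has(label));
}

console.log(isOnTopic(["cooking"])); // false → decline before calling OpenAI
console.log(isOnTopic(["course"])); // true
```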
```js
import { initChatbot, listContextProfiles } from "@rizal_ncc/bawanachat";

console.log(listContextProfiles());

initChatbot({
  apiKey: "sk-...",
  contextProfile: "course-page",
  systemMessages: [
    { content: "Course slug: digital-leadership" },
  ],
});
```
Set contextProfile: false if you want to bypass the built-in bundles and manage prompts/policies yourself.
Give the chatbot your brand colours by passing primaryColor (and optionally primaryForeground). The widget header, launcher button, card shadow, and toggle animations all re-use the same CSS variables, so a single change keeps everything consistent.
```js
initChatbot({
  apiKey: "...",
  primaryColor: "#1d4ed8",
  primaryForeground: "#f8fafc", // optional; picked automatically if omitted
});
```
If you need access to the resolved palette (e.g. to sync surrounding UI), call chatbot.getConfig(): it returns the current config, including the computed primaryColor, primaryForeground, and primaryRgb values.
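The library's exact contrast heuristic isn't documented here, but the idea behind an auto-picked foreground can be sketched with WCAG relative luminance. The threshold and the two fallback colors below are assumptions for illustration:

```js
// Sketch: pick a readable foreground for a given hex accent color.
function pickForeground(hex) {
  const n = parseInt(hex.slice(1), 16);
  const [r, g, b] = [(n >> 16) & 255, (n >> 8) & 255, n & 255].map((c) => {
    const srgb = c / 255;
    // Linearise each sRGB channel before weighting (WCAG formula).
    return srgb <= 0.03928 ? srgb / 12.92 : ((srgb + 0.055) / 1.055) ** 2.4;
  });
  const luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b;
  // Light accents get dark text; dark accents get light text.
  return luminance > 0.179 ? "#111827" : "#f8fafc";
}

console.log(pickForeground("#1d4ed8")); // dark blue accent → "#f8fafc"
```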
#### initChatbot

Creates a chatbot instance and mounts it into the provided container (or document.body by default).
```js
const chatbot = initChatbot({ apiKey: "sk-..." });

chatbot.open();
chatbot.sendMessage("Halo, BAWANA!");
```
#### ChatbotCore

Fine-grained control over lifecycle, events, and state.
```js
import { ChatbotCore } from "@rizal_ncc/bawanachat";

const chatbot = new ChatbotCore({ apiKey: "sk-..." });
chatbot.on("message", ({ response }) => console.log(response.content));
chatbot.init("chatbot-root");
```
#### callOpenAI

Direct access to the OpenAI Chat Completions helper used internally.
```js
import { callOpenAI } from "@rizal_ncc/bawanachat";

await callOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  messages: [
    { role: "system", content: "You are BAWANA AI." },
    { role: "user", content: "Apa saja fitur LXP?" },
  ],
});
```
#### Custom transports

Route traffic through your own backend, Azure OpenAI, or a mocked responder by passing generateResponse to any adapter or directly to ChatbotCore.
```js
const chatbot = new ChatbotCore({
  apiKey: "placeholder", // optional if your backend handles auth
  generateResponse: async ({ messages, signal }) => {
    const res = await fetch("/api/chat", {
      method: "POST",
      signal,
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages }),
    });
    const data = await res.json();
    return { content: data.reply, usage: data.usage };
  },
});
```
Because the transport is concentrated behind one function, local tests can inject a fake implementation that returns deterministic responses without hitting OpenAI.
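For example, a deterministic stub honouring the generateResponse contract (messages in, { content } out); the echo behaviour is purely illustrative:

```js
// Fake transport: resolves a canned reply derived from the last message.
// (The real contract also passes an AbortSignal and the resolved config.)
const fakeGenerateResponse = async ({ messages }) => {
  const last = messages[messages.length - 1];
  return { content: `echo: ${last.content}` };
};

// In a unit test you can exercise it directly, with no UI and no network:
fakeGenerateResponse({ messages: [{ role: "user", content: "halo" }] })
  .then(({ content }) => console.log(content)); // logs "echo: halo"
```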
#### Context profile helpers

Inspect and reuse the shipped policy bundles.
```js
import {
  listContextProfiles,
  getContextProfile,
} from "@rizal_ncc/bawanachat";

const profiles = listContextProfiles();
const courseProfile = getContextProfile("course-page");
```
#### createDialogueEngine

Build a profile-aware generator that you can pass into ChatbotCore's generateResponse hook.
```js
import { createDialogueEngine, ChatbotCore } from "@rizal_ncc/bawanachat";

const generateResponse = createDialogueEngine({ profileId: "course-page" });
const chatbot = new ChatbotCore({
  apiKey: "sk-...",
  generateResponse,
});
```
```bash
npm install
npm run dev      # Vite dev server
npm run build    # Build library + types
npm test         # Vitest smoke tests for ChatbotCore
npm run preview  # Preview bundle locally
```
Set the following environment variables for local demos:
```bash
cp .env.example .env

# .env
VITE_OPENAI_API_KEY="sk-..."
VITE_OPENAI_MODEL="gpt-4o-mini"
# Optional:
VITE_OPENAI_BASE_URL="https://api.openai.com/v1"
```
Example projects live in the examples/ directory:
- examples/react – React + Vite demo with a runtime profile picker.
- examples/vue – Vue 3 demo that toggles context profiles.
Each framework example is a self-contained Vite project so you can run it without extra wiring:
```bash
npm run build   # from the repo root; produces dist/* for the adapters
cd examples/react
npm install
npm run dev
```
```bash
npm run build   # from the repo root; produces dist/* for the adapters
cd examples/vue
npm install
npm run dev
```
For the vanilla sample, open examples/vanilla/index.html in a browser (or serve it through npx vite preview examples/vanilla).
> The React and Vue demos import compiled assets from ../../dist/*, so always run npm run build
> in the repo root before starting those dev servers. They also read .env
> from the repository root (envDir), so set VITE_OPENAI_API_KEY there
> once, and both examples will pick it up.
Use npm link or npm pack to validate integration locally:

```bash
npm run build
npm pack
npm link
```
- [ ] Update package.json version
- [ ] npm run build
- [ ] npm pack --dry-run to inspect package contents
- [ ] Update CHANGELOG.md
- [ ] npm publish --access public
- [ ] git tag vX.Y.Z && git push --tags
- Keep API keys out of version control
- Use GitHub Secrets for CI/CD tokens
- Run npm audit fix regularly
- Add automated lint/test workflows before production release
Released under the MIT License.