SDK for querying Contentstack with LLMs
Installation

```bash
npm install contentstack-agent-sdk
# or
yarn add contentstack-agent-sdk
```
Usage
```ts
import { useAgent } from "contentstack-agent-sdk";
const { response, loading, error, askStream, isStreaming } = useAgent({
  baseUrl: 'http://localhost:8080/api/v1/ask', // Replace with your backend URL
  llmApiKey: process.env.GOOGLE_API_KEY!, // Replace with your LLM API key. Remove the "!" if you are not using TypeScript
  contentstackApiKey: process.env.NEXT_PUBLIC_CONTENTSTACK_API_KEY!, // Replace with your Contentstack API key. Remove the "!" if you are not using TypeScript
  llmProvider: 'google', // or 'groq', depending on your LLM provider
});
```
Quickstart
```tsx
'use client';

import { useState } from "react";
import { useAgent } from "contentstack-agent-sdk";

export default function Chatbot() {
  const { response, loading, error, askStream, isStreaming } = useAgent({
    baseUrl: "http://localhost:8080/api/v1/ask",
    llmApiKey: process.env.GOOGLE_API_KEY!,
    contentstackApiKey: process.env.NEXT_PUBLIC_CONTENTSTACK_API_KEY!,
    llmProvider: "google",
  });

  const [input, setInput] = useState("");

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim()) return;
    await askStream(input);
    setInput("");
  };

  return (
    <div>
      <h1>Chatbot</h1>

      {/* Messages */}
      {loading && <p>Thinking...</p>}
      {error && <p>{error.message}</p>}
      {response && (
        <p>
          Bot: {response}
          {isStreaming && <span>⌛</span>}
        </p>
      )}

      {/* Input */}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={(e) => setInput(e.target.value)} />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
```
API Reference
useAgent(config)
Hook that connects your frontend to the backend AI service.
Config options:
| Key | Type | Required | Description |
| -------------------- | -------- | -------- | ------------------------------------------------ |
| baseUrl | string | ✅ | Backend endpoint to forward queries. |
| llmApiKey | string | ✅ | API key for your chosen LLM provider. |
| contentstackApiKey | string | ✅ | Contentstack API key used to fetch your content. |
| llmProvider | string | ✅ | LLM provider ("google" or "groq"). |
Note: Currently only two LLM providers are supported: `google` and `groq`.
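Switching providers only changes the `llmApiKey` and `llmProvider` fields. A minimal sketch (the `GROQ_API_KEY` environment variable name is an assumption — use whatever your setup defines):

```ts
import { useAgent } from "contentstack-agent-sdk";

// Inside a React component -- same config shape as above,
// with the key and provider swapped for Groq.
const { response, askStream } = useAgent({
  baseUrl: "http://localhost:8080/api/v1/ask",
  llmApiKey: process.env.GROQ_API_KEY!, // assumed env var name
  contentstackApiKey: process.env.NEXT_PUBLIC_CONTENTSTACK_API_KEY!,
  llmProvider: "groq",
});
```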
Returns:
| Variable | Type | Description |
| ------------- | ---------------------------------- | --------------------------------------- |
| response | string | AI-generated response. |
| loading | boolean | Whether the request is in progress. |
| error | Error \| null | Error object if the request fails. |
| askStream | (query: string) => Promise | Function to send user input. |
| isStreaming | boolean | True if AI response is still streaming. |