Reusable MCP client component with AI chat interface
A complete, ready-to-use React component and backend client for MCP (Model Context Protocol) AI chat with OpenAI integration. Includes both the UI component and the OpenAI-powered backend logic.
- 🎨 Complete UI Component - Ready-to-use chat interface with streaming support
- 🤖 OpenAI Integration - Built-in OpenAI client with conversation management
- 🔧 API Helpers - Simple Next.js API route helpers for instant setup
- 🎨 Customizable Styling - CSS variables for easy theming
- 📦 All-in-One Package - No need to configure MCP clients separately
```bash
npm install @nqminds/mcp-client
```
Create a `.env.local` file:
```env
OPENAI_API_KEY=your_openai_api_key
MCP_SERVER_COMMAND="node /path/to/your/mcp-server/build/index.js"
OPENAI_MODEL=chatgpt-5-mini
```
Create `app/api/mcp/chat/route.ts`:
```typescript
import { createMCPChatHandler, createMCPClearHandler } from "@nqminds/mcp-client/server";

const chatHandler = createMCPChatHandler({
  openaiApiKey: process.env.OPENAI_API_KEY!,
  mcpServerCommand: process.env.MCP_SERVER_COMMAND!,
  openaiModel: process.env.OPENAI_MODEL,
});

const clearHandler = createMCPClearHandler();

export async function POST(req: Request) {
  return chatHandler(req);
}

export async function DELETE(req: Request) {
  return clearHandler(req);
}
```
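The `MCPChat` component talks to this route for you, but you can also call it yourself. A minimal sketch, assuming the chat handler accepts a JSON body with a `message` field (mirroring the custom route example under Advanced Usage) and that `DELETE` clears the conversation:
```typescript
// Illustrative only: the exact request/response shape is an assumption.
const res = await fetch("/api/mcp/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ message: "Hello!" }),
});
console.log(await res.text()); // the body may be streamed; read here as plain text

// Reset the conversation handled by the clear handler.
await fetch("/api/mcp/chat", { method: "DELETE" });
```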
Use the component in a page:
```tsx
import { MCPChat } from '@nqminds/mcp-client';
import '@nqminds/mcp-client/dist/styles/MCPChat.css';

export default function Page() {
  return (
    <MCPChat />
  );
}
```
Wherever you render the component, make sure the stylesheet is imported:
```tsx
import '@nqminds/mcp-client/dist/styles/MCPChat.css';
```
That's it! Your MCP chat is ready to use.
Component Props
```typescript
interface MCPChatProps {
  companyNumber?: string;             // Optional context
  apiEndpoint?: string;               // Default: "/api/mcp/chat"
  customStyles?: React.CSSProperties; // CSS variable overrides
  className?: string;                 // Additional CSS class
}
```
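For example (the prop values below are placeholders):
```tsx
import { MCPChat } from '@nqminds/mcp-client';

// All props are optional; apiEndpoint defaults to "/api/mcp/chat",
// matching the route created above.
export function ExampleChat() {
  return (
    <MCPChat
      companyNumber="01234567"
      apiEndpoint="/api/mcp/chat"
      className="my-chat"
    />
  );
}
```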
Custom Styling
Override CSS variables:
```tsx
const customStyles = {
  '--mcp-primary-color': '#7c5cff',
  '--mcp-border-radius': '24px',
} as React.CSSProperties;
```
Available CSS variables (passing them to the component is shown after the list):
- --mcp-primary-color
- --mcp-bg
- --mcp-card-bg
- --mcp-text
- --mcp-text-secondary
- --mcp-border
- --mcp-border-radius
- --mcp-spacing
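To apply the overrides, pass the object from the snippet above through the `customStyles` prop (the `className` value here is just illustrative):
```tsx
<MCPChat customStyles={customStyles} className="branded-chat" />
```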
Advanced Usage
If you need more control, use the client directly (server-side only):
```typescript
import { MCPClientOpenAI } from '@nqminds/mcp-client/server';

const client = new MCPClientOpenAI({
  openaiApiKey: process.env.OPENAI_API_KEY!,
  mcpServerCommand: process.env.MCP_SERVER_COMMAND!,
  openaiModel: "chatgpt-5-mini",
});

await client.connect();

const response = await client.processQuery("Hello!", (thinking) => {
  console.log(thinking);
});
console.log(response);

await client.cleanup();
```
Create your own streaming handler (server-side):
```typescript
import { MCPClientOpenAI } from '@nqminds/mcp-client/server';

export async function POST(req: Request) {
  const { message } = await req.json();

  const client = new MCPClientOpenAI({
    openaiApiKey: process.env.OPENAI_API_KEY!,
    mcpServerCommand: process.env.MCP_SERVER_COMMAND!,
  });

  await client.connect();
  const response = await client.processQuery(message, (thinking) => {
    // Handle thinking steps
  });
  await client.cleanup();

  return Response.json({ response });
}
```
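The handler above returns a single JSON payload once the query finishes. If you want to push the thinking steps to the browser as they arrive, one possible approach is to wrap `processQuery` in a `ReadableStream`. This is a sketch built on the same `MCPClientOpenAI` calls shown above, not an API provided by the package:
```typescript
import { MCPClientOpenAI } from '@nqminds/mcp-client/server';

export async function POST(req: Request) {
  const { message } = await req.json();
  const client = new MCPClientOpenAI({
    openaiApiKey: process.env.OPENAI_API_KEY!,
    mcpServerCommand: process.env.MCP_SERVER_COMMAND!,
  });
  await client.connect();

  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      try {
        // Emit each thinking step as a line of text, then the final answer.
        const answer = await client.processQuery(message, (thinking) => {
          controller.enqueue(encoder.encode(`thinking: ${thinking}\n`));
        });
        controller.enqueue(encoder.encode(`answer: ${answer}\n`));
      } finally {
        await client.cleanup();
        controller.close();
      }
    },
  });

  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```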
API Reference
See `EXAMPLES.md` for detailed examples.
Development
```bash
# Build with automatic patch version bump (1.0.0 → 1.0.1)
npm run build

# Build without version bump (development)
npm run build:no-version

# Bump minor version (1.0.0 → 1.1.0) for new features
npm run version:minor

# Bump major version (1.0.0 → 2.0.0) for breaking changes
npm run version:major

# Create release tarball
npm run release
```
Versioning: Every `npm run build` automatically increments the patch version.