# Vezlo Assistant Server

Production-ready AI Assistant Server with advanced RAG (chunk-based semantic search + adjacent retrieval), conversation management, real-time communication, and human agent handoff.

```bash
npm install @vezlo/assistant-server
```

🚀 Production-ready Node.js/TypeScript API server for the Vezlo AI Assistant platform - complete backend APIs with advanced RAG (chunk-based semantic search + adjacent retrieval), Docker deployment, and database migrations.

📋 Changelog | 🐛 Report Issue | 💬 Discussions
## 🆕 What's New

- Database Schema: New `vezlo_knowledge_chunks` table and RPC functions
- Embedding Model: Upgraded to `text-embedding-3-large` (3072 dimensions)
- Migration: Automatic via `npm run migrate:latest` (migration 006)
- Rollback: Supported via `npm run migrate:rollback`

Upgrade Steps:

```bash
npm install @vezlo/assistant-server@latest
npm run migrate:latest
```

See CHANGELOG.md for the complete migration guide.
---
## ✨ Features

- Backend APIs - RESTful API endpoints for AI chat and knowledge management
- AI Response Validation - LLM-as-Judge validation with developer/user modes via `@vezlo/ai-validator`
- Real-time Communication - WebSocket support for live chat with Supabase Realtime broadcasting
- Human Agent Handoff - Agent join/leave workflows with realtime status updates and message synchronization
- Advanced RAG System - Chunk-based semantic search with adjacent retrieval using OpenAI `text-embedding-3-large` (3072 dims) and pgvector
- Conversation Management - Persistent conversation history with agent support
- Database Tools - Connect external Supabase databases for natural language data queries (see docs)
- Slack Integration - Direct query bot with full AI responses, conversation history, and reaction-based feedback (setup guide)
- Feedback System - Message rating and improvement tracking
- Database Migrations - Knex.js migration system for schema management
- Production Ready - Docker containerization with health checks
## 📦 Installation

### Install Globally

```bash
# Install globally
npm install -g @vezlo/assistant-server
```

### Install from Source

```bash
git clone https://github.com/vezlo/assistant-server.git
cd assistant-server
npm install
```

## 🏪 Vercel Marketplace Integration
🚀 Recommended for Vercel Users - Deploy with automated setup:

The Vercel Marketplace integration provides:
- Guided Configuration - Step-by-step setup wizard
- Automatic Environment Setup - No manual configuration needed
- Database Migration - Automatic table creation
- Production Optimization - Optimized for Vercel's serverless platform

Learn more about the marketplace integration →
## 🚀 Quick Start (Interactive Setup)

### Prerequisites
- Node.js 20+ and npm 9+
- Supabase project
- OpenAI API key

### Interactive Setup

The fastest way to get started is with our interactive setup wizard:
```bash
# If installed globally
vezlo-setup

# If installed locally
npx vezlo-setup

# Or if cloned from GitHub
npm run setup
```

The wizard will guide you through:
1. Supabase Configuration - URL, Service Role Key, DB host/port/name/user/password (with defaults)
2. OpenAI Configuration - API key, model, temperature, max tokens
3. Validation (non-blocking) - Tests Supabase API and DB connectivity
4. Migrations - Runs Knex migrations if DB validation passes; otherwise shows how to run them later
5. Environment - Generates `.env` (does not overwrite an existing file)
6. Default Data Seeding - Creates the default admin user and company
7. API Key Generation - Generates an API key for the default company

After setup completes, start the server:
```bash
vezlo-server
```

### Manual Setup
If you prefer manual configuration:
#### 1. Create Environment File
```bash
# Copy the example file
cp env.example .env

# Edit with your credentials
nano .env
```

#### 2. Configure Database
Get your Supabase credentials from:
- Dashboard → Settings → API
- Database → Settings → Connection string
```env
# Supabase Configuration
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
SUPABASE_SERVICE_KEY=your-service-role-key

# Database Configuration for Migrations
SUPABASE_DB_HOST=db.your-project.supabase.co
SUPABASE_DB_PORT=5432
SUPABASE_DB_NAME=postgres
SUPABASE_DB_USER=postgres
SUPABASE_DB_PASSWORD=your-database-password

# OpenAI Configuration
OPENAI_API_KEY=sk-your-api-key
AI_MODEL=gpt-4o

# Migration Security
MIGRATION_SECRET_KEY=your-secure-migration-key-here

# AI Response Validation (Optional)
AI_VALIDATION_ENABLED=false

# Developer Mode (Optional)
# true = Strict code grounding for technical queries
# false = User-friendly generic responses
DEVELOPER_MODE=false
```

#### 3. Run Database Migrations (Recommended)
```bash
# Using Knex migrations (primary method)
npm run migrate:latest

# Or via API after the server is running
curl "http://localhost:3000/api/migrate?key=$MIGRATION_SECRET_KEY"
```

#### 4. Create Default Admin & Generate API Key
```bash
# Create default admin user and company (if not exists)
npm run seed-default

# Seed AI settings for existing companies (optional, auto-created for new companies)
npm run seed-ai-settings

# Generate API key for library integration
npm run generate-key
```

Optional fallback (not recommended if using migrations):

```bash
# Run the raw SQL in Supabase Dashboard → SQL Editor
cat database-schema.sql
```

#### 5. Validate Setup
```bash
# Verify database connection and tables
vezlo-validate

# Or with npm
npm run validate
```

#### 6. Start Server
```bash
# If installed globally
vezlo-server

# If installed locally
npx vezlo-server

# Or from source
npm run build && npm start
```

## 🐳 Docker Setup
1. Copy the environment template and fill in your Supabase/OpenAI values:
```bash
cp env.example .env
# edit .env with your credentials before continuing
```
2. Build and start the stack:
```bash
docker-compose build
docker-compose up -d
```
The entrypoint runs migrations, seeds the default org/admin, and generates an API key automatically.
3. View container logs:
```bash
docker-compose logs -f vezlo-server
```

## ☁️ Vercel Deployment
Deploy to Vercel's serverless platform with multiple options. The Marketplace integration collects your credentials during configuration and sets environment variables automatically.
### Option 1: Marketplace Integration (Recommended)
🚀 Deploy via Vercel Marketplace - Automated setup with guided configuration:

Benefits:
- ✅ Guided Setup - Step-by-step configuration wizard
- ✅ Automatic Environment Variables - No manual env var configuration needed
- ✅ Database Migration - Automatic table creation and schema setup
- ✅ Production Ready - Optimized for Vercel's serverless platform
After Installation:
1. Run the migration URL: `https://your-project.vercel.app/api/migrate?key=YOUR_MIGRATION_SECRET`
2. Verify deployment: `https://your-project.vercel.app/health`
3. Access API docs: `https://your-project.vercel.app/docs`
### Option 2: One-Click Deploy

This will:
- Fork the repository to your GitHub
- Create a Vercel project
- Require marketplace integration setup
- Deploy automatically
### Option 3: Vercel CLI
```bash
# Install Vercel CLI
npm i -g vercel

# Deploy
vercel

# Follow the prompts to configure
```

### Requirements
1. Supabase project (URL, Service Role key, DB host/port/name/user/password)
2. OpenAI API key
3. If not using the Marketplace, add environment variables in Vercel project settings
4. Disable Vercel Deployment Protection if the API needs to be publicly accessible; otherwise Vercel shows its SSO page and the browser never reaches your server.
See docs/VERCEL_DEPLOYMENT.md for the detailed deployment guide.
## 🔧 Environment Configuration
Edit the `.env` file with your credentials:

```bash
# REQUIRED - Supabase Configuration
SUPABASE_URL=https://your-project-id.supabase.co
SUPABASE_SERVICE_KEY=your-service-role-key

# REQUIRED - Database Configuration for Knex.js Migrations
SUPABASE_DB_HOST=db.your-project.supabase.co
SUPABASE_DB_PORT=5432
SUPABASE_DB_NAME=postgres
SUPABASE_DB_USER=postgres
SUPABASE_DB_PASSWORD=your-database-password

# REQUIRED - OpenAI Configuration
OPENAI_API_KEY=sk-your-openai-api-key
AI_MODEL=gpt-4o
AI_TEMPERATURE=0.7
AI_MAX_TOKENS=1000

# REQUIRED - Database Migration Security
MIGRATION_SECRET_KEY=your-secure-migration-key-here

# REQUIRED - Authentication
JWT_SECRET=your-super-secret-jwt-key-here-change-this-in-production
DEFAULT_ADMIN_EMAIL=admin@vezlo.org
DEFAULT_ADMIN_PASSWORD=admin123

# OPTIONAL - Server Configuration
PORT=3000
NODE_ENV=production
LOG_LEVEL=info

# OPTIONAL - CORS Configuration
CORS_ORIGINS=http://localhost:3000,http://localhost:5173

# OPTIONAL - Swagger Base URL
BASE_URL=http://localhost:3000

# OPTIONAL - Rate Limiting
RATE_LIMIT_WINDOW=60000
RATE_LIMIT_MAX=100

# OPTIONAL - Organization Settings
ORGANIZATION_NAME=Vezlo
ASSISTANT_NAME=Vezlo Assistant

# OPTIONAL - Knowledge Base (uses text-embedding-3-large, 3072 dims)
CHUNK_SIZE=1000
CHUNK_OVERLAP=200
```
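`CHUNK_SIZE` and `CHUNK_OVERLAP` control how documents are split before embedding. As a rough illustration of what the two parameters mean (not the server's actual implementation, which may split on token or sentence boundaries), a character-based sliding window behaves like this:

```ts
// Illustration only: a character-based sliding window using CHUNK_SIZE and
// CHUNK_OVERLAP. The server's real chunker may split on token or sentence
// boundaries, so treat this as a sketch of the parameters' meaning.
function chunkText(text: string, size = 1000, overlap = 200): string[] {
  if (size <= overlap) throw new Error("size must exceed overlap");
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last window reached the end
  }
  return chunks;
}

// With the defaults, a 2,500-character document yields chunks covering
// 0-1000, 800-1800, and 1600-2500, each sharing 200 characters with its neighbor.
```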
## 🔧 CLI Commands

The package provides these command-line tools:
### vezlo-setup

Interactive setup wizard that guides you through configuration.

```bash
vezlo-setup
```

### vezlo-seed-default

Creates the default admin user and company.

```bash
vezlo-seed-default
```

### vezlo-seed-ai-settings

Seeds or updates AI settings for all existing companies with default values.

```bash
vezlo-seed-ai-settings
```

### vezlo-generate-key

Generates an API key for the default admin's company. The API key is used by the src-to-kb library.

```bash
vezlo-generate-key
```

### vezlo-validate

Validates the database connection and verifies that all tables exist.

```bash
vezlo-validate
```

### vezlo-server

Starts the API server.

```bash
vezlo-server
```

## 📚 API Documentation
### Base URL

```
http://localhost:3000/api
```

### Interactive Documentation

- Swagger UI: http://localhost:3000/docs
- Health Check: http://localhost:3000/health

### Core Endpoints
#### Conversations
- `POST /api/conversations` - Create new conversation (public widget endpoint)
- `GET /api/conversations` - List company conversations (agent dashboard)
- `GET /api/conversations/:uuid` - Get conversation with messages
- `DELETE /api/conversations/:uuid` - Delete conversation
- `POST /api/conversations/:uuid/join` - Agent joins a conversation
- `POST /api/conversations/:uuid/messages/agent` - Agent sends a message
- `POST /api/conversations/:uuid/close` - Agent closes a conversation

#### Messages
- `POST /api/conversations/:uuid/messages` - Create user message
- `POST /api/messages/:uuid/generate` - Generate AI response

#### Knowledge Base
- `POST /api/knowledge/items` - Create knowledge item (supports raw content, pre-chunked data, or chunks with embeddings)
- `GET /api/knowledge/items` - List knowledge items
- `GET /api/knowledge/items/:uuid` - Get knowledge item
- `PUT /api/knowledge/items/:uuid` - Update knowledge item
- `DELETE /api/knowledge/items/:uuid` - Delete knowledge item

Knowledge Ingestion Options:
- Raw Content: Send a `content` field; the server creates chunks and embeddings
- Pre-chunked: Send a `chunks` array with `hasEmbeddings: false`; the server generates embeddings
- Chunks + Embeddings: Send a `chunks` array with embeddings and `hasEmbeddings: true`; the server stores them directly (see the sketch below)
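To make the options concrete, here is a minimal sketch of a pre-chunked ingestion request. Field names other than `content`, `chunks`, and `hasEmbeddings` (for example `title` and the authorization header) are assumptions; consult the Swagger docs for the authoritative schema.

```ts
// Hedged sketch of the pre-chunked ingestion option. Field names other than
// `content`, `chunks`, and `hasEmbeddings` (e.g. `title`, the auth header)
// are assumptions; check /docs for the real schema.
async function createKnowledgeItem(apiKey: string): Promise<unknown> {
  const response = await fetch("http://localhost:3000/api/knowledge/items", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // assumed auth scheme
    },
    body: JSON.stringify({
      title: "Getting Started Guide", // hypothetical field
      chunks: [
        { content: "Install the package with npm..." },
        { content: "Configure your .env file..." },
      ],
      hasEmbeddings: false, // server will generate the embeddings
    }),
  });
  return response.json();
}
```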
#### Database Migrations

- `GET /api/migrate?key=` - Run pending database migrations
- `GET /api/migrate/status?key=` - Check migration status

Migration Workflow:
1. Create Migration: Use `npm run migrate:make migration_name` to create new migration files
2. Check Status: Use `/api/migrate/status` to see pending migrations
3. Run Migrations: Use `/api/migrate` to execute pending migrations remotely

Migration Endpoints Usage:
```bash
# Check migration status
curl "http://localhost:3000/api/migrate/status?key=your-migration-secret-key"

# Run pending migrations
curl "http://localhost:3000/api/migrate?key=your-migration-secret-key"
```

Required Environment Variable:
- `MIGRATION_SECRET_KEY` - Secret key for authenticating migration requests

Migration Creation Example:
```bash
# Create a new migration
npm run migrate:make add_users_table

# This creates: src/migrations/002_add_users_table.ts
# Edit the file to add your schema changes
# Then run via the endpoint or command line
```
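For reference, a Knex migration file exports `up` and `down` functions. The sketch below shows what a hypothetical `add_users_table` migration might contain; the table and columns are illustrative only, not part of the server's schema.

```ts
// Hypothetical contents of src/migrations/002_add_users_table.ts.
// Table and column names are illustrative; write your own schema changes here.
import type { Knex } from "knex";

export async function up(knex: Knex): Promise<void> {
  await knex.schema.createTable("users", (table) => {
    table.increments("id").primary();
    table.string("email").notNullable().unique();
    table.timestamps(true, true); // created_at / updated_at with defaults
  });
}

export async function down(knex: Knex): Promise<void> {
  await knex.schema.dropTableIfExists("users");
}
```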
#### Knowledge Search

- `POST /api/knowledge/search` - Search knowledge base

#### Feedback
- `POST /api/feedback` - Submit message feedback (Public API, see the sketch below)
- `DELETE /api/feedback/:uuid` - Delete/undo message feedback (Public API)
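As a quick illustration, a feedback submission might look like the following; the body fields (`message_uuid`, `rating`) are assumptions, so confirm the exact schema in the Swagger docs.

```ts
// Hedged sketch of submitting message feedback. The request body fields
// (`message_uuid`, `rating`) are assumptions; check /docs for the real schema.
async function submitFeedback(messageUuid: string): Promise<unknown> {
  const response = await fetch("http://localhost:3000/api/feedback", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      message_uuid: messageUuid, // hypothetical field name
      rating: "positive",        // hypothetical rating value
    }),
  });
  return response.json();
}
```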
#### API Keys (Admin Only)

- `POST /api/api-keys` - Generate or update company API key
- `GET /api/api-keys/status` - Check if an API key exists for the company

#### Team Management (Admin Only)
- `POST /api/companies/:companyUuid/team` - Create team member
- `GET /api/companies/:companyUuid/team` - List team members (with pagination and search)
- `PUT /api/companies/:companyUuid/team/:userUuid` - Update team member (name, role, status, password)
- `DELETE /api/companies/:companyUuid/team/:userUuid` - Remove team member

#### Account Settings
- `GET /api/account/profile` - Get current user's profile
- `PUT /api/account/profile` - Update current user's name and password

### WebSocket Events
- `join-conversation` - Join conversation room
- `conversation:message` - Real-time message updates (see the sketch below)
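The event names suggest a Socket.IO-style interface, so here is a minimal client sketch under that assumption; the payload shape (`{ conversationUuid }`) is hypothetical, and the server also broadcasts via Supabase Realtime, so verify the transport against your deployment.

```ts
// Minimal sketch assuming a Socket.IO-style client; the payload shape
// ({ conversationUuid }) is an assumption, not a documented contract.
import { io } from "socket.io-client";

const socket = io("http://localhost:3000");

// Join a conversation room to receive its updates
socket.emit("join-conversation", { conversationUuid: "abc123" });

// React to real-time message updates
socket.on("conversation:message", (message) => {
  console.log("New message:", message);
});
```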
## 🔄 Conversation 2-API Flow

The conversation system follows the industry-standard 2-API flow pattern for AI chat applications:
### Step 1: Create User Message

```bash
POST /api/conversations/{conversation-uuid}/messages
```

Purpose: Store the user's message in the conversation
Response: Returns the user message with its UUID

### Step 2: Generate AI Response
```bash
POST /api/messages/{message-uuid}/generate
```

Purpose: Generate the AI response based on the user message
Response: Returns the AI assistant's response

### Why This Pattern?
This pattern is the globally recognized standard because:

✅ Separation of Concerns
- User message storage is separate from AI generation
- Allows for message persistence even if AI generation fails
- Enables message history and conversation management

✅ Reliability & Error Handling
- User messages are saved immediately
- AI generation can be retried independently
- Partial failures don't lose user input

✅ Scalability
- AI generation can be queued/processed asynchronously
- Different rate limits for storage vs. generation
- Enables streaming responses and real-time updates

✅ Industry Standard
- Used by OpenAI, Anthropic, Google, and other major AI platforms
- Familiar pattern for developers
- Enables advanced features like message regeneration, threading, and branching
### Example Flow
```bash
# 1. User sends message
curl -X POST /api/conversations/abc123/messages \
  -d '{"content": "How do I integrate your API?"}'
# Response: {"uuid": "msg456", "content": "How do I integrate your API?", ...}

# 2. Generate AI response
curl -X POST /api/messages/msg456/generate \
  -d '{}'
# Response: {"uuid": "msg789", "content": "To integrate our API...", ...}
```
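For reference, here is the same two-step flow as a minimal TypeScript sketch. The endpoint paths come from the docs above; everything else (base URL, response fields beyond `uuid` and `content`) is an assumption.

```ts
// Minimal sketch of the 2-API flow with fetch. Endpoint paths match the docs
// above; the response shape beyond `uuid`/`content` is an assumption.
const BASE_URL = "http://localhost:3000";

async function askAssistant(conversationUuid: string, content: string) {
  // Step 1: store the user message
  const msgRes = await fetch(
    `${BASE_URL}/api/conversations/${conversationUuid}/messages`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ content }),
    }
  );
  const userMessage: { uuid: string } = await msgRes.json();

  // Step 2: generate the AI response for that message
  const genRes = await fetch(
    `${BASE_URL}/api/messages/${userMessage.uuid}/generate`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({}),
    }
  );
  return genRes.json(); // { uuid, content, ... }
}
```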
## 🗄️ Database Setup

### Migration Endpoints (Recommended)
Use the built-in migration endpoints to create and upgrade tables:
```bash
# Run pending migrations
curl "http://localhost:3000/api/migrate?key=your-migration-secret-key"

# Check migration status
curl "http://localhost:3000/api/migrate/status?key=your-migration-secret-key"
```

These endpoints execute Knex migrations and keep the schema versioned.
### Manual Setup (Fallback)
If you prefer manual setup, run the SQL schema in Supabase SQL Editor:
```bash
# View the schema SQL locally
cat database-schema.sql

# Copy into Supabase Dashboard → SQL Editor and execute
```

The `database-schema.sql` file contains all required tables and functions.

## 🐳 Docker Commands
```bash
# Start services
docker-compose up -d

# View logs
docker-compose logs -f vezlo-server

# Stop services
docker-compose down

# Rebuild and start
docker-compose up -d --build

# View running containers
docker-compose ps

# Access container shell
docker exec -it vezlo-server sh
```

## 🧪 Testing the API
### Health Check
```bash
curl http://localhost:3000/health
```

### Conversation Flow
```bash
# 1. Create conversation
CONV_UUID=$(curl -X POST http://localhost:3000/api/conversations \
  -H "Content-Type: application/json" \
  -d '{"title": "Test Conversation", "user_uuid": 12345, "company_uuid": 67890}' \
  | jq -r '.uuid')

# 2. Send user message
MSG_UUID=$(curl -X POST http://localhost:3000/api/conversations/$CONV_UUID/messages \
  -H "Content-Type: application/json" \
  -d '{"content": "Hello, how can you help me?"}' \
  | jq -r '.uuid')

# 3. Generate AI response
curl -X POST http://localhost:3000/api/messages/$MSG_UUID/generate \
  -H "Content-Type: application/json" \
  -d '{}'
```

### Knowledge Search
```bash
curl -X POST http://localhost:3000/api/knowledge/search \
-H "Content-Type: application/json" \
-d '{
"query": "How to use the API?",
"limit": 5,
"threshold": 0.7,
"type": "hybrid"
}'
```

## 🔧 Development
### Local Development
```bash
# Install dependencies
npm install

# Build TypeScript
npm run build

# Start server (Node)
npm start

# Or start via the CLI wrapper
npx vezlo-server

# Run tests
npm test
```

### Project Structure
```
vezlo/
├── docs/                  # Documentation
│   ├── DEVELOPER_GUIDELINES.md
│   └── MIGRATIONS.md
├── src/
│   ├── config/            # Configuration files
│   ├── controllers/       # API route handlers
│   ├── middleware/        # Express middleware
│   ├── schemas/           # API request/response schemas
│   ├── services/          # Business logic services
│   ├── storage/           # Database repositories
│   ├── types/             # TypeScript type definitions
│   ├── migrations/        # Database migrations
│   └── server.ts          # Main application entry
├── scripts/               # Utility scripts
├── Dockerfile             # Production container
├── docker-compose.yml     # Docker Compose configuration
├── knexfile.ts            # Database configuration
├── env.example            # Environment template
├── package.json           # Dependencies and scripts
└── tsconfig.json          # TypeScript configuration
```

## 🚀 Production Deployment
### Environment Checklist
Ensure all required environment variables are set:
- `SUPABASE_URL` and `SUPABASE_SERVICE_KEY` (required)
- `SUPABASE_DB_HOST`, `SUPABASE_DB_PASSWORD` (required for migrations)
- `OPENAI_API_KEY` (required)
- `MIGRATION_SECRET_KEY` (required for migration endpoints)
- `JWT_SECRET` (required for authentication)
- `DEFAULT_ADMIN_EMAIL` and `DEFAULT_ADMIN_PASSWORD` (required for initial setup)
- `NODE_ENV=production`
- `CORS_ORIGINS` (set to your domain)
- `BASE_URL` (optional, for a custom Swagger server URL)

### Docker Production
```bash
# Build the production image
docker build -t vezlo-server .

# Run the production container
docker run -d \
  --name vezlo-server \
  -p 3000:3000 \
  --env-file .env \
  vezlo-server
```

### Monitoring
- Health check endpoint: `/health`
- Docker health check configured
- Logs available in the `./logs/` directory

### Migration Management
```bash
# Check migration status
curl "https://your-domain.com/api/migrate/status?key=your-migration-secret-key"

# Run pending migrations
curl "https://your-domain.com/api/migrate?key=your-migration-secret-key"
```

## 🤝 Contributing
### How to Contribute
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/new-feature`
3. Make changes and test locally
4. Run tests: `npm test`
5. Commit: `git commit -m 'Add new feature'`
6. Push: `git push origin feature/new-feature`

## 📖 Documentation

- Developer Guidelines - Development workflow, coding standards, and best practices
- Database Migrations - Complete guide to the Knex.js migration system
- API Documentation - Interactive Swagger documentation (when running)
## 📄 License

This project is dual-licensed:
- Non-Commercial Use: Free under the AGPL-3.0 license
- Commercial Use: Requires a commercial license - contact us for details
---
Status: ✅ Production Ready | Version: 2.13.0 | Node.js: 20+ | TypeScript: 5+