# @302ai/studio-plugin-sdk

Plugin SDK for 302.AI Studio - build powerful AI provider plugins for the 302.AI Studio chat application with ease.


The 302.AI Studio Plugin SDK allows developers to create custom AI provider plugins that integrate seamlessly with the 302.AI Studio desktop application. Build plugins to add support for new AI providers, customize message processing, or extend functionality with hooks.
## Features

- **Type-Safe API** - Full TypeScript support with comprehensive type definitions
- **BaseProviderPlugin** - Abstract base class with common utilities
- **Hook System** - Intercept and modify messages, responses, and errors
- **Storage API** - Persist plugin configuration and data
- **HTTP Client** - Built-in authenticated HTTP requests
- **UI Integration** - Show notifications, dialogs, and custom components
- **Logging** - Structured logging with plugin context
- **i18n Support** - Built-in internationalization capabilities
## Installation

```bash
# Using npm
npm install @302ai/studio-plugin-sdk
```

## Quick Start

### 1. Create Your Plugin Class
```typescript
import {
  BaseProviderPlugin,
  type Model,
  type ModelProvider,
} from "@302ai/studio-plugin-sdk";

export class MyProviderPlugin extends BaseProviderPlugin {
  protected providerId = "my-provider";
  protected providerName = "My AI Provider";
  protected apiType = "openai";
  protected defaultBaseUrl = "https://api.example.com/v1";
  protected websites = {
    official: "https://example.com",
    apiKey: "https://example.com/api-keys",
    docs: "https://docs.example.com",
    models: "https://docs.example.com/models",
  };

  async onFetchModels(provider: ModelProvider): Promise<Model[]> {
    const url = this.buildApiUrl(provider, "models");
    const response = await this.httpRequest<{ data: any[] }>(url, {
      method: "GET",
      provider,
    });

    return response.data.map((model) => ({
      id: model.id,
      name: model.name,
      remark: `${this.providerName} ${model.id}`,
      providerId: this.providerId,
      capabilities: this.parseModelCapabilities(model.id),
      type: "language",
      custom: false,
      enabled: true,
      collected: false,
    }));
  }
}

export default MyProviderPlugin;
```
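For intuition, `buildApiUrl` above presumably joins the provider's base URL with an endpoint path. The sketch below shows that assumed behavior as a standalone function; the SDK's actual implementation may differ.

```typescript
// Sketch of a base-URL/endpoint join, assuming the SDK's buildApiUrl
// behaves roughly like this; it normalizes duplicate slashes at the seam.
function buildApiUrl(baseUrl: string, endpoint: string): string {
  return `${baseUrl.replace(/\/+$/, "")}/${endpoint.replace(/^\/+/, "")}`;
}

// buildApiUrl("https://api.example.com/v1", "models")
// → "https://api.example.com/v1/models"
```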
### 2. Create Plugin Manifest

Create a `plugin.json` file in your plugin directory:

```json
{
  "id": "com.example.my-provider",
  "name": "My AI Provider",
  "version": "1.0.0",
  "type": "provider",
  "description": "Integration with My AI Provider API",
  "author": "Your Name",
  "permissions": ["network", "storage"],
  "compatibleVersion": ">=1.0.0",
  "main": "main/index.js",
  "builtin": false,
  "configSchema": {
    "type": "object",
    "properties": {
      "apiKey": {
        "type": "string",
        "title": "API Key",
        "description": "Your API key for authentication"
      }
    },
    "required": ["apiKey"]
  }
}
```

## Core Concepts

### BaseProviderPlugin
The `BaseProviderPlugin` abstract class provides:

- **Authentication** - Default API key validation
- **HTTP Utilities** - Authenticated requests with proper headers
- **Model Parsing** - Capability and type detection
- **Error Handling** - Common error scenarios (401, 429, timeout)
- **Logging & Notifications** - Built-in logging and user notifications

**Required Methods:**

- `onFetchModels(provider: ModelProvider): Promise<Model[]>` - Fetch available models

**Optional Overrides:**

- `getIconUrl()` - Custom provider icon
- `getConfigSchema()` - Custom configuration schema
- `testConnection(provider)` - Connection validation
- `getAuthHeaders(provider)` - Custom authentication headers
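As an example of one of these overrides, a `testConnection` implementation could simply reuse the model-listing endpoint. The sketch below expresses that idea as a standalone function with an injected request callback; the names and signature are illustrative, not the SDK's.

```typescript
// Illustrative connection test (not the SDK's actual signature):
// credentials are considered valid if the provider's models endpoint
// responds without throwing.
async function testConnection(
  modelsUrl: string,
  request: (url: string) => Promise<unknown>,
): Promise<boolean> {
  try {
    await request(modelsUrl);
    return true;
  } catch {
    return false;
  }
}
```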
### Hooks

Plugins can implement hooks to customize behavior:
#### onBeforeSendMessage
Modify messages before sending to the AI:
```typescript
async onBeforeSendMessage(context: MessageContext): Promise<MessageContext> {
  // Add a system prompt
  context.messages.unshift({
    role: "system",
    content: "You are a helpful assistant.",
  });
  return context;
}
```

#### onAfterSendMessage
Process responses after receiving:
```typescript
async onAfterSendMessage(context: MessageContext, response: AIResponse): Promise<void> {
  this.log("info", `Used ${response.usage?.totalTokens} tokens`);
}
```

#### onStreamChunk
Modify streaming response chunks:
```typescript
async onStreamChunk(chunk: StreamChunk): Promise<StreamChunk> {
  if (chunk.type === "text" && chunk.text) {
    chunk.text = chunk.text.toUpperCase(); // Example modification
  }
  return chunk;
}
```

#### onError
Handle errors with retry logic:
```typescript
async onError(context: ErrorContext): Promise<{
  handled: boolean;
  retry?: boolean;
  retryDelay?: number;
  message?: string;
}> {
  if (context.error.message.includes("429")) {
    return {
      handled: true,
      retry: true,
      retryDelay: 5000,
      message: "Rate limit exceeded. Retrying in 5 seconds...",
    };
  }
  return { handled: false };
}
```
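The fixed 5-second delay above can be generalized to exponential backoff for repeated rate-limit errors. The helper below is an illustrative sketch, not an SDK API; its result could be returned as `retryDelay` from `onError`.

```typescript
// Compute a capped exponential backoff delay (in ms) for the Nth retry.
// baseMs and maxMs are illustrative defaults, not SDK constants.
function backoffDelay(attempt: number, baseMs = 1000, maxMs = 30000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// attempt 0 → 1000 ms, attempt 2 → 4000 ms, attempt 10 → capped at 30000 ms
```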
### Plugin API

The `PluginAPI` is injected during initialization:

#### Storage
```typescript
// Configuration (visible in UI)
await this.api.storage.setConfig("apiKey", "sk-...");
const apiKey = await this.api.storage.getConfig("apiKey");

// Private data (not visible in UI)
await this.api.storage.setData("cache", { timestamp: Date.now() });
const cache = await this.api.storage.getData("cache");
```
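A common use of the private data store is caching fetched models. The sketch below wraps `getData`/`setData` behind a TTL check; the `StorageLike` interface and the `{ timestamp, value }` cache shape are assumptions for illustration, not part of the SDK.

```typescript
// Minimal stand-in for the SDK storage API (assumed shape for this sketch).
interface StorageLike {
  getData(key: string): Promise<unknown>;
  setData(key: string, value: unknown): Promise<void>;
}

// Return the cached value if it is younger than ttlMs; otherwise refetch
// and store it alongside a fresh timestamp.
async function cached<T>(
  storage: StorageLike,
  key: string,
  ttlMs: number,
  fetchValue: () => Promise<T>,
): Promise<T> {
  const hit = (await storage.getData(key)) as
    | { timestamp: number; value: T }
    | undefined;
  if (hit && Date.now() - hit.timestamp < ttlMs) {
    return hit.value;
  }
  const value = await fetchValue();
  await storage.setData(key, { timestamp: Date.now(), value });
  return value;
}
```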
#### HTTP Client

```typescript
// GET request
const models = await this.api.http.get("https://api.example.com/models");

// POST request with a body
const result = await this.api.http.post("https://api.example.com/chat", {
  messages: [...],
});
```

#### UI Integration
```typescript
// Show a notification
this.api.ui.showNotification("Model loaded successfully", "success");

// Show a dialog
const result = await this.api.ui.showDialog({
  title: "Confirm Action",
  message: "Are you sure?",
  type: "question",
  buttons: ["Yes", "No"],
});
```

#### Logging
```typescript
this.api.logger.debug("Debug information");
this.api.logger.info("Informational message");
this.api.logger.warn("Warning message");
this.api.logger.error("Error message");
```

## Advanced Usage
### Custom Authentication Headers

Override `getAuthHeaders` for custom auth:

```typescript
protected getAuthHeaders(provider: ModelProvider): Record<string, string> {
  return {
    "X-API-Key": provider.apiKey,
    "X-Custom-Header": "value",
  };
}
```
### Custom Capability Parsing

Customize model capability parsing:
```typescript
protected parseModelCapabilities(modelId: string): Set<string> {
  const capabilities = new Set<string>();

  if (modelId.includes("vision")) {
    capabilities.add("vision");
  }
  if (modelId.includes("function")) {
    capabilities.add("function_call");
  }
  return capabilities;
}
```
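The substring-based detection above can be sanity-checked as a standalone function; in this sketch plain strings stand in for whatever capability type the SDK actually uses.

```typescript
// Standalone version of the capability detection shown above.
// Plain strings stand in for the SDK's capability type (an assumption).
function detectCapabilities(modelId: string): Set<string> {
  const capabilities = new Set<string>();
  if (modelId.includes("vision")) capabilities.add("vision");
  if (modelId.includes("function")) capabilities.add("function_call");
  return capabilities;
}

// detectCapabilities("my-vision-function-model")
// → Set { "vision", "function_call" }
```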
### Implementing ProviderPlugin Directly

For full control, implement `ProviderPlugin` directly:

```typescript
import type { ProviderPlugin, PluginAPI } from "@302ai/studio-plugin-sdk";

export class CustomPlugin implements ProviderPlugin {
  api?: PluginAPI;

  async initialize(api: PluginAPI): Promise<void> {
    this.api = api;
  }

  getProviderDefinition() {
    return {
      id: "custom",
      name: "Custom Provider",
      // ... other properties
    };
  }

  async onAuthenticate(context) {
    // Custom auth logic
  }

  async onFetchModels(provider) {
    // Custom model fetching
  }
}
```

## Type Reference
### Core Types
- `Model` - AI model definition
- `ModelProvider` - Provider configuration
- `ChatMessage` - Chat message structure
- `PluginMetadata` - Plugin metadata from `plugin.json`

### Hook Types
- `MessageContext` - Message hook context
- `StreamChunk` - Streaming response chunk
- `AIResponse` - Complete AI response
- `ErrorContext` - Error hook context
- `AuthContext` - Authentication hook context

### API Types
- `PluginAPI` - Main plugin API
- `PluginStorageAPI` - Storage operations
- `PluginHttpAPI` - HTTP client
- `PluginUIAPI` - UI operations
- `PluginLoggerAPI` - Logging utilities

## Examples
Check the `plugins/builtin/` directory in the main repository for complete examples:

- **OpenAI Plugin** - Standard OpenAI API integration
- **Anthropic Plugin** - Claude models with custom headers
- **Google Plugin** - Gemini models with custom parsing
- **Debug Plugin** - Full hook implementation example
## Publishing Your Plugin

### 1. Package Structure

```
my-plugin/
├── plugin.json       # Plugin metadata
├── main/
│   └── index.ts      # Main plugin code
├── package.json      # npm package config
└── README.md         # Plugin documentation
```

### 2. Build Configuration
```json
{
  "scripts": {
    "build": "tsc && cp plugin.json dist/"
  }
}
```

### 3. Publish to npm
```bash
npm publish --access public
```

Users can then install your plugin via URL in 302.AI Studio.
## Best Practices

1. **Use TypeScript** - Full type safety and autocomplete
2. **Test Thoroughly** - Test authentication, model fetching, and message sending
3. **Handle Errors** - Implement proper error handling and retry logic
4. **Log Appropriately** - Use appropriate log levels for debugging
5. **Document Config** - Provide a clear configuration schema and sensible defaults
6. **Version Compatibility** - Specify compatible app versions in `plugin.json`
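For intuition on point 6, a `compatibleVersion` range like `>=1.0.0` expresses a minimum host version. The sketch below is a deliberately simplified `>=` check; the host application's real matching logic may differ, and plugins should rely on it rather than reimplementing this.

```typescript
// Simplified ">=major.minor.patch" range check (illustrative only;
// not the host app's actual compatibility logic).
function satisfiesMinVersion(version: string, range: string): boolean {
  const min = range.replace(/^>=/, "").split(".").map(Number);
  const cur = version.split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    if ((cur[i] ?? 0) > (min[i] ?? 0)) return true;
    if ((cur[i] ?? 0) < (min[i] ?? 0)) return false;
  }
  return true; // versions are equal
}
```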
## Versioning

This SDK follows semantic versioning. The API is stable for v1.x releases.

## License

MIT License - see the LICENSE file for details.
## Support

- Documentation
- Issue Tracker
- Discussions
## Contributing

Contributions are welcome! Please read our contributing guidelines before submitting PRs.
---
Built with ❤️ by 302.AI