# SDK for building Auto-Builder workflow plugins

A lightweight TypeScript toolkit for authoring custom Auto-Builder workflow nodes (plugins).

```bash
npm install auto-builder-sdk
```
* **Sandbox-first execution**: nodes run inside a VM2 sandbox by default.
* **Tiny surface**: just extend `BaseNodeExecutor` and export it.
* **Built-in telemetry** and test/coverage enforcement.
* **Scaffolding**: `npx auto-builder-sdk init` scaffolds a ready-to-run plugin.
* **Database service integration**: access databases through dependency injection without importing drivers directly.
* **Standard pagination utilities**: comprehensive pagination support for consistent API and database interactions.
## Quick start

```bash
# create plugin folder & scaffold files
npx auto-builder-sdk init my-awesome-plugin
cd my-awesome-plugin
```
## Installation

```bash
npm install auto-builder-sdk
```

## Anatomy of a plugin

```
my-awesome-plugin/
├── src/
│   └── index.ts       # exports your node classes
├── package.json       # flagged with "auto-builder-plugin": true
└── tsconfig.json
```

---
## Plugin manifest (`definePlugin`) & node metadata

Each SDK project exports one default plugin manifest created with `definePlugin()`. The object is validated at build time by Zod, so you can't accidentally publish an invalid plugin.

```ts
export default definePlugin({
name: 'acme-erp', // npm name or folder name
version: '1.2.3',
main: './dist/index.js', // compiled entry file
nodes: ['CreateOrder', 'Ping'],
engines: { 'auto-builder': '^1.0.0' },
sandbox: { timeoutMs: 20_000 }, // optional overrides
});
```

Inside a node you can expose extra metadata used by the builder UI:

```ts
static readonly definition = {
displayName: 'Create Order', // shown in node palette
icon: 'shopping-cart', // lucide icon name
group: ['action'],
version: 1,
description: 'Create a new order in Acme ERP',
category: 'action',
inputs: ['main'], outputs: ['main'],
  properties: [ /* … */ ],
};
```

Only `properties` is mandatory; everything else falls back to reasonable defaults if omitted.
## Writing your first node

```ts
import { BaseNodeExecutor, definePlugin } from 'auto-builder-sdk';

export class HelloNode extends BaseNodeExecutor {
static type = 'hello.world'; // machine id (shown in UI)
readonly nodeType = HelloNode.type;
async execute(node, input, ctx) {
return [
      { json: { message: `Hello ${ctx.workflowId}!` }, binary: {} },
];
}
}

export default definePlugin({
name: 'my-awesome-plugin',
version: '0.0.1',
main: './dist/index.js',
nodes: ['HelloNode'],
engines: { 'auto-builder': '^1.0.0' },
});
```

## Dynamic dropdowns (`loadOptionsMethod`)

Auto-Builder supports dynamic dropdowns in the property inspector. Define them by adding `typeOptions.loadOptionsMethod` to a property and implementing a static (or instance) method on your executor class.

```ts
export class HelloNode extends BaseNodeExecutor {
static readonly type = 'hello.world';
  readonly nodeType = HelloNode.type;

  // 1. Loader method: can be async and may use credentials/params
  static async listGreetings(): Promise<Array<{ name: string; value: string }>> {
return [
{ name: 'Hi', value: 'hi' },
{ name: 'Hello', value: 'hello' },
{ name: 'Howdy', value: 'howdy' },
];
}
  // 2. Reference the method in the property definition
static readonly definition = {
inputs: ['main'], outputs: ['main'],
properties: [
{
displayName: 'Greeting',
name: 'greeting',
type: 'options',
default: 'hello',
typeOptions: {
loadOptionsMethod: 'listGreetings',
},
},
],
} as const;
}
```

The engine discovers the method automatically: **no server-side changes required**.
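The lookup-by-name mechanism can be illustrated with a small standalone sketch (an illustration of the idea, not the engine's real implementation; `loadOptions` is a hypothetical helper, and `listGreetings` mirrors the example above):

```typescript
// Standalone sketch: how a loader referenced by name in
// `typeOptions.loadOptionsMethod` can be discovered and invoked.

type OptionItem = { name: string; value: string };

class GreetingsExecutor {
  static async listGreetings(): Promise<OptionItem[]> {
    return [
      { name: 'Hi', value: 'hi' },
      { name: 'Hello', value: 'hello' },
      { name: 'Howdy', value: 'howdy' },
    ];
  }
}

// Look the method up purely by the name given in `loadOptionsMethod`:
function loadOptions(executor: object, methodName: string): Promise<OptionItem[]> {
  const fn = (executor as Record<string, unknown>)[methodName];
  if (typeof fn !== 'function') throw new Error(`No loader named ${methodName}`);
  return (fn as () => Promise<OptionItem[]>)();
}

loadOptions(GreetingsExecutor, 'listGreetings').then((options) => {
  console.log(options.map((o) => o.value)); // [ 'hi', 'hello', 'howdy' ]
});
```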
---
## Debugging a plugin node

1. Compile with source maps (`"sourceMap": true, "inlineSources": true` in `tsconfig.json`).
2. Start the backend in debug/watch mode:

```bash
NODE_OPTIONS="--enable-source-maps --inspect=9229" \
PLUGINS_ENABLED=true \
PLUGIN_WATCH=true \
npm run dev
```
3. Attach VS Code (Run → "Node.js attach" → `localhost:9229`). Set breakpoints in your TypeScript sources; thanks to source maps they will hit inside the sandbox.
4. Disable the sandbox temporarily (faster debugging):
   * global: `PLUGIN_SAFE_MODE=false npm run dev`
   * per plugin: add `{ "sandbox": { "enabled": false } }` in `definePlugin()`.

---
## Logging & error handling

```ts
import { log, NodeOperationError, NodeApiError } from 'auto-builder-sdk';

log.info('Fetching data', { url });
if (!apiKey) {
throw new NodeOperationError(node, 'Missing API key');
}
```

* When the plugin runs inside Auto-Builder, the SDK logger proxies to the main Winston logger, so your messages appear in the service log files and monitoring dashboards. In stand-alone tests the logger falls back to `console`.
* Throwing either of the SDK error classes lets the engine classify the failure (operation vs API), but it is not mandatory: any `Error` works.

---
## Output formatting with `createFormattedResult`

The SDK provides `createFormattedResult` and `createFormattedErrorResult` helpers that automatically format node outputs with proper namespacing and metadata tracking.

### Why use it?
* **Data lineage tracking**: every result includes an `internal_node_reference` field containing the node ID for debugging and tracking data flow
* **Prevents key conflicts**: the namespaced format prevents data collisions when multiple nodes of the same type exist
* **Preserves input data**: all data from previous nodes is preserved in the output
* **Consistent metadata**: automatically adds operation details, timestamps, and execution context

### Usage
```ts
import { BaseNodeExecutor, createFormattedResult, createFormattedErrorResult } from 'auto-builder-sdk';

export class MyApiNode extends BaseNodeExecutor {
static readonly type = 'myapi.fetch';
readonly nodeType = MyApiNode.type;
async execute(node, inputData, context) {
try {
// Your node logic here
const result = await fetchSomeData();
// Format the result - automatically adds internal_node_reference
return inputData.map((item, index) =>
createFormattedResult(item, result, context, index)
);
} catch (error) {
// Format error results - also includes internal_node_reference
return inputData.map((item, index) =>
createFormattedErrorResult(item, error as Error, context, index)
);
}
}
}
```

### Output shape

Success result:

```json
{
"json": {
"previousNodeData": "preserved",
"myapi_fetch": {
"data": { "result": "from API" },
"internal_node_reference": "node_abc123",
"_metadata": {
"operation": "fetch",
"timestamp": "2025-10-24T10:30:00.000Z",
"nodeId": "node_abc123",
"nodeType": "myapi.fetch",
"dataItemIndex": 0,
"workflowNodePosition": 1
}
}
},
"binary": {},
"pairedItem": { "item": 0 }
}
```

Error result:

```json
{
"json": {
"previousNodeData": "preserved",
"myapi_fetch": {
"error": "API request failed",
"success": false,
"internal_node_reference": "node_abc123",
"_metadata": {
"operation": "fetch",
"failed": true,
"timestamp": "2025-10-24T10:30:00.000Z",
"nodeId": "node_abc123",
"nodeType": "myapi.fetch",
"dataItemIndex": 0,
"workflowNodePosition": 1
}
}
},
"binary": {},
"pairedItem": { "item": 0 }
}
```
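As the samples above suggest, the namespaced key appears to be derived from the node type (`myapi.fetch` becomes the key `myapi_fetch`); a hypothetical sketch of that mapping (an assumption, not the SDK's documented behaviour):

```typescript
// Illustrative sketch (an assumption): the namespaced output key seems to be
// the node type with dots replaced by underscores, as in the JSON above.

function namespaceKey(nodeType: string): string {
  return nodeType.replace(/\./g, '_');
}

console.log(namespaceKey('myapi.fetch')); // 'myapi_fetch'
```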
### Migration options

```ts
// Legacy mode (flat structure, backward compatible)
createFormattedResult(item, result, context, index, { legacyMode: true });

// Include both namespaced and flat formats for gradual migration
createFormattedResult(item, result, context, index, { includeBothFormats: true });
```

### Benefits

* **Debugging**: use `internal_node_reference` to trace data back to specific node executions
* **Multiple instances**: handles multiple nodes of the same type (e.g. `myapi_fetch_1`, `myapi_fetch_2`)
* **Metadata tracking**: automatic timestamps, operation names, and execution context
* **Error handling**: consistent error structure with node tracking
* **Data lineage**: complete chain of which nodes processed the data

---
## Importing engine types

All public interfaces are bundled with the SDK; there is no need to install the `auto-builder` package:

```ts
import type { INode, IExecutionContext, INodeExecutionData } from 'auto-builder-sdk';
```

The file lives at `auto-builder-sdk/dist/auto-builder-sdk/src/ab-types.d.ts` and is kept in sync with the backend on every release.

---
## Credentials

> Added in SDK 0.1.x; no engine changes required.
Nodes can declare the kind of credential(s) they need via the **credential
registry**. A credential definition is just a JSON-ish object that describes the
fields users must enter and an optional
`validate()` function that performs a live API check.

```ts
import { registerCredential, type CredentialDefinition } from 'auto-builder-sdk';

const jiraApiToken: CredentialDefinition = {
name: 'jira', // identifier referenced by nodes
displayName: 'Jira Cloud',
properties: [
{ name: 'domain', displayName: 'Domain', type: 'string', required: true },
{ name: 'email', displayName: 'Email', type: 'string', required: true },
    { name: 'apiToken', displayName: 'API Token', type: 'password', required: true },
],
validate: async (data) => {
    const { domain, email, apiToken } = data as Record<string, string>;
    const auth = 'Basic ' + Buffer.from(`${email}:${apiToken}`).toString('base64');
    const res = await fetch(`${domain}/rest/api/3/myself`, { headers: { Authorization: auth } });
    if (!res.ok) throw new Error(`Auth failed (${res.status})`);
  },
};

registerCredential(jiraApiToken);
```

Expose it on your node:

```ts
static readonly definition = {
// β¦
credentials: [ { name: 'jira', required: true } ],
};
```

Multiple schemes? Just register multiple definitions (e.g. `jiraOAuth2`, `jiraBasic`) and list them all in `credentials:`; the builder UI will let users pick the type they want.

When the node runs you retrieve whatever credential the user selected:
```ts
const creds = await this.getCredentials(node.credentials.jira);
console.log(creds.type); // "jira", "jiraOAuth2", …
```

No core- or UI-level changes are needed; adding a new credential definition is as simple as shipping the file with your plugin.

## Testing nodes
```ts
import { expect, it } from 'vitest';
import { HelloNode, makeStubContext, makeStubNode } from '../src';

it('returns greeting', async () => {
const nodeImpl = new HelloNode();
const ctx = makeStubContext({ workflowId: 'wf-123' });
const nodeDef = makeStubNode(HelloNode.type);
const res = await nodeImpl.execute(nodeDef, [], ctx);
expect(res[0].json.message).toBe('Hello wf-123!');
});
```

### Test helpers

The SDK exposes two utilities to remove boilerplate when writing tests:

```ts
import { makeStubContext, makeStubNode } from 'auto-builder-sdk';

const ctx = makeStubContext();
const nodeDef = makeStubNode('my.node');
```

Both helpers accept a partial override so you can customise only the fields you care about.
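A standalone sketch of that defaults-plus-override merge (the real helpers ship with `auto-builder-sdk`; the fields and defaults here are illustrative):

```typescript
// Illustrative sketch of the partial-override pattern the stub helpers follow.

interface StubContext {
  executionId: string;
  workflowId: string;
  runIndex: number;
}

const defaults: StubContext = { executionId: 'EX000', workflowId: 'wf-stub', runIndex: 0 };

function makeStubContextSketch(override: Partial<StubContext> = {}): StubContext {
  // Spread defaults first so caller-supplied fields win
  return { ...defaults, ...override };
}

const ctx = makeStubContextSketch({ workflowId: 'wf-123' });
console.log(ctx.workflowId); // 'wf-123' (overridden)
console.log(ctx.runIndex);   // 0 (default preserved)
```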
### Coverage

Every scaffold includes a `vitest.config.ts` and the matching `@vitest/coverage-v8` dev dependency. Two npm scripts are generated:

```json
"test": "vitest", // watch mode β fast dev loop
"verify": "vitest run --coverage" // used by prepublishOnly & CI
```

## Security & sandboxing
* The Auto-Builder engine executes each plugin in a VM2 sandbox, limited by:
  * Timeout (default 30 000 ms)
  * Memory (default 64 MB)
* Per-plugin overrides via the `sandbox` field:

```json
"sandbox": { "timeoutMs": 10000, "memoryMb": 32 }
```

* The global flag `PLUGIN_SAFE_MODE=false` (engine env) disables the sandbox (dev only).

## Database access
The SDK provides two ways to access databases from your plugins:
#### 1. Shared Prisma Client
The SDK exposes `getDb()`, which returns the same `PrismaClient` instance the Auto-Builder backend already uses. That means plugin nodes can run SQL without opening their own connection pools or adding the `@prisma/client` dependency.

```ts
import { getDb, BaseNodeExecutor } from 'auto-builder-sdk';

export class ListUsersNode extends BaseNodeExecutor {
static readonly type = 'db.users.list';
readonly nodeType = ListUsersNode.type;
async execute() {
const db = getDb(); // shared PrismaClient
const rows = await db.user.findMany();
return [{ json: rows, binary: {} }];
}
}
```

#### 2. Database Service (Recommended)

New in SDK 1.0.12: the Auto-Builder engine provides a comprehensive database service that supports multiple database types through dependency injection. This is the recommended approach, as it provides better security, connection pooling, and centralized management.

```ts
import { getDatabaseService, BaseNodeExecutor, registerCredential } from 'auto-builder-sdk';

// Register database credentials
registerCredential({
name: 'postgres', displayName: 'PostgreSQL',
properties: [
{ name: 'host', displayName: 'Host', type: 'string', required: true },
{ name: 'port', displayName: 'Port', type: 'number', required: true, default: 5432 },
{ name: 'database', displayName: 'Database', type: 'string', required: true },
{ name: 'username', displayName: 'Username', type: 'string', required: true },
{ name: 'password', displayName: 'Password', type: 'password', required: true },
],
});
export class PostgresQueryNode extends BaseNodeExecutor {
static readonly type = 'postgres.query';
readonly nodeType = PostgresQueryNode.type;
static readonly definition = {
credentials: [{ name: 'postgres', required: true }],
properties: [
{ name: 'sql', displayName: 'SQL Query', type: 'string', required: true }
],
} as const;
async execute(node) {
const { sql } = node.parameters as { sql: string };
const credentials = await this.getCredentials(node.credentials.postgres);
// Get the injected database service
const databaseService = getDatabaseService();
// Execute query through the service
const result = await databaseService.executeQuery(sql, {
type: 'postgres',
data: credentials.data
});
return [{
json: {
rows: result.rows,
rowCount: result.rowCount,
executionTime: result.executionTime
},
binary: {}
}];
}
}
```

Supported database types:

- PostgreSQL (`postgres`)
- MySQL (`mysql`)
- Oracle Database (`oracle`)
- Microsoft SQL Server (`mssql`)
- MongoDB (`mongodb`)
- Google BigQuery (`bigquery`)

Benefits of the database service:
- **Secure**: no direct driver imports in sandboxed plugins
- **Connection pooling**: centralized connection management
- **Monitoring**: built-in query logging and metrics
- **Validation**: automatic credential and query validation
- **Consistent**: same interface across all database types
Below are minimal examples for different databases using the legacy pattern (direct driver imports). Note: This approach is deprecated and may not work in sandboxed environments. Use the Database Service approach above instead.
#### PostgreSQL / MySQL (using `pg` or `mysql2`) - Legacy Pattern

> ⚠️ Deprecated: this example uses direct driver imports, which may not work in sandboxed environments. Use the Database Service approach shown above instead.
```ts
import { BaseNodeExecutor, registerCredential } from 'auto-builder-sdk';
import { createPool } from 'mysql2/promise'; // or pg / knex

registerCredential({
name: 'mysql', displayName: 'MySQL',
properties: [
{ name: 'host', displayName: 'Host', type: 'string', required: true },
{ name: 'port', displayName: 'Port', type: 'number', required: true, default: 3306 },
{ name: 'database', displayName: 'Database', type: 'string', required: true },
{ name: 'username', displayName: 'Username', type: 'string', required: true },
    { name: 'password', displayName: 'Password', type: 'password', required: true },
],
});
export class MySqlQueryNode extends BaseNodeExecutor {
  static readonly type = 'mysql.query';
  readonly nodeType = MySqlQueryNode.type;
static readonly definition = {
credentials: [{ name: 'mysql', required: true }],
properties: [{ name: 'sql', displayName: 'SQL', type: 'string', required: true }],
} as const;
async execute(node) {
const { sql } = node.parameters as { sql: string };
const creds = await this.getCredentials(node.credentials.mysql);
const pool = createPool({
host: creds.data.host,
port: creds.data.port,
user: creds.data.username,
password: creds.data.password,
database: creds.data.database,
});
const [rows] = await pool.query(sql);
await pool.end();
return [{ json: rows, binary: {} }];
}
}
```

#### MongoDB (using `mongodb`) - Legacy Pattern

> ⚠️ Deprecated: this example uses direct driver imports, which may not work in sandboxed environments. Use the Database Service approach shown above instead.

```ts
import { MongoClient } from 'mongodb';
import { BaseNodeExecutor, registerCredential } from 'auto-builder-sdk';

registerCredential({
name: 'mongo', displayName: 'MongoDB',
properties: [
{ name: 'uri', displayName: 'Connection URI', type: 'string', required: true },
{ name: 'database', displayName: 'Database', type: 'string', required: true },
],
});
export class MongoFindNode extends BaseNodeExecutor {
  static readonly type = 'mongo.find';
  readonly nodeType = MongoFindNode.type;
static readonly definition = {
credentials: [{ name: 'mongo', required: true }],
properties: [
{ name: 'collection', displayName: 'Collection', type: 'string', required: true },
{ name: 'query', displayName: 'Query (JSON)', type: 'string', required: true },
],
} as const;
async execute(node) {
const { collection, query } = node.parameters as any;
const creds = await this.getCredentials(node.credentials.mongo);
const client = await MongoClient.connect(creds.data.uri);
const docs = await client.db(creds.data.database)
.collection(collection)
.find(JSON.parse(query)).toArray();
await client.close();
return [{ json: docs, binary: {} }];
}
}
```

#### Microsoft SQL Server (using `mssql`)

```ts
import sql from 'mssql';
import { BaseNodeExecutor, registerCredential } from 'auto-builder-sdk';

registerCredential({
name: 'mssql', displayName: 'MS SQL Server',
properties: [
{ name: 'server', displayName: 'Server', type: 'string', required: true },
{ name: 'user', displayName: 'User', type: 'string', required: true },
    { name: 'password', displayName: 'Password', type: 'password', required: true },
    { name: 'database', displayName: 'Database', type: 'string', required: true },
],
});
export class MsSqlQueryNode extends BaseNodeExecutor {
  static readonly type = 'mssql.query';
  readonly nodeType = MsSqlQueryNode.type;
static readonly definition = {
credentials: [{ name: 'mssql', required: true }],
properties: [{ name: 'sql', displayName: 'SQL', type: 'string', required: true }],
} as const;
async execute(node) {
const { sql: statement } = node.parameters as { sql: string };
const creds = await this.getCredentials(node.credentials.mssql);
await sql.connect({
server: creds.data.server,
user: creds.data.user,
password: creds.data.password,
database: creds.data.database,
options: { encrypt: true, trustServerCertificate: true },
});
const result = await sql.query(statement);
await sql.close();
return [{ json: result.recordset, binary: {} }];
}
}
```

#### Other engines / cloud warehouses
The pattern is identical for any future database. Two quick sketches:
• ClickHouse (columnar OLAP)

```ts
import { createClient } from '@clickhouse/client';
registerCredential({
name: 'clickhouse', displayName: 'ClickHouse',
properties: [
{ name: 'url', displayName: 'HTTP URL', type: 'string', required: true },
{ name: 'user', displayName: 'User', type: 'string' },
{ name: 'pass', displayName: 'Password', type: 'password' },
],
});

// inside execute()
const ch = createClient({
host: creds.data.url,
username: creds.data.user,
password: creds.data.pass,
});
const rows = await ch.query({ query: sql, format: 'JSONEachRow' });
```

• BigQuery (Google Cloud)

```ts
import { BigQuery } from '@google-cloud/bigquery';
registerCredential({
name: 'bigquery', displayName: 'BigQuery',
properties: [
{ name: 'projectId', displayName: 'Project ID', type: 'string', required: true },
{ name: 'jsonKey', displayName: 'Service Account JSON', type: 'string', required: true, typeOptions:{rows:8} },
],
});

// inside execute()
const client = new BigQuery({
projectId: creds.data.projectId,
credentials: JSON.parse(creds.data.jsonKey),
});
const [rows] = await client.query(sql);
```

Any engine that reuses the Postgres/MySQL wire-protocol (CockroachDB,
TimescaleDB, Aurora-PG/MySQL) can simply adopt the existing credential & node
without code changes.
> Tip: for databases with native Prisma support (PostgreSQL, MySQL,
> SQLite, SQL Server, MongoDB) you can also generate a separate client in your
> plugin and skip the manual driver code. Use `getDb()` when you want to run
> queries against the engine's primary database.

---
## Deskera bearer-token helper (`ctx.getDeskeraToken()`)

> New in SDK 0.2.x & Auto-Builder 1.1: lazy, secure token retrieval

Each node receives an execution-context (`IExecutionContext`) instance. The engine now injects an async helper `getDeskeraToken()` that returns a user-/tenant-scoped Bearer token which you can pass to Deskera's REST APIs:

```ts
interface IExecutionContext {
// β¦ existing fields β¦
  getDeskeraToken?(): Promise<string>; // optional for typing; always present at runtime
}
```

### Why a helper?

* **No secrets in plugins**: the helper is resolved inside the core engine.
* **Tenant isolation**: the token is scoped to the *current* workflow/user.
* **Caching**: the first call fetches the token from IAM; subsequent calls within the same workflow execution are served from memory.
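The caching behaviour can be sketched as a memoised async getter (an illustration of the pattern, not the engine's actual code; names here are hypothetical):

```typescript
// Sketch: fetch the token once, then serve every later call from memory.

function memoizeToken(fetchToken: () => Promise<string>): () => Promise<string> {
  let cached: Promise<string> | undefined;
  return () => {
    if (!cached) cached = fetchToken(); // only the first call triggers a fetch
    return cached;
  };
}

let fetches = 0;
const getToken = memoizeToken(async () => {
  fetches += 1;
  return 'token-abc';
});

getToken()
  .then(() => getToken())
  .then((token) => {
    console.log(token, fetches); // 'token-abc' 1
  });
```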
### Usage

```ts
import { BaseNodeExecutor } from 'auto-builder-sdk';

export class ListCustomersNode extends BaseNodeExecutor {
static readonly type = 'deskera.customers.list';
readonly nodeType = ListCustomersNode.type;
async execute(node, input, ctx) {
    const token = await ctx.getDeskeraToken(); // one line, done!
    const res = await fetch(
      `${process.env.DESKERA_API_BASE_URL}/customers`,
      { headers: { Authorization: `Bearer ${token}` } },
    );
    if (!res.ok) throw new Error(`Deskera API ${res.status}`);
    const customers = await res.json();
return [{ json: customers, binary: {} }];
}
}
```

### Testing

Because BullMQ serialises job payloads, functions are stripped when we bake them into test contexts. The SDK helper `makeStubContext()` accepts an override so you can stub the token call in tests:

```ts
import { expect, it } from 'vitest';
import { makeStubContext, makeStubNode } from 'auto-builder-sdk';
import { ListCustomersNode } from '../src';

it('retrieves customers', async () => {
const ctx = makeStubContext({
getDeskeraToken: async () => 'test-token',
});
const node = makeStubNode(ListCustomersNode.type);
const out = await new ListCustomersNode().execute(node, [], ctx);
expect(out[0].json).toBeTypeOf('object');
});
```

No further changes are required on the engine side; the helper works in both inline and branch (parallel) execution modes.
---
## Sandboxing options per plugin
VM2 limits have sensible defaults (30 s / 64 MB). Override them globally via
env-vars or per plugin in the manifest:
```jsonc
{
"sandbox": {
"enabled": true, // false disables VM2 (DEV only)
"timeoutMs": 10000, // 10 seconds
"memoryMb": 32 // 32 MB
}
}
```

Developers can temporarily set `PLUGIN_SAFE_MODE=false` on the backend to turn all sandboxes off while debugging.

---
## Resolving parameters: two options
Auto-Builder exposes the same template engine in two different ways so you can pick whatever fits your code best.
### Option 1: the inherited helper

When your class extends `BaseNodeExecutor`, just use the built-in protected helper:

```ts
import { BaseNodeExecutor } from 'auto-builder-sdk';

export class GreetNode extends BaseNodeExecutor {
readonly nodeType = 'demo.greet';
async execute(node, input, ctx) {
// inherited from BaseNodeExecutor
const opts = this.resolveParameters(node.parameters, ctx);
return [
{ json: { greeting: opts.message }, binary: {} },
];
}
}
```

### Option 2: the standalone `ParameterResolver`

Need the same logic without subclassing? Import `ParameterResolver`:

```ts
import { ParameterResolver, type IExecutionContext } from 'auto-builder-sdk';

const ctx: IExecutionContext & { itemIndex?: number } = {
executionId: 'EX123',
workflowId: 'WF001',
workflow: {} as any,
node: {} as any,
inputData: [{ json: { name: 'Jane' }, binary: {} }],
runIndex: 0,
itemIndex: 0,
mode: 'manual',
timezone: 'UTC',
variables: {},
};
const raw = {
subject: 'Welcome {{ $json.name }}!',
sentAt: '{{ $now }}',
};
const resolved = ParameterResolver.resolve(raw, ctx);
// → { subject: 'Welcome Jane!', sentAt: '2025-06-28T14:00:00.000Z' }
```

Both methods share exactly the same implementation under the hood, so the resolved output is identical.
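The substitution idea can be sketched in a few lines (a toy resolver for illustration only; the SDK's real template engine supports far more, e.g. `$now`):

```typescript
// Toy resolver: replaces `{{ $json.field }}` placeholders with values from
// the current input item, mirroring the example above.

function resolveSketch(template: string, json: Record<string, string>): string {
  return template.replace(/\{\{\s*\$json\.(\w+)\s*\}\}/g, (_, field: string) => json[field] ?? '');
}

const out = resolveSketch('Welcome {{ $json.name }}!', { name: 'Jane' });
console.log(out); // 'Welcome Jane!'
```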
---
## Standard Pagination Utilities

New in SDK 1.0.16: the SDK provides comprehensive pagination utilities to enforce consistent pagination across all nodes and plugins.

### Basic usage
```ts
import {
convertToLimitOffset,
validatePaginationParams,
createPaginationResponse,
PAGINATION_NODE_DEFINITION,
PAGINATION_SIZE_NODE_DEFINITION,
type StandardPaginationParams
} from 'auto-builder-sdk';

export class ListUsersNode extends BaseNodeExecutor {
static readonly type = 'api.users.list';
readonly nodeType = ListUsersNode.type;
static readonly definition = {
properties: [
// Add standard pagination parameters
PAGINATION_NODE_DEFINITION,
PAGINATION_SIZE_NODE_DEFINITION,
// ... other properties
]
} as const;
async execute(node) {
// Extract and validate pagination parameters
const paginationParams = validatePaginationParams({
page: node.parameters.page,
pageSize: node.parameters.pageSize
});
// Convert to API-specific format
const { limit, offset } = convertToLimitOffset(
paginationParams.page,
paginationParams.pageSize
);
// Make API call with pagination
    const response = await fetch(`/api/users?limit=${limit}&offset=${offset}`);
    const data = await response.json();

    // Create a standard pagination response
return [createPaginationResponse(
data.users,
paginationParams.page,
paginationParams.pageSize,
data.total
)];
}
}
```

### Configuration

The pagination system is fully configurable through environment variables:

```bash
# Default values
PAGINATION_DEFAULT_PAGE=1
PAGINATION_DEFAULT_PAGE_SIZE=100

# Validation limits
PAGINATION_MIN_PAGE=1
PAGINATION_MIN_PAGE_SIZE=1
PAGINATION_MAX_PAGE_SIZE=1000

# String processing limits
PAGINATION_MAX_STRING_LENGTH=15
```

### Runtime configuration

```ts
import { PAGINATION_CONFIG, updatePaginationConfig, resetPaginationConfig } from 'auto-builder-sdk';

// Access current configuration
console.log(PAGINATION_CONFIG.DEFAULT_PAGE_SIZE); // 100
console.log(PAGINATION_CONFIG.MAX_PAGE_SIZE); // 1000
// Update configuration at runtime
updatePaginationConfig({
DEFAULT_PAGE_SIZE: 50,
MAX_PAGE_SIZE: 500
});
// Reset to environment variable defaults
resetPaginationConfig();
```

### API-specific formats

```ts
import { buildApiPaginationParams, parseApiPaginationResponse } from 'auto-builder-sdk';

// For the Jira API
const jiraParams = buildApiPaginationParams(
{ page: 1, pageSize: 50 },
'jira'
);
// → { startAt: 0, maxResults: 50 }
// For SharePoint API
const sharepointParams = buildApiPaginationParams(
{ page: 2, pageSize: 100 },
'sharepoint'
);
// → { top: 100, skip: 100 }
// Parse API responses
const standardResponse = parseApiPaginationResponse(
jiraApiResponse,
'jira'
);
```

### Legacy parameter conversion

```ts
import { convertLegacyPagination } from 'auto-builder-sdk';

// Convert old parameter names to the standard format
const standardParams = convertLegacyPagination({
pageNo: 2,
limit: 25,
totalCount: 150
});
// → { page: 2, pageSize: 25, totalRecords: 150 }
```

### Database pagination

```ts
import { convertToLimitOffset } from 'auto-builder-sdk';

const { limit, offset } = convertToLimitOffset(page, pageSize);
const sql = `SELECT * FROM users LIMIT ${limit} OFFSET ${offset}`;
```
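The page-to-offset arithmetic used throughout these helpers can be sketched as follows (an assumption consistent with the conversions shown above, not the SDK's exact source):

```typescript
// Sketch of page/pageSize -> limit/offset conversion:
// offset counts the items on all preceding pages.

function toLimitOffset(page: number, pageSize: number): { limit: number; offset: number } {
  return { limit: pageSize, offset: (page - 1) * pageSize };
}

console.log(toLimitOffset(1, 50)); // { limit: 50, offset: 0 }
console.log(toLimitOffset(3, 25)); // { limit: 25, offset: 50 }
```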
### Benefits

- **Consistent interface**: all nodes use the same `page`/`pageSize` parameters
- **Validation**: automatic parameter validation with sensible defaults
- **API flexibility**: support for REST, GraphQL, OData, Jira, and SharePoint formats
- **Metadata**: automatic calculation of total pages and next/previous indicators
- **Backward compatibility**: legacy parameter conversion support

---
## Temporary File Management

New in SDK 1.0.17: the SDK provides comprehensive temporary-file management utilities for handling binary files across plugin nodes, with automatic cleanup tracking.

### Why temp files?
When nodes process binary files (PDFs, images, Excel files, etc.), they often need to:
- Store files temporarily between node executions
- Track which workflow/node created which files
- Clean up files after processing
- Preserve original file metadata
The SDK's temp file utilities provide a standardized, secure way to handle these scenarios.
### Creating a temp file

```ts
import {
createTempFile,
readTempFile,
deleteTempFile,
getTempFileMetadata,
BaseNodeExecutor
} from 'auto-builder-sdk';

export class ProcessPdfNode extends BaseNodeExecutor {
static readonly type = 'file.pdf.process';
readonly nodeType = ProcessPdfNode.type;
async execute(node, inputData, context) {
try {
// Create a temporary file from buffer
const pdfBuffer = Buffer.from(inputData[0].json.fileData, 'base64');
const tempFile = await createTempFile(
pdfBuffer,
context.workflowId,
node.id,
'document.pdf',
{ documentType: 'invoice', processedBy: 'ProcessPdfNode' }
);
// File is now stored at: /tmp/auto-builder-temp/file-XXXXX/timestamp-document.pdf
// Metadata file is at: /tmp/auto-builder-temp/file-XXXXX/timestamp-document.pdf.meta.json
return [{
json: {
tempFilePath: tempFile.tempFilePath,
fileSize: tempFile.reference.size,
originalName: tempFile.reference.originalFileName,
createdAt: tempFile.reference.createdAt
},
binary: {}
}];
} catch (error) {
      throw new NodeOperationError(node, `Failed to create temp file: ${error.message}`);
}
}
}
```

### Reading a temp file

```ts
export class ReadPdfNode extends BaseNodeExecutor {
static readonly type = 'file.pdf.read';
  readonly nodeType = ReadPdfNode.type;

  async execute(node, inputData, context) {
const tempFilePath = inputData[0].json.tempFilePath;
// Read the file back as a buffer
const buffer = await readTempFile(tempFilePath);
// Get file metadata
const metadata = await getTempFileMetadata(tempFilePath);
return [{
json: {
content: buffer.toString('base64'),
size: buffer.length,
metadata: metadata
},
binary: {}
}];
}
}
```

### Cleaning up

```ts
export class CleanupNode extends BaseNodeExecutor {
static readonly type = 'file.cleanup';
  readonly nodeType = CleanupNode.type;

  async execute(node, inputData, context) {
const tempFilePath = inputData[0].json.tempFilePath;
// Delete the temp file and its metadata
await deleteTempFile(tempFilePath);
return [{
json: {
message: 'File cleaned up successfully',
deletedPath: tempFilePath
},
binary: {}
}];
}
}
```

### Complete example

```ts
import {
createTempFile,
readTempFile,
deleteTempFile,
BaseNodeExecutor,
createFormattedResult,
createFormattedErrorResult
} from 'auto-builder-sdk';

export class PdfToTextNode extends BaseNodeExecutor {
static readonly type = 'file.pdf.totext';
readonly nodeType = PdfToTextNode.type;
async execute(node, inputData, context) {
let tempFilePath: string | undefined;
try {
// Step 1: Create temp file from input
const pdfData = Buffer.from(inputData[0].json.pdfBase64, 'base64');
const tempFile = await createTempFile(
pdfData,
context.workflowId,
node.id,
inputData[0].json.fileName || 'document.pdf',
{ operation: 'pdf-to-text', nodeType: this.nodeType }
);
tempFilePath = tempFile.tempFilePath;
// Step 2: Process the file (extract text from PDF)
const buffer = await readTempFile(tempFilePath);
const text = await this.extractTextFromPdf(buffer);
// Step 3: Clean up
await deleteTempFile(tempFilePath);
// Step 4: Return formatted result
return inputData.map((item, index) =>
createFormattedResult(
item,
{
text,
originalFileName: tempFile.reference.originalFileName,
processedAt: new Date().toISOString()
},
context,
index
)
);
} catch (error) {
// Clean up on error
if (tempFilePath) {
await deleteTempFile(tempFilePath).catch(() => {});
}
return inputData.map((item, index) =>
createFormattedErrorResult(item, error as Error, context, index)
);
}
}
  private async extractTextFromPdf(buffer: Buffer): Promise<string> {
// Your PDF text extraction logic here
return 'Extracted text from PDF...';
}
}
```

### Storage layout

Temp files are stored in a consistent directory structure:

```
/tmp/auto-builder-temp/
├── file-abc123/
│   ├── 1701234567890-document.pdf            # actual file
│   └── 1701234567890-document.pdf.meta.json  # metadata file
├── file-def456/
│   ├── 1701234568900-invoice.xlsx
│   └── 1701234568900-invoice.xlsx.meta.json
...
```

### Metadata format

Each temp file has an associated `.meta.json` file containing:

```json
{
"workflowId": "wf-123",
"nodeId": "node-abc",
"originalFileName": "document.pdf",
"fileSize": 524288,
"createdAt": "2025-12-01T10:30:00.000Z",
"tempFilePath": "/tmp/auto-builder-temp/file-abc123/1701234567890-document.pdf",
"tempDir": "/tmp/auto-builder-temp/file-abc123",
"documentType": "invoice",
"processedBy": "ProcessPdfNode"
}
```

### API reference

#### `createTempFile(buffer, workflowId, nodeId, fileName, metadata?)`

Creates a temporary file with metadata tracking.

Parameters:

- `buffer: Buffer` - binary data to store
- `workflowId: string` - workflow ID for tracking
- `nodeId: string` - node ID for tracking
- `fileName: string` - original filename (extension is preserved)
- `metadata?: Record<string, unknown>` - additional metadata (optional)

Returns a `Promise` of:

```ts
{
tempFilePath: string; // Full path to temp file
tempDir: string; // Directory containing the file
metadataPath: string; // Path to metadata JSON file
reference: {
path: string; // Same as tempFilePath
size: number; // File size in bytes
originalFileName: string; // Original filename
createdAt: string; // ISO timestamp
workflowId: string; // Workflow ID
nodeId: string; // Node ID
};
}
```

#### `readTempFile(tempFilePath)`

Reads a temporary file and returns the buffer.

Parameters:

- `tempFilePath: string` - path to the temporary file

Returns: `Promise<Buffer>` - the binary data

Throws: an `Error` if the file doesn't exist or can't be read.
#### `deleteTempFile(tempFilePath)`

Deletes a temporary file and its metadata.

Parameters:

- `tempFilePath: string` - path to the temporary file

Returns: `Promise<void>`

Note: automatically deletes the temp file itself, the associated `.meta.json` file, and the parent directory if empty.

#### `getTempFileMetadata(tempFilePath)`

Gets metadata for a temporary file.

Parameters:

- `tempFilePath: string` - path to the temporary file

Returns: a `Promise` resolving to the metadata object, or `null` if not found.

### Best practices
1. **Always clean up**: use try/finally blocks to ensure temp files are deleted even if processing fails:

```ts
let tempFilePath: string | undefined;
try {
const tempFile = await createTempFile(...);
tempFilePath = tempFile.tempFilePath;
// ... process file
} finally {
if (tempFilePath) {
await deleteTempFile(tempFilePath).catch(() => {});
}
}
```

2. **Include workflow context**: always pass `workflowId` and `nodeId` for tracking.
3. **Add custom metadata**: use the metadata parameter to track processing state, document types, etc.
4. **Sanitize filenames**: the utilities automatically sanitize filenames, but avoid very long names.
5. **Check file existence**: use `getTempFileMetadata()` to verify a file exists before reading it.

### Benefits
- **Organized storage**: consistent directory structure (`/tmp/auto-builder-temp/`)
- **Metadata tracking**: every file includes workflow/node context
- **Easy cleanup**: a single function deletes the file, its metadata, and empty directories
- **Debugging**: file metadata includes creation time, workflow ID, and node ID
- **Type safety**: full TypeScript support with proper types
- **Logging**: built-in debug logging for file operations

---
## Telemetry hooks

You can subscribe to runtime metrics:

```ts
import { pluginTelemetry } from 'auto-builder/telemetry/plugin-telemetry';

pluginTelemetry.on('metric', (m) => console.log(m));
```
## Publishing & versioning workflow

1. Bump the version (`npm version patch|minor|major`). Follow semver:
   • Patch: docs/typos & additive helpers.
   • Minor: new capabilities (dynamic loaders, logger).
   • Major: breaking API changes.
2. `npm publish --access public` (the prepublish script runs tests & coverage).
3. In your Auto-Builder deployment, update the dependency:

```bash
npm i auto-builder-sdk@latest   # service repo
```

> Tip: use Renovate or Dependabot so services stay in sync automatically.
---
## Best practices

* Always write unit tests with ≥ 80 % coverage (enforced).
* Validate external inputs with `zod` inside `execute()`.
* Keep network/FS access minimal; prefer the SDK's helpers.
* Publish with semver and respect the peer range in `engines`.

---

MIT