Canonical parser and runtime for Design Token Interchange Format (DTIF)
documents. The package provides the reference pipeline for loading, validating,
normalising, and resolving DTIF documents while emitting structured diagnostics
for tooling and automation workflows.
> Documentation: Using the DTIF parser
```bash
npm install @lapidist/dtif-parser
```
The package targets modern Node runtimes (v22+) and is published as a native ESM
module.
```ts
import { parseDocument } from '@lapidist/dtif-parser';

const result = await parseDocument('tokens.json');

for (const diagnostic of result.diagnostics) {
  console.error(`${diagnostic.severity}: ${diagnostic.message}`);
}

const resolved = result.resolver?.resolve('#/color/brand/primary');
console.log(resolved?.value);
```
To flatten tokens, collect metadata, and normalise diagnostics in a single step,
use the parseTokens helper. It loads the document, builds the dependency graph,
and returns resolved token snapshots alongside a flattened view of the document.
```ts
import { parseTokens } from '@lapidist/dtif-parser';

const { flattened, metadataIndex, resolutionIndex, diagnostics } =
  await parseTokens('tokens.json');

for (const token of flattened) {
  console.log(token.pointer, token.value);
}
```
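As a sketch of what the flattened view enables, the loop above can be extended to emit CSS custom properties. The `{ pointer, value }` token shape below mirrors the example; real token snapshots carry additional fields (type, metadata), so treat this as illustrative only.

```typescript
// Sketch: turn flattened token snapshots into CSS custom properties.
// The FlattenedToken shape is an assumption based on the example above.
interface FlattenedToken {
  pointer: string; // JSON pointer, e.g. "#/color/brand/primary"
  value: unknown;
}

function toCssVariables(tokens: FlattenedToken[]): string {
  const lines = tokens.map((token) => {
    // "#/color/brand/primary" -> "--color-brand-primary"
    const name = '--' + token.pointer.replace(/^#\//, '').split('/').join('-');
    return `  ${name}: ${String(token.value)};`;
  });
  return `:root {\n${lines.join('\n')}\n}`;
}

console.log(
  toCssVariables([{ pointer: '#/color/brand/primary', value: '#3366ff' }])
);
```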
Pass onDiagnostic to observe parser diagnostics as they are produced and warn
to intercept non-fatal issues. Both callbacks receive DiagnosticEvent domain
objects, allowing you to format or surface them immediately without waiting for
the promise to resolve.
```ts
await parseTokens('tokens.json', {
  onDiagnostic: (diagnostic) => {
    console.error(diagnostic.message);
  },
  warn: (diagnostic) => {
    console.warn('[warn]', diagnostic.message);
  }
});
```
Provide a TokenCache implementation, such as the built-in InMemoryTokenCache,
to reuse flattening and resolution results across runs, or use parseTokensSync
for synchronous parsing when your inputs are already available in memory.
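The caching idea can be pictured with a small Map-backed sketch. The CachedTokens shape and the string keys here are assumptions for illustration; the package's TokenCache type and InMemoryTokenCache define the real contract.

```typescript
// Illustrative Map-backed cache; the real TokenCache contract may use
// richer keys and entry types than the ones assumed here.
interface CachedTokens {
  flattened: unknown[];
  diagnostics: unknown[];
}

class MapTokenCache {
  private entries = new Map<string, CachedTokens>();

  get(key: string): CachedTokens | undefined {
    return this.entries.get(key);
  }

  set(key: string, value: CachedTokens): void {
    this.entries.set(key, value);
  }
}

const cache = new MapTokenCache();
cache.set('tokens.json', { flattened: [], diagnostics: [] });
// A warm parse can now reuse the stored entry instead of re-flattening.
console.log(cache.get('tokens.json') !== undefined);
```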
Create a session with createSession to reuse caches, install custom document
loaders, register plugins, or parse multiple collections with shared state.
The built-in DefaultDocumentLoader enforces a maximum document size to protect
tooling from accidentally loading unbounded payloads. The default limit is
5 MiB. Provide the maxBytes option when constructing a loader or session to
apply a stricter, positive byte threshold:
```ts
import { createSession, DefaultDocumentLoader } from '@lapidist/dtif-parser';

const session = createSession({
  loader: new DefaultDocumentLoader({ maxBytes: 256 * 1024 })
});
```
Only finite, positive numbers are accepted. Any non-positive value (such as
0, negative numbers, NaN, or Infinity) is rejected with a RangeError at loader
construction. When a payload exceeds the active limit, loading fails with a
DocumentLoaderError whose reason is MAX_BYTES_EXCEEDED and whose limit property
reflects the enforced cap.
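A standalone sketch of those validation rules, assuming nothing beyond the behaviour described above; the real checks live inside DefaultDocumentLoader and raise DocumentLoaderError rather than a plain Error.

```typescript
// Re-creation of the maxBytes rules for illustration only.
function assertValidMaxBytes(maxBytes: number): void {
  // Mirrors the documented rule: only finite, positive numbers are accepted.
  if (!Number.isFinite(maxBytes) || maxBytes <= 0) {
    throw new RangeError(`maxBytes must be a finite, positive number, got ${maxBytes}`);
  }
}

function checkPayloadSize(byteLength: number, limit: number): void {
  assertValidMaxBytes(limit);
  if (byteLength > limit) {
    // The real loader throws a DocumentLoaderError with reason
    // MAX_BYTES_EXCEEDED and a limit property instead of a plain Error.
    throw new Error(`payload of ${byteLength} bytes exceeds limit of ${limit}`);
  }
}

checkPayloadSize(1024, 256 * 1024); // within the limit: no error
```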
When enabling HTTP(S) loading, configure httpAllowedHosts to restrict which
hosts may be fetched and httpTimeoutMs to bound remote requests. Requests to
non-allow-listed hosts fail with a DocumentLoaderError whose reason is
HTTP_HOST_NOT_ALLOWED, and hung requests are aborted with a TimeoutError
driven by the loader's timeout.
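A minimal sketch of the allow-list idea, assuming exact hostname matching; the loader's actual matching rules for httpAllowedHosts may differ.

```typescript
// Illustrative host allow-list check; exact-match semantics are an assumption.
function isHostAllowed(url: string, allowedHosts: string[]): boolean {
  const { hostname } = new URL(url);
  return allowedHosts.includes(hostname);
}

console.log(isHostAllowed('https://tokens.example.com/base.json', ['tokens.example.com'])); // true
console.log(isHostAllowed('https://evil.example.net/base.json', ['tokens.example.com'])); // false
```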
Each pipeline stage emits domain DiagnosticEvent objects instead of throwing.
Results aggregate every diagnostic (including cache hits) so tooling can stream
warnings via onDiagnostic/warn hooks, persist them for later inspection, or
format them with formatDiagnostic.
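A hand-rolled formatter in the same spirit as formatDiagnostic; the event shape (severity, message, optional pointer) is assumed from the earlier examples, not taken from the package's actual DiagnosticEvent type.

```typescript
// Illustrative formatter only; the field names below are assumptions.
interface DiagnosticLike {
  severity: string;
  message: string;
  pointer?: string; // JSON pointer to the offending token, when known
}

function formatDiagnosticLike(d: DiagnosticLike): string {
  const location = d.pointer ? ` at ${d.pointer}` : '';
  return `[${d.severity}] ${d.message}${location}`;
}

console.log(
  formatDiagnosticLike({
    severity: 'warning',
    message: 'unknown token type',
    pointer: '#/color/brand'
  })
); // [warning] unknown token type at #/color/brand
```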
For Node-based tooling, import the bundled adapter to read DTIF token files from
disk with extension validation, formatted diagnostics, and ready-to-use token
documents:
```ts
import { parseTokensFromFile, readTokensFile } from '@lapidist/dtif-parser/adapters/node';

try {
  const result = await parseTokensFromFile('tokens/base.tokens.json', {
    onWarn: (message) => console.warn(message)
  });
  console.log(result.flattened.length);
} catch (error) {
  // DtifTokenParseError exposes the normalised diagnostics for reporting
}

const document = await readTokensFile('tokens/base.tokens.json');
```
- createSession coordinates the loader, schema guard, normaliser, graph
builder, and resolver for each request. Sessions keep caches and plugins in
sync across parses.
- Domain caches receive RawDocumentIdentity keys and ensure decoded bytes, AST
snapshots, and flattened token artefacts can be reused safely between runs.
- Diagnostic events surface from every stage and persist in token cache entries
so warm parses provide the same visibility as cold runs.
- Helper APIs (parseTokens, parseTokensSync, createMetadataSnapshot, and
createResolutionSnapshot) layer on snapshot builders without bypassing the
session lifecycle.
- The parser guide's architecture section documents the current module layout,
session lifecycle, and testing conventions that future roadmap work will
build upon.
The workspace publishes a dtif-parse binary for quick inspection and CI
pipelines:
```bash
dtif-parse tokens/base.tokens.json --resolve "#/color/brand/primary"
```

Use `dtif-parse --help` for the full list of options and output formats.
MIT © Lapidist