# Promptbook: Turn your company's scattered knowledge into AI-ready Books
`npm install @promptbook/utils`

[NPM](https://www.npmjs.com/package/promptbook) · [Package Quality](https://packagequality.com/#?package=promptbook)







- Gemini 3 Support
> ⚠️ Warning: This is a pre-release version of the library. It is not yet ready for production use. Please look at the latest stable release.
- Promptbook is divided into several packages, all published from a single monorepo.
- This package, `@promptbook/utils`, is one part of the Promptbook ecosystem.

To install this package, run:

```bash
# Install just this package
npm install @promptbook/utils

# Or install the entire Promptbook ecosystem
npm i ptbk
```
Comprehensive utility functions for text processing, validation, normalization, and LLM input/output handling in the Promptbook ecosystem.
## 🎯 Purpose and Motivation
The utils package provides a rich collection of utility functions that are essential for working with LLM inputs and outputs. It handles common tasks like text normalization, parameter templating, validation, and postprocessing, eliminating the need to implement these utilities from scratch in every promptbook application.
## 🔧 High-Level Functionality
This package offers utilities across multiple domains:
- Text Processing: Counting, splitting, and analyzing text content
- Template System: Secure parameter substitution and prompt formatting
- Normalization: Converting text to various naming conventions and formats
- Validation: Comprehensive validation for URLs, emails, file paths, and more
- Serialization: JSON handling, deep cloning, and object manipulation
- Environment Detection: Runtime environment identification utilities
- Format Parsing: Support for CSV, JSON, XML validation and parsing
## ✨ Key Features
- 🔒 Secure Templating - Prompt injection protection with template functions
- 📊 Text Analysis - Count words, sentences, paragraphs, pages, and characters
- 🔤 Case Conversion - Support for kebab-case, camelCase, PascalCase, SCREAMING_CASE
- ✅ Comprehensive Validation - Email, URL, file path, UUID, and format validators
- 🧹 Text Cleaning - Remove emojis, quotes, diacritics, and normalize whitespace
- 📦 Serialization Tools - Deep cloning, JSON export, and serialization checking
- 🌐 Environment Aware - Detect browser, Node.js, Jest, and Web Worker environments
- 🎯 LLM Optimized - Functions specifically designed for LLM input/output processing
## Simple templating
The `prompt` template tag function helps format prompt strings for LLM interactions. It handles string interpolation, maintains consistent formatting for multiline strings and lists, and also guards against prompt injection:

```typescript
import { prompt } from '@promptbook/utils';

const promptString = prompt`
    > ${unsecureUserInput}
`;
```
The name `prompt` could be overloaded by other things in your code. In that case, you can use `promptTemplate`, which is an alias for `prompt`:

```typescript
import { promptTemplate } from '@promptbook/utils';

const promptString = promptTemplate`
    Correct the following sentence:

    > ${unsecureUserInput}
`;
```
The `templateParameters` function replaces parameters in a given template and is optimized for LLM prompt templates:
```typescript
import { templateParameters } from '@promptbook/utils';

templateParameters('Hello, {name}!', { name: 'world' }); // 'Hello, world!'
```
It also works with multiline templates and blockquotes:

```typescript
import { templateParameters, spaceTrim } from '@promptbook/utils';

templateParameters(
    spaceTrim(`
        Hello, {name}!

        > {answer}
    `),
    {
        name: 'world',
        answer: spaceTrim(`
            I'm fine,
            thank you!

            And you?
        `),
    },
);
// Hello, world!
//
// > I'm fine,
// > thank you!
// >
// > And you?
```
The counting functions report statistics about input/output in human-friendly terms (words, sentences, pages) rather than tokens and bytes. You can use `countCharacters`, `countLines`, `countPages`, `countParagraphs`, `countSentences`, and `countWords`:
```typescript
import { countWords } from '@promptbook/utils';

console.log(countWords('Hello, world!')); // 2
```
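The other counting functions follow the same pattern. A minimal sketch, assuming each takes a plain string and returns a number like `countWords` above:

```typescript
import { countCharacters, countSentences, countParagraphs } from '@promptbook/utils';

const sample = 'Hello, world! How are you?\n\nI am fine, thank you.';

// Assumed: each counter takes a plain string and returns a number
console.log(countCharacters(sample)); // total number of characters
console.log(countSentences(sample)); // number of sentences detected
console.log(countParagraphs(sample)); // paragraphs separated by blank lines
```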
Splitting functions are similar to the counting functions, but they return the split parts of the input/output. You can use `splitIntoCharacters`, `splitIntoLines`, `splitIntoPages`, `splitIntoParagraphs`, `splitIntoSentences`, and `splitIntoWords`:
```typescript
import { splitIntoWords } from '@promptbook/utils';

console.log(splitIntoWords('Hello, world!')); // ['Hello', 'world']
```
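The other splitters work the same way. A minimal sketch, assuming each takes a string and returns an array of strings like `splitIntoWords` above:

```typescript
import { splitIntoSentences, splitIntoLines } from '@promptbook/utils';

const text = 'Hello, world! How are you?\nI am fine.';

// Assumed: returns an array of sentence strings
console.log(splitIntoSentences(text)); // e.g. ['Hello, world!', 'How are you?', 'I am fine.']

// Assumed: returns an array of lines split on newlines
console.log(splitIntoLines(text)); // e.g. ['Hello, world! How are you?', 'I am fine.']
```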
Normalization functions convert a string to a normalized form. You can normalize to `kebab-case`, `camelCase`, `PascalCase`, `SCREAMING_CASE`, and `snake_case`:
```typescript
import { normalizeTo } from '@promptbook/utils';

console.log(normalizeTo['kebab-case']('Hello World')); // 'hello-world'
```
- There are more normalization functions like `capitalize`, `decapitalize`, `removeDiacritics`, ... (see the sketch below)
- These can also be used as postprocessing functions via the `POSTPROCESS` command in Promptbook
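For illustration, a minimal sketch of a few of these helpers, assuming each takes and returns a plain string as the names suggest:

```typescript
import { capitalize, decapitalize, removeDiacritics, normalizeWhitespaces } from '@promptbook/utils';

// Assumed behavior based on the function names:
console.log(capitalize('hello world')); // 'Hello world'
console.log(decapitalize('Hello world')); // 'hello world'
console.log(removeDiacritics('Žluťoučký kůň')); // 'Zlutoucky kun'
console.log(normalizeWhitespaces('Hello \n  world')); // 'Hello world'
```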
Sometimes you need to postprocess the output of the LLM. Every postprocessing function available through the `POSTPROCESS` command in Promptbook is exported from `@promptbook/utils`. You can use:
- `spaceTrim`
- `extractAllBlocksFromMarkdown` _<- Note: Exported from @promptbook/markdown-utils_
- `extractAllListItemsFromMarkdown` _<- Note: Exported from @promptbook/markdown-utils_
- `extractBlock`
- `extractOneBlockFromMarkdown` _<- Note: Exported from @promptbook/markdown-utils_
- `prettifyPipelineString`
- `removeMarkdownComments`
- `removeEmojis`
- `removeMarkdownFormatting` _<- Note: Exported from @promptbook/markdown-utils_
- `removeQuotes`
- `trimCodeBlock`
- `trimEndOfCodeBlock`
- `unwrapResult`
Very often you will use `unwrapResult`, which extracts the result you need from output that contains additional surrounding text:
```typescript
import { unwrapResult } from '@promptbook/utils';

unwrapResult('Best greeting for the user is "Hi Pavol!"'); // 'Hi Pavol!'
```
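The other postprocessing helpers are used the same way, typically chained over the raw LLM output. A minimal sketch, assuming each takes and returns a string:

```typescript
import { removeEmojis, removeQuotes, spaceTrim } from '@promptbook/utils';

// Raw LLM output with decorations we want to strip
const rawOutput = '  "Hello world!🚀"  ';

// Assumed: spaceTrim trims surrounding whitespace, removeEmojis strips emoji
// characters, and removeQuotes removes the surrounding quotes
const cleaned = removeQuotes(removeEmojis(spaceTrim(rawOutput)));

console.log(cleaned); // 'Hello world!'
```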
Among others, `@promptbook/utils` exports the following constants, types, and functions (a short usage sketch follows the list):

- `BOOK_LANGUAGE_VERSION` - Current book language version
- `PROMPTBOOK_ENGINE_VERSION` - Current engine version
- `VALUE_STRINGS` - Standard value strings
- `SMALL_NUMBER` - Small number constant
- `renderPromptbookMermaid` - Render promptbook as Mermaid diagram
- `deserializeError` - Deserialize error objects
- `serializeError` - Serialize error objects
- `forEachAsync` - Async forEach implementation
- `isValidCsvString` - Validate CSV string format
- `isValidJsonString` - Validate JSON string format
- `jsonParse` - Safe JSON parsing
- `isValidXmlString` - Validate XML string format
- `prompt` - Template tag for secure prompt formatting
- `promptTemplate` - Alias for the `prompt` template tag
- `$getCurrentDate` - Get current date (side effect)
- `$isRunningInBrowser` - Check if running in browser
- `$isRunningInJest` - Check if running in Jest
- `$isRunningInNode` - Check if running in Node.js
- `$isRunningInWebWorker` - Check if running in Web Worker
- `CHARACTERS_PER_STANDARD_LINE` - Characters per standard line constant
- `LINES_PER_STANDARD_PAGE` - Lines per standard page constant
- `countCharacters` - Count characters in text
- `countLines` - Count lines in text
- `countPages` - Count pages in text
- `countParagraphs` - Count paragraphs in text
- `splitIntoSentences` - Split text into sentences
- `countSentences` - Count sentences in text
- `countWords` - Count words in text
- `CountUtils` - Utility object with all counting functions
- `capitalize` - Capitalize first letter
- `decapitalize` - Decapitalize first letter
- `DIACRITIC_VARIANTS_LETTERS` - Diacritic variants mapping
- `string_keyword` - Keyword string type (type)
- `Keywords` - Keywords type (type)
- `isValidKeyword` - Validate keyword format
- `nameToUriPart` - Convert name to URI part
- `nameToUriParts` - Convert name to URI parts
- `string_kebab_case` - Kebab case string type (type)
- `normalizeToKebabCase` - Convert to kebab-case
- `string_camelCase` - Camel case string type (type)
- `normalizeTo_camelCase` - Convert to camelCase
- `string_PascalCase` - Pascal case string type (type)
- `normalizeTo_PascalCase` - Convert to PascalCase
- `string_SCREAMING_CASE` - Screaming case string type (type)
- `normalizeTo_SCREAMING_CASE` - Convert to SCREAMING_CASE
- `normalizeTo_snake_case` - Convert to snake_case
- `normalizeWhitespaces` - Normalize whitespace characters
- `orderJson` - Order JSON object properties
- `parseKeywords` - Parse keywords from input
- `parseKeywordsFromString` - Parse keywords from string
- `removeDiacritics` - Remove diacritic marks
- `searchKeywords` - Search within keywords
- `suffixUrl` - Add suffix to URL
- `titleToName` - Convert title to name format
- `spaceTrim` - Trim spaces while preserving structure
- `extractParameterNames` - Extract parameter names from template
- `numberToString` - Convert number to string
- `templateParameters` - Replace template parameters
- `valueToString` - Convert value to string
- `parseNumber` - Parse number from string
- `removeEmojis` - Remove emoji characters
- `removeQuotes` - Remove quote characters
- `$deepFreeze` - Deep freeze object (side effect)
- `checkSerializableAsJson` - Check if serializable as JSON
- `clonePipeline` - Clone pipeline object
- `deepClone` - Deep clone object
- `exportJson` - Export object as JSON
- `isSerializableAsJson` - Check if object is JSON serializable
- `jsonStringsToJsons` - Convert JSON strings to objects
- `difference` - Set difference operation
- `intersection` - Set intersection operation
- `union` - Set union operation
- `trimCodeBlock` - Trim code block formatting
- `trimEndOfCodeBlock` - Trim end of code block
- `unwrapResult` - Extract result from wrapped output
- `isValidEmail` - Validate email address format
- `isRootPath` - Check if path is root path
- `isValidFilePath` - Validate file path format
- `isValidJavascriptName` - Validate JavaScript identifier
- `isValidPromptbookVersion` - Validate promptbook version
- `isValidSemanticVersion` - Validate semantic version
- `isHostnameOnPrivateNetwork` - Check if hostname is on private network
- `isUrlOnPrivateNetwork` - Check if URL is on private network
- `isValidPipelineUrl` - Validate pipeline URL format
- `isValidUrl` - Validate URL format
- `isValidUuid` - Validate UUID format
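For illustration, a sketch of how a few of the validators above can be used to check LLM outputs before passing them on, assuming each takes a string and returns a boolean:

```typescript
import { isValidEmail, isValidUrl, isValidUuid } from '@promptbook/utils';

// Assumed: each validator takes a string and returns a boolean
console.log(isValidEmail('pavol@example.com')); // true
console.log(isValidUrl('https://ptbk.io')); // true
console.log(isValidUuid('not-a-uuid')); // false
```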
> 💡 This package provides utility functions for promptbook applications. For the core functionality, see `@promptbook/core` or install all packages with `npm i ptbk`.
---
The rest of the documentation is common for the entire Promptbook ecosystem:
Nowadays, the biggest challenge for most business applications isn't the raw capabilities of AI models. Large language models such as GPT-5.2 and Claude-4.5 are incredibly capable.
The main challenge lies in managing the context, providing rules and knowledge, and narrowing the personality.
In Promptbook, you can define your context using simple Books that are very explicit, easy to understand and write, reliable, and highly portable.
_(Example Book: "Paul Smith")_
We have created a language called Book, which allows you to write AI agents in their native language and create your own AI persona. Book provides a guided way to define all of an agent's traits and commitments.
You can look at it as "prompting" _(or writing a system message)_, but decorated with commitments.
Commitments are special syntax elements that define contracts between you and the AI agent. They are transformed by the Promptbook Engine into low-level parameters such as which model to use, its temperature, the system message, the RAG index, MCP servers, and many other parameters. For some commitments _(for example the RULE commitment)_, the Promptbook Engine can even create adversary agents and extra checks to enforce the rules.
#### Persona commitment
The Persona commitment defines the character of your AI persona, its role, and how it should interact with users. It sets the tone and style of communication.
_(Example Book: "Paul Smith & Associés")_
#### Knowledge commitment
Knowledge Commitment allows you to provide specific information, facts, or context that the AI should be aware of when responding.
This can include domain-specific knowledge, company policies, or any other relevant information.
The Promptbook Engine automatically enforces this knowledge during interactions. When the knowledge is short enough, it is included directly in the prompt; when it is too long, it is stored in a vector database and retrieved via RAG when needed. You don't need to manage any of this yourself.
_(Example Book: "Paul Smith & Associés")_
#### Rule commitment
Rules enforce specific behaviors or constraints on the AI's responses. This can include ethical guidelines, communication styles, or any other rules you want the AI to follow.
Depending on the rule's strictness, Promptbook will either propagate it into the prompt or use other techniques, such as an adversary agent, to enforce it.
_(Example Book: "Paul Smith & Associés")_
#### Team commitment
The Team commitment lets you define the team structure and the fellow advisory members the AI can consult. This allows the AI to simulate collaboration and consultation with other experts, enhancing the quality of its responses.
_(Example Book: "Paul Smith & Associés")_
The Promptbook project is an ecosystem of multiple projects and tools. The following is a list of the most important pieces of the project:
| Project | About |
|---|---|
| Agents Server | The place where your AI agents "live". It allows you to create, manage, deploy, and interact with AI agents written in the Book language. |
| Book language | A human-friendly, high-level language that abstracts away the low-level details of AI. It lets you focus on the personality, behavior, knowledge, and rules of AI agents rather than on models, parameters, and prompt engineering. There is also a VS Code plugin that supports the `.book` file extension. |
| Promptbook Engine | The engine that runs AI agents written in the Book language. It is released as multiple NPM packages, and the Promptbook Agent Server is available as a Docker image; the Agents Server is built on top of the Promptbook Engine. |
Join our growing community of developers and users:
| Platform | Description |
|---|---|
| 💬 Discord | Join our active developer community for discussions and support |
| 🗣️ GitHub Discussions | Technical discussions, feature requests, and community Q&A |
| 🔗 LinkedIn | Professional updates and industry insights |
| 📱 Facebook | General announcements and community engagement |
| 🌐 ptbk.io | Official landing page with project information |
#### Promptbook.studio
| 📸 Instagram @promptbook.studio | Visual updates, UI showcases, and design inspiration |
See detailed guides and API reference in the docs or online.
For information on reporting security vulnerabilities, see our Security Policy.
This library is divided into several packages, all published from a single monorepo.
You can install all of them at once:
```bash
npm i ptbk
```
Or you can install them separately:
> ⭐ Marked packages are worth trying first
- ⭐ ptbk - Bundle of all packages, when you want to install everything and you don't care about the size
- promptbook - Same as ptbk
- ⭐🧙‍♂️ @promptbook/wizard - Wizard to just run the books in Node.js without any struggle
- @promptbook/core - Core of the library, it contains the main logic for promptbooks
- @promptbook/node - Core of the library for Node.js environment
- @promptbook/browser - Core of the library for browser environment
- ⭐ @promptbook/utils - Utility functions used in the library but also useful for individual use in preprocessing and postprocessing LLM inputs and outputs
- @promptbook/markdown-utils - Utility functions used for processing markdown
- _(Not finished)_ @promptbook/wizard - Wizard for creating+running promptbooks in single line
- @promptbook/javascript - Execution tools for javascript inside promptbooks
- @promptbook/openai - Execution tools for OpenAI API, wrapper around OpenAI SDK
- @promptbook/anthropic-claude - Execution tools for Anthropic Claude API, wrapper around Anthropic Claude SDK
- @promptbook/vercel - Adapter for Vercel functionalities
- @promptbook/google - Integration with Google's Gemini API
- @promptbook/deepseek - Integration with DeepSeek API
- @promptbook/ollama - Integration with Ollama API
- @promptbook/azure-openai - Execution tools for Azure OpenAI API
- @promptbook/fake-llm - Mocked execution tools for testing the library and saving the tokens
- @promptbook/remote-client - Remote client for remote execution of promptbooks
- @promptbook/remote-server - Remote server for remote execution of promptbooks
- @promptbook/pdf - Read knowledge from `.pdf` documents
- @promptbook/markitdown - Integration of Markitdown by Microsoft
- @promptbook/documents - Read knowledge from documents like `.docx`, `.odt`, …
- @promptbook/legacy-documents - Read knowledge from legacy documents like `.doc`, `.rtf`, …
- @promptbook/website-crawler - Crawl knowledge from the web
- @promptbook/editable - Editable book as native javascript object with imperative object API
- @promptbook/templates - Useful templates and examples of books which can be used as a starting point
- @promptbook/types - Just typescript types used in the library
- @promptbook/color - Color manipulation library
- ⭐ @promptbook/cli - Command line interface utilities for promptbooks
- 🐋 Docker image - Promptbook server
The following glossary is used to clarify certain concepts:
- Prompt drift is a phenomenon where the AI model starts to generate outputs that are not aligned with the original prompt. This can happen due to the model's training data, the prompt's wording, or the model's architecture.
- Pipeline, workflow scenario or chain is a sequence of tasks that are executed in a specific order. In the context of AI, a pipeline can refer to a sequence of AI models that are used to process data.
- Fine-tuning is a process where a pre-trained AI model is further trained on a specific dataset to improve its performance on a specific task.
- Zero-shot learning is a machine learning paradigm where a model is trained to perform a task without any labeled examples. Instead, the model is provided with a description of the task and is expected to generate the correct output.
- Few-shot learning is a machine learning paradigm where a model is trained to perform a task with only a few labeled examples. This is in contrast to traditional machine learning, where models are trained on large datasets.
- Meta-learning is a machine learning paradigm where a model is trained on a variety of tasks and is able to learn new tasks with minimal additional training. This is achieved by learning a set of meta-parameters that can be quickly adapted to new tasks.
- Retrieval-augmented generation is a machine learning paradigm where a model generates text by retrieving relevant information from a large database of text. This approach combines the benefits of generative models and retrieval models.
- Longtail refers to non-common or rare events, items, or entities that are not well-represented in the training data of machine learning models. Longtail items are often challenging for models to predict accurately.
_Note: This section is not a complete dictionary, but rather a list of general AI / LLM terms that have a connection with Promptbook._
- 📚 Collection of pipelines
- 🎯 Pipeline
- 🏃‍♂️ Tasks and pipeline sections
- 🤼 Personas
- ⭐ Parameters
- 🚀 Pipeline execution
- 🧪 Expectations - Define what outputs should look like and how they're validated
- ✂️ Postprocessing - How outputs are refined after generation
- 🔣 Words not tokens - The human-friendly way to think about text generation
- ⏯️ Separation of concerns - How Book language organizes different aspects of AI workflows
_(Overview of the four capability areas: Data & Knowledge Management, Pipeline Control, Language & Output Control, and Advanced Generation.)_
#### When to use Promptbook?

- When you are writing an app that generates complex things via LLM - like websites, articles, presentations, code, stories, songs, ...
- When you want to separate code from text prompts
- When you want to describe complex prompt pipelines and don't want to do it in the code
- When you want to orchestrate multiple prompts together
- When you want to reuse parts of prompts in multiple places
- When you want to version your prompts and test multiple versions
- When you want to log the execution of prompts and backtrace the issues
#### When not to use Promptbook?

- When you have already implemented a single simple prompt and it works fine for your job
- When OpenAI Assistant (GPTs) is enough for you
- When you need streaming _(this may be implemented in the future, see discussion)_.
- When you need to use something other than JavaScript or TypeScript _(other languages are on the way, see the discussion)_
- When your main focus is on something other than text - like images, audio, video, spreadsheets _(other media types may be added in the future, see discussion)_
- When you need to use recursion _(see the discussion)_
- 🤸‍♂️ Iterations not working yet
- ⤵️ Imports not working yet
- ➿ No recursion
- 🐳 There are no types, just strings
If you have a question, start a discussion, open an issue, or write me an email.
- ❓ Why not just use the OpenAI SDK / Anthropic Claude SDK / ...?
- ❓ How is it different from OpenAI's GPTs?
- ❓ How is it different from Langchain?
- ❓ How is it different from DSPy?
- ❓ How is it different from _anything_?
- ❓ Is Promptbook using RAG _(Retrieval-Augmented Generation)_?
- ❓ Is Promptbook using function calling?
See CHANGELOG.md
This project is licensed under BUSL 1.1.
We welcome contributions! See CONTRIBUTING.md for guidelines.
You can also ⭐ star the project and follow us on GitHub or various other social networks. We are open to pull requests, feedback, and suggestions.
Need help with Book language? We're here for you!
- 💬 Join our Discord community for real-time support
- 📖 Browse our GitHub discussions for FAQs and community knowledge
- 🐛 Report issues for bugs or feature requests
- 🌐 Visit ptbk.io for more resources and documentation
- 📧 Contact us directly through the channels listed in our signpost
We welcome contributions and feedback to make Book language better for everyone!