An experimental and versatile library for exploring LLM-assisted metaprogramming.
```bash
npm install metaprog
```

[![Contributors][contributors-shield]][contributors-url]
[![Forks][forks-shield]][forks-url]
[![Stargazers][stars-shield]][stars-url]
[![Issues][issues-shield]][issues-url]

Explore docs » · Suggest a feature · Report a bug
_Metaprog_ is an AI metaprogramming library for TypeScript that lets you generate, validate, and test code using _LLMs_ at runtime. It provides a simple yet powerful builder API for describing the code you want to generate, and it automatically handles the interaction with the LLM, validation of the output, and testing of the generated code.
- On-demand function generation based on a function description
- Integration with LLMs from the _LangChain_ ecosystem
- Automatic caching of generated functions to avoid re-generation
- Automated test and re-prompt process if a generated function fails a user-supplied test case
- Strong type-safety and flexible configuration for input and output schemas using _Zod_
You'll need to install the Metaprog package, as well as LangChain and the LLM-specific package you want to use. For the rest of the guide, we'll use Anthropic's Claude 3.5 Sonnet model.
```bash
npm install metaprog @langchain/core @langchain/anthropic # or any other LLM provider
pnpm add metaprog @langchain/core @langchain/anthropic
yarn add metaprog @langchain/core @langchain/anthropic
```
Below is a simple (and extremely overkill) example demonstrating how to generate a function that logs "Hello world!" to the console.
```typescript
import { createMetaprogBuilder } from 'metaprog';
import { ChatAnthropic } from '@langchain/anthropic';

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
  apiKey: 'your_api_key_here',
});

const meta = createMetaprogBuilder({ model });

const func = await meta`Console log "Hello world!"`.build();

func(); // logs "Hello world!"
```
#### How It Works
1. You provide a textual description of what the function should do.
2. Metaprog sends this description (and optional schemas for input or output) to an LLM.
3. The LLM returns TypeScript code, which is then compiled and cached locally.
4. You can immediately invoke the compiled function within your application.
5. On subsequent runs, Metaprog checks the cache to avoid re-generation.
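For intuition, the whole loop can be condensed into a few lines of plain TypeScript. Everything below (the `generateCode` and `compileToFunction` helpers, and the in-memory `Map` standing in for the on-disk cache) is hypothetical and only mirrors the steps above; it is not Metaprog's actual implementation.

```typescript
// Hypothetical sketch of steps 1-5; not Metaprog's real internals.
type GeneratedFn = (...args: unknown[]) => unknown;

const cache = new Map<string, GeneratedFn>(); // stand-in for the on-disk cache

async function generateCode(description: string): Promise<string> {
  // Step 2: the real library sends the description (plus any schemas)
  // to the configured LLM, which returns TypeScript source.
  return `() => console.log(${JSON.stringify(description)})`;
}

function compileToFunction(source: string): GeneratedFn {
  // Step 3: stand-in for compiling the returned source into a callable.
  return eval(source) as GeneratedFn;
}

async function build(description: string): Promise<GeneratedFn> {
  const hit = cache.get(description); // step 5: reuse cached functions
  if (hit) return hit;

  const source = await generateCode(description); // steps 1 and 2
  const fn = compileToFunction(source);           // step 3
  cache.set(description, fn);
  return fn; // step 4: ready to invoke immediately
}
```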
To further constrain or validate your function's input and output, you can provide Zod schemas. These are used during the generation process and also strictly type the built function.
```typescript
import { z } from 'zod';
import { createMetaprogBuilder } from 'metaprog';
import { ChatAnthropic } from '@langchain/anthropic';

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
  apiKey: 'your_api_key_here',
});

const meta = createMetaprogBuilder({ model });

// Define input/output Zod schemas
const pathFinder = await meta`Get shortest path between two nodes on a graph given an adjacency matrix, a start node, and an end node.`
  .input(
    z.array(z.array(z.number())).describe('Adjacency matrix'),
    z.number().describe('Start node'),
    z.number().describe('End node'),
  )
  .output(z.number().describe('Shortest path length'))
  .build();

// The function is strictly typed as:
// (adjacencyMatrix: number[][], startNode: number, endNode: number) => number
pathFinder(
  [
    [0, 1, 7],
    [1, 2, 3],
    [5, 3, 4],
  ],
  0,
  2,
); // 4
```
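The implementation the LLM produces for this description is up to the model, but it would plausibly resemble a textbook Dijkstra traversal over the adjacency matrix (treating `0` as "no edge"). For comparison, here is a hand-written sketch of such an implementation; it is illustrative only, not actual Metaprog output:

```typescript
// Hand-written Dijkstra over an adjacency matrix (0 = no edge), shown only
// as a plausible shape for the generated code; not actual Metaprog output.
function shortestPath(matrix: number[][], start: number, end: number): number {
  const n = matrix.length;
  const dist: number[] = new Array(n).fill(Infinity);
  const visited: boolean[] = new Array(n).fill(false);
  dist[start] = 0;

  for (let i = 0; i < n; i++) {
    // Pick the unvisited node with the smallest tentative distance.
    let u = -1;
    for (let v = 0; v < n; v++) {
      if (!visited[v] && (u === -1 || dist[v] < dist[u])) u = v;
    }
    if (u === -1 || dist[u] === Infinity) break;
    visited[u] = true;

    // Relax every edge leaving u.
    for (let v = 0; v < n; v++) {
      if (matrix[u][v] > 0 && dist[u] + matrix[u][v] < dist[v]) {
        dist[v] = dist[u] + matrix[u][v];
      }
    }
  }
  return dist[end];
}

shortestPath(
  [
    [0, 1, 7],
    [1, 2, 3],
    [5, 3, 4],
  ],
  0,
  2,
); // 4
```

On the matrix above, going 0 → 1 (cost 1) and then 1 → 2 (cost 3) beats the direct 0 → 2 edge (cost 7), which is why the answer is 4.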
Metaprog can automatically run a test against the generated function. If the function fails, it will ask the LLM to fix the generated code and retry until it passes (up to a configurable number of retries).
```typescript
import { createMetaprogBuilder } from 'metaprog';
import { ChatAnthropic } from '@langchain/anthropic';

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
  apiKey: 'your_api_key_here',
});

const meta = createMetaprogBuilder({ model });

const addStrings = await meta`Add two numbers`
  .test((f) => f('1', '2') === 3) // If this fails, generation is retried
  .test((f) => f('-5', '15') === 10) // If this fails, generation is retried
  .build();

addStrings('1', '2'); // Guaranteed to be 3, as enforced by the first test
```
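Conceptually, the test-and-retry loop behaves like the sketch below. The `generate` callback, the feedback message format, and the `maxRetries` default are assumptions for illustration, not Metaprog's actual internals.

```typescript
// Hypothetical sketch of the test/re-prompt loop; not Metaprog's real code.
type AnyFn = (...args: any[]) => any;

async function buildWithTests(
  generate: (feedback?: string) => Promise<AnyFn>, // wraps the LLM call
  tests: Array<(f: AnyFn) => boolean>,
  maxRetries = 3, // stand-in for the configurable retry limit
): Promise<AnyFn> {
  let fn = await generate();
  for (let attempt = 0; ; attempt++) {
    const failing = tests.find((test) => !test(fn));
    if (!failing) return fn; // every test passed
    if (attempt >= maxRetries) {
      throw new Error('Generated function still fails tests after all retries');
    }
    // Re-prompt the LLM with the failing test so it can fix the code.
    fn = await generate(`This test failed: ${failing.toString()}`);
  }
}
```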
All generated functions are cached so that on subsequent runs the same function doesn't need to be regenerated unnecessarily. This reduces both latency and LLM usage costs. By default, generated files are stored under a `generated` folder, and metadata is stored in a JSON file.
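As a rough picture of what such a default scheme can look like, here is a self-contained sketch. The file layout, hash choice, and metadata fields are assumptions for illustration, not Metaprog's actual on-disk format.

```typescript
// Illustrative sketch of a description-keyed file cache; the layout and
// metadata fields are assumptions, not Metaprog's actual format.
import { createHash } from 'node:crypto';
import { existsSync, mkdirSync, readFileSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';

const CACHE_DIR = 'generated';
const META_FILE = join(CACHE_DIR, 'metadata.json');

// A stable hash of the description makes a good cache key.
function cacheKey(description: string): string {
  return createHash('sha256').update(description).digest('hex').slice(0, 16);
}

// Returns previously generated source, or undefined on a cache miss.
function readCachedCode(description: string): string | undefined {
  const file = join(CACHE_DIR, `${cacheKey(description)}.ts`);
  return existsSync(file) ? readFileSync(file, 'utf8') : undefined;
}

// Stores freshly generated source and records it in the metadata file.
function writeCachedCode(description: string, code: string): void {
  mkdirSync(CACHE_DIR, { recursive: true });
  writeFileSync(join(CACHE_DIR, `${cacheKey(description)}.ts`), code);
  const meta = existsSync(META_FILE)
    ? JSON.parse(readFileSync(META_FILE, 'utf8'))
    : {};
  meta[cacheKey(description)] = {
    description,
    createdAt: new Date().toISOString(),
  };
  writeFileSync(META_FILE, JSON.stringify(meta, null, 2));
}
```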
#### Custom Cache Handler
If you want more control over how or where functions are stored, implement the `CacheHandler` interface:
```typescript
import { CacheHandler, MetaprogFunctionBuilder } from 'metaprog';
import { ChatAnthropic } from '@langchain/anthropic';

class MyCustomCacheHandler implements CacheHandler {
  // Your cache handler code
}

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest',
  apiKey: 'your_api_key_here',
});

// Then provide it to MetaprogFunctionBuilder:
const myCustomCache = new MyCustomCacheHandler();
const myFunc = new MetaprogFunctionBuilder(
  'Some descriptive text',
  { model },
  myCustomCache,
);
```
Contributions are welcome! Feel free to submit issues or PRs on GitHub if you find bugs or want to propose new features.
This project is licensed under the MIT License. See the LICENSE file for details.
[contributors-shield]: https://img.shields.io/github/contributors/brainsaysno/metaprog.svg?style=for-the-badge
[contributors-url]: https://github.com/brainsaysno/metaprog/graphs/contributors
[forks-shield]: https://img.shields.io/github/forks/brainsaysno/metaprog.svg?style=for-the-badge
[forks-url]: https://github.com/brainsaysno/metaprog/network/members
[stars-shield]: https://img.shields.io/github/stars/brainsaysno/metaprog.svg?style=for-the-badge
[stars-url]: https://github.com/brainsaysno/metaprog/stargazers
[issues-shield]: https://img.shields.io/github/issues/brainsaysno/metaprog.svg?style=for-the-badge
[issues-url]: https://github.com/brainsaysno/metaprog/issues