Showing 1-20 of 448,754 packages
OpenAI Function Calling in TypeScript using Zod
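Packages like this one typically turn a Zod schema into the JSON-Schema tool definition that OpenAI's Chat Completions API expects. A minimal, dependency-free sketch of that target shape (the schema is hand-written here in place of a Zod conversion, and `get_weather` is an illustrative example, not part of any listed package):

```typescript
// The tool-definition shape OpenAI's Chat Completions API accepts for
// function calling. A Zod schema would normally be converted into the
// `parameters` JSON Schema; it is written by hand here for illustration.
type ToolDefinition = {
  type: "function";
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>; // JSON Schema object
  };
};

const getWeatherTool: ToolDefinition = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the current weather for a city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" },
        unit: { type: "string", enum: ["celsius", "fahrenheit"] },
      },
      required: ["city"],
    },
  },
};

console.log(getWeatherTool.function.name); // → get_weather
```

The value of the Zod approach is that the same schema both validates the model's tool-call arguments at runtime and generates this JSON Schema, so the two can never drift apart.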
Together AI provider integration for Metorial - enables using Metorial tools with Together AI's language models through function calling.
Agentic AI Library specialized in LLM Function Calling
DeepSeek provider integration for Metorial - enables using Metorial tools with DeepSeek's language models through function calling.
OpenAI provider integration for Metorial - enables using Metorial tools with OpenAI's GPT models through function calling.
Mistral AI provider integration for Metorial - enables using Metorial tools with Mistral's language models through function calling.
Google (Gemini) provider integration for Metorial - enables using Metorial tools with Google's Gemini models through function calling.
Universal OpenAPI to LLM function calling schemas. Transform any Swagger/OpenAPI document into type-safe schemas for OpenAI, Claude, Qwen, and more.
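The core mapping such a converter performs is mechanical: each OpenAPI operation becomes one tool definition, with its parameters folded into a single JSON Schema object. A simplified sketch under assumed, minimal input types (the names here are illustrative, not any particular library's API):

```typescript
// Minimal slice of an OpenAPI operation, enough to show the mapping.
interface OpenApiParameter {
  name: string;
  description?: string;
  required?: boolean;
  schema: { type: string };
}

interface OpenApiOperation {
  operationId: string;
  summary?: string;
  parameters?: OpenApiParameter[];
}

// Convert one OpenAPI operation into an OpenAI-style tool definition:
// operationId -> function name, summary -> description, parameters ->
// one JSON Schema "object" with a `required` list.
function operationToTool(op: OpenApiOperation) {
  const properties: Record<string, unknown> = {};
  const required: string[] = [];
  for (const p of op.parameters ?? []) {
    properties[p.name] = { type: p.schema.type, description: p.description };
    if (p.required) required.push(p.name);
  }
  return {
    type: "function" as const,
    function: {
      name: op.operationId,
      description: op.summary ?? "",
      parameters: { type: "object", properties, required },
    },
  };
}

const tool = operationToTool({
  operationId: "listPets",
  summary: "List all pets",
  parameters: [{ name: "limit", required: false, schema: { type: "integer" } }],
});
console.log(tool.function.name); // → listPets
```

Real converters additionally resolve `$ref`s, merge request bodies into the parameter schema, and handle provider-specific schema dialects (OpenAI, Claude, Qwen, etc.); the sketch above shows only the happy path.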
xAI (Grok) provider integration for Metorial - enables using Metorial tools with xAI's Grok models through function calling.
OpenAI function calling tools
Memory tools for AI SDK and OpenAI function calling with supermemory
Function JSON schema for function calling.
The OpenAI provider for Composio SDK, providing seamless integration with OpenAI's models and function calling capabilities.
Extensible framework for creating and managing LLM function calling
Convert a **stream of tokens** into a **parsable JSON** object before the stream ends; implement **streaming UI** in **LLM**-based AI applications; leverage **OpenAI Function Calling** for early stream processing; parse a **JSON stream** into distinct
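The technique behind streaming-JSON packages like this one is to repair a truncated JSON prefix so it can be parsed before the stream finishes. A naive sketch of the idea (real streaming parsers handle dangling keys, trailing commas, and numbers cut mid-token; this only closes open strings, objects, and arrays):

```typescript
// Close any open strings/objects/arrays in a partial JSON stream so the
// prefix can be JSON.parse'd before the stream completes.
function completePartialJson(partial: string): unknown {
  const stack: string[] = []; // closers we still owe, in open order
  let inString = false;
  let escaped = false;
  for (const ch of partial) {
    if (inString) {
      if (escaped) escaped = false;
      else if (ch === "\\") escaped = true;
      else if (ch === '"') inString = false;
    } else if (ch === '"') inString = true;
    else if (ch === "{") stack.push("}");
    else if (ch === "[") stack.push("]");
    else if (ch === "}" || ch === "]") stack.pop();
  }
  let fixed = partial;
  if (inString) fixed += '"'; // terminate a string cut mid-token
  while (stack.length) fixed += stack.pop(); // close containers inside-out
  return JSON.parse(fixed);
}

const obj = completePartialJson('{"name": "Ada", "tags": ["ai"');
console.log(JSON.stringify(obj)); // → {"name":"Ada","tags":["ai"]}
```

Run on every chunk, this yields a best-effort object at each step, which is what lets a streaming UI render function-call arguments while the model is still emitting them.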
The Stripe Agent Toolkit enables popular agent frameworks including LangChain and Vercel's AI SDK to integrate with Stripe APIs through function calling. It also provides tooling to quickly integrate metered billing for prompt and completion token usage.
TypeScript LLM client with streaming tool execution. Tools fire mid-stream. Built-in function calling works with any model—no structured outputs or native tool support required.