A TypeScript client for the Apiary LLM Gateway
Apiary is an open-source LLM Gateway for managing requests across leading providers. It routes all LLM requests through a single API endpoint, providing unified access to multiple models, centralized management of features like routing, caching, and guardrails, and built-in observability.
Use the Apiary TypeScript SDK in your applications to send requests to multiple LLM providers:
* Anthropic
* Gemini
* OpenAI
The deployment process will provide an AiGatewayRestApiEndpoint URL and API key that you can use
to send LLM requests with the Apiary SDK.
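One common way to keep these values out of your source code is to load them from environment variables. The snippet below is a minimal sketch; the variable names `APIARY_API_KEY` and `APIARY_BASE_URL` are illustrative and not defined by the SDK.

```ts
// A minimal sketch: read the deployment outputs from environment variables.
// APIARY_API_KEY and APIARY_BASE_URL are example names, not part of the SDK.
const apiKey = process.env.APIARY_API_KEY;
const baseUrl = process.env.APIARY_BASE_URL;

if (!apiKey || !baseUrl) {
  throw new Error('Set APIARY_API_KEY and APIARY_BASE_URL before running');
}
```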
To install the Apiary SDK, run the following command:
```bash
npm install @apiary-gateway/sdk
```
Import the Apiary client and initialize an instance with your configuration details.
Call the chatCompletion method with a prompt (required). A threadID may also be provided to include conversation context with the prompt.
Optionally, specify the LLM provider and model to override the defaults from your configuration.
```ts
import Apiary from '@apiary-gateway/sdk';

// Initialize the Apiary client with the AiGatewayRestApiEndpoint URL and API key from your deployment
const client = new Apiary({
  apiKey: '<YOUR_API_KEY>',
  baseUrl: 'https://<AiGatewayRestApiEndpoint>',
  // Optionally include a userId to partition cached responses by userId
  userId: '<YOUR_USER_ID>',
  // Optionally include a metadata object with your custom routing configuration groups
  metadata: {
    'user-type': 'pro-users',
  },
});

const main = async () => {
  const completion = await client.chatCompletion(
    // Example prompt
    'Do all bees make honey?',
    // Optionally include a threadID to manage conversation context
    '1745267450008',
    // Optionally specify the provider and model to override your configuration defaults
    'openai',
    'gpt-4',
  );

  // Log the returned LLM response
  console.log(completion.response.text);

  // Log the threadID for the returned LLM response
  console.log(completion.threadID);
};

main();
```
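To continue a conversation across requests, the threadID returned by one call can be passed as the optional second argument of the next call. The sketch below assumes the same positional argument order shown above and uses placeholder configuration values.

```ts
import Apiary from '@apiary-gateway/sdk';

const client = new Apiary({
  apiKey: '<YOUR_API_KEY>',
  baseUrl: 'https://<AiGatewayRestApiEndpoint>',
});

const conversation = async () => {
  // First request: no threadID, so the gateway starts a new conversation thread
  const first = await client.chatCompletion('Do all bees make honey?');
  console.log(first.response.text);

  // Follow-up request: reuse the returned threadID so the gateway
  // includes the earlier exchange as conversation context
  const followUp = await client.chatCompletion(
    'Which species produce the most?',
    first.threadID,
  );
  console.log(followUp.response.text);
};

conversation();
```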