TypeScript client for the Hugging Face Inference Providers and Inference Endpoints
Chatbot maker for the Hugging Face Inference API and other AI API providers and backends.
The **[Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index)** provider for the [Vercel AI SDK](https://ai-sdk.dev/docs) adds language model support for thousands of models through multiple inference providers.
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
Utilities to interact with the Hugging Face Hub
A minimalistic JavaScript implementation of the Jinja templating engine, specifically designed for parsing and rendering ML chat templates.
OpenAI adapters convert an OpenAI-compatible request to a request for another API and back.
The official TypeScript library for the Cerebras API
MediaPipe GenAI Tasks
Inference API for Azure-supported AI models
Complete WASM toolkit for edge AI: vector search, graph DB, neural networks, DAG workflows, SQL/SPARQL/Cypher, and ONNX inference, all running in the browser
The **Cerebras provider** for the [AI SDK](https://ai-sdk.dev/docs) contains language model support for [Cerebras](https://cerebras.ai), offering high-speed AI model inference powered by Cerebras Wafer-Scale Engines and CS-3 systems.
Hardware-accelerated language model chat in the browser
Client for the Model Context Protocol
Tagged template literal for Sanity.io GROQ-queries
[NPM Package](https://www.npmjs.com/package/@mlc-ai/web-tokenizers) · [WebLLM](https://github.com/mlc-ai/web-llm)
TypeScript-first schema declaration and validation library with static type inference
OCI Node.js client for the Generative AI Inference Service