Showing 1-20 of 52 packages
WebLLM client library for browser integration
| [NPM Package](https://www.npmjs.com/package/@mlc-ai/web-tokenizers) | [WebLLM](https://github.com/mlc-ai/web-llm) |
WebLLM provider for ActosVoice
WebLLM provider for Vercel AI SDK v5+ (High-performance in-browser LLM inference)
Verified, resumable model loading for WebLLM. Integrity verification for AI models in the browser.
Hardware-accelerated language model chats in the browser
High-performance in-browser LLM inference for Vue.js using WebLLM and WebGPU
A customizable, Fluent UI–based AI chat widget for React applications. Supports light/dark themes, persistent chat history, maintenance mode, draggable UI, and pluggable AI engines (WebLLM, OpenAI, etc.).
**Bringing the power of Large Language Models (LLMs) directly to your Angular applications with WebLLM.**
OpenAI-compatible middleware for running WebLLM models locally with offline support
WebLLM provider for @localmode - LLM inference with quantized models
createAIContext() with Fake + WebLLM adapters
AI-focused UI components for building LLM-powered interfaces with WebLLM
WebLLM provider for Vercel AI SDK (V2 specification)
A TypeScript-first library for programmatic browser control, designed for building AI-powered web agents.