Showing 1-20 of 27 packages
Provide a better streaming API for your module.
streamin
LexSocket TED MCP Server: Real-time access to EU public procurement opportunities from Tenders Electronic Daily (TED) — the official Supplement to the Official Journal of the European Union. Provides AI agents with hallucination-free, low-latency streaming…
VK Streaming API client for Node
Buffer streaming JSON
A module for using the Tesla streaming API
Effortlessly stream and present your Markdown slides live in-browser with MarkDownLive. Ideal for educators, presenters, and anyone needing to create dynamic, real-time presentations. Simply write in Markdown, and let MarkDownLive handle the live streaming.
A code and code-diff prettyprinter for the terminal, powered by Shiki
A high-performance markdown parser with MDC (Markdown Components) support, offering both string-based and streaming APIs.
Search for a pattern in a stream as fast as JavaScriptly possible.
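The hard part of searching a stream is a match that straddles two chunks. As a rough sketch of that idea only (the function and parameter names below are illustrative, not this package's actual API):

```ts
// Minimal sketch: find `pattern` anywhere in a sequence of chunks, including
// matches split across a chunk boundary, by carrying over the tail of each chunk.
function findInStream(chunks: Iterable<Buffer>, pattern: Buffer): boolean {
  let carry = Buffer.alloc(0);
  for (const chunk of chunks) {
    const haystack = Buffer.concat([carry, chunk]);
    if (haystack.includes(pattern)) return true;
    // Keep the last pattern.length - 1 bytes: they could be the start of a
    // match that finishes in the next chunk.
    carry = haystack.subarray(Math.max(0, haystack.length - pattern.length + 1));
  }
  return false;
}
```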
Time-based Streaming Chunking
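As a rough illustration of what time-based chunking means (a minimal sketch with assumed names, not this package's actual API), a Node `Transform` stream can buffer incoming writes and flush them downstream on a fixed interval:

```ts
import { Transform, TransformCallback } from "node:stream";

// Sketch: accumulate whatever arrives and emit it as one combined chunk
// every `intervalMs` milliseconds, regardless of how the input was split.
class TimeChunker extends Transform {
  private pending: Buffer[] = [];
  private timer: NodeJS.Timeout;

  constructor(intervalMs: number) {
    super();
    this.timer = setInterval(() => this.emitPending(), intervalMs);
  }

  private emitPending(): void {
    if (this.pending.length > 0) {
      this.push(Buffer.concat(this.pending)); // one time-based chunk downstream
      this.pending = [];
    }
  }

  _transform(chunk: Buffer, _enc: BufferEncoding, cb: TransformCallback): void {
    this.pending.push(chunk); // hold data until the next timer tick
    cb();
  }

  _flush(cb: TransformCallback): void {
    clearInterval(this.timer); // emit any remainder when the source ends
    this.emitPending();
    cb();
  }
}

// Example usage: process.stdin.pipe(new TimeChunker(1000)).pipe(process.stdout);
```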
`@chenchaolong/plugin-vllm7` provides a model adapter for connecting vLLM inference services to the [XpertAI](https://github.com/xpert-ai/xpert) platform. The plugin communicates with vLLM clusters via an OpenAI-compatible API, enabling agents to invoke c…
`@chenchaolong/plugin-vllm2` provides a model adapter for connecting vLLM inference services to the [XpertAI](https://github.com/xpert-ai/xpert) platform. The plugin communicates with vLLM clusters via an OpenAI-compatible API, enabling agents to invoke c…
`@chenchaolong/plugin-vllm6` provides a model adapter for connecting vLLM inference services to the [XpertAI](https://github.com/xpert-ai/xpert) platform. The plugin communicates with vLLM clusters via an OpenAI-compatible API, enabling agents to invoke c…
`@chenchaolong/plugin-vllm5` provides a model adapter for connecting vLLM inference services to the [XpertAI](https://github.com/xpert-ai/xpert) platform. The plugin communicates with vLLM clusters via an OpenAI-compatible API, enabling agents to invoke c…
> For Doing Cool & Idiomatic Things with Streams, Maybe
`@chenchaolong/plugin-vllm8` provides a model adapter for connecting vLLM inference services to the [XpertAI](https://github.com/xpert-ai/xpert) platform. The plugin communicates with vLLM clusters via an OpenAI-compatible API, enabling agents to invoke c…
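As a rough sketch of what the `@chenchaolong/plugin-vllm*` adapters above do under the hood (the base URL and model name here are placeholders, not values taken from the plugins), a vLLM server can be called through its OpenAI-compatible chat completions endpoint:

```ts
// Sketch: send a chat completion request to a vLLM server's
// OpenAI-compatible endpoint and return the assistant's reply.
async function chatWithVllm(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:8000/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "my-served-model", // whichever model the vLLM server is serving
      messages: [{ role: "user", content: prompt }],
      // set `stream: true` here for token-by-token server-sent events instead
    }),
  });
  if (!response.ok) throw new Error(`vLLM request failed: ${response.status}`);
  const data = await response.json();
  // OpenAI-compatible responses put the reply at choices[0].message.content.
  return data.choices[0].message.content;
}
```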