Showing 1-20 of 51,207 packages
Hazelcast - a real-time stream processing platform - Node.js Client
An extension to Azure Communication Services Calling Web SDK that provides stream processing features.
Stream processing for JS simplified
- Convert a **stream of tokens** into a **parsable JSON** object before the stream ends. - Implement **Streaming UI** in **LLM**-based AI applications. - Leverage **OpenAI Function Calling** for early stream processing. - Parse a **JSON stream** into distinct
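To illustrate the idea behind this kind of package, here is a minimal sketch (not this package's API): after every streamed token it repairs the still-incomplete JSON buffer by closing any open string and containers, so the UI can render a partial object before the stream ends.

```js
// Hypothetical sketch: best-effort parse of a partial JSON buffer.
function tryParsePartialJson(buffer) {
  const stack = [];
  let inString = false, escaped = false;
  for (const ch of buffer) {
    if (escaped) { escaped = false; continue; }
    if (ch === '\\' && inString) { escaped = true; continue; }
    if (ch === '"') { inString = !inString; continue; }
    if (inString) continue;
    if (ch === '{' || ch === '[') stack.push(ch === '{' ? '}' : ']');
    else if (ch === '}' || ch === ']') stack.pop();
  }
  // Close the open string and containers, then attempt a normal parse.
  const repaired = buffer + (inString ? '"' : '') + stack.reverse().join('');
  try { return JSON.parse(repaired); } catch { return undefined; }
}

let buffer = '';
for (const token of ['{"answer": "Str', 'eaming', ' UI"', '}']) {
  buffer += token;
  const snapshot = tryParsePartialJson(buffer);
  if (snapshot !== undefined) console.log(snapshot); // progressively richer objects
}
```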
Lightning-fast file type detection using magic bytes (file signatures) with a focus on stream processing and minimal memory usage
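The general technique these detectors use is simple: read only the first bytes of a stream and compare them against known signatures. A hypothetical sketch (not this package's API) with a handful of well-known magic numbers:

```js
// Detect a file type from the first bytes of a stream without buffering the whole file.
const { createReadStream } = require('node:fs');

const SIGNATURES = [
  { type: 'png',  bytes: [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a] },
  { type: 'jpeg', bytes: [0xff, 0xd8, 0xff] },
  { type: 'gif',  bytes: [0x47, 0x49, 0x46, 0x38] }, // "GIF8"
  { type: 'pdf',  bytes: [0x25, 0x50, 0x44, 0x46] }, // "%PDF"
];

async function detectType(stream) {
  for await (const chunk of stream) {
    // Assume the first chunk covers the longest signature (8 bytes here).
    const match = SIGNATURES.find(({ bytes }) => bytes.every((b, i) => chunk[i] === b));
    stream.destroy(); // only the header is needed; stop reading
    return match ? match.type : 'unknown';
  }
  return 'unknown';
}

detectType(createReadStream('photo.jpg')).then(console.log); // e.g. "jpeg"
```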
Audio/video stream processing library for the JavaScript world
Distributed stream processing engine.
A modern type-safe stream processing library inspired by JavaScript Generator, Java Stream, and MySQL Index. Supports lazy evaluation, async streams, statistics, and IO-like operations.
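Lazy evaluation is the core idea behind generator-inspired stream libraries like this one: no element is computed until the pipeline is consumed. A minimal sketch using plain generators (not this package's API):

```js
// Hypothetical sketch of lazy evaluation with generators.
function* map(iterable, fn) { for (const x of iterable) yield fn(x); }
function* filter(iterable, pred) { for (const x of iterable) if (pred(x)) yield x; }
function* take(iterable, n) { for (const x of iterable) { if (n-- <= 0) return; yield x; } }

// An infinite source is fine because elements are pulled on demand.
function* naturals() { for (let i = 1; ; i++) yield i; }

const firstThreeEvenSquares =
  take(filter(map(naturals(), (n) => n * n), (n) => n % 2 === 0), 3);
console.log([...firstThreeEvenSquares]); // [4, 16, 36]
```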
Stream processing and buffer operations with size limit enforcement
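Size-limit enforcement on a stream usually amounts to a pass-through Transform that counts bytes and aborts the pipeline once a configured cap is exceeded. A minimal sketch of that idea, assuming a hypothetical `SizeLimit` helper rather than this package's API:

```js
const { Transform } = require('node:stream');

// Pass data through unchanged, but fail the pipeline past `maxBytes`.
class SizeLimit extends Transform {
  constructor(maxBytes) {
    super();
    this.maxBytes = maxBytes;
    this.seen = 0;
  }
  _transform(chunk, _encoding, callback) {
    this.seen += chunk.length;
    if (this.seen > this.maxBytes) {
      callback(new Error(`size limit of ${this.maxBytes} bytes exceeded`));
    } else {
      callback(null, chunk);
    }
  }
}

// Usage sketch: cap an upload at 1 MiB.
// pipeline(request, new SizeLimit(1024 * 1024), fs.createWriteStream('upload.bin'), cb);
```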
A library providing an expression language to compose and evaluate stream processing
Stream processing framework for creating distributed data pipelines and autonomic neural networks
Declarative event stream processing library
🚀 Probably the best Node.js HTTP request component. It also contains a rich stream processing
CLI for Tree stream processing.
A collection of type-safe stream helpers built on top of [Effection](https://github.com/thefrontside/effection) for efficient and controlled stream processing.
stream-json is a micro-library of Node.js stream components for creating custom JSON processing pipelines with a minimal memory footprint. It can parse JSON files far exceeding available memory by streaming individual primitives using a SAX-inspired API.
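A typical usage sketch based on stream-json's documented `parser` and `StreamArray` helpers; the file name and logging are illustrative:

```js
// Stream a huge JSON array of records without holding the whole file in memory.
const fs = require('node:fs');
const { parser } = require('stream-json');
const { streamArray } = require('stream-json/streamers/StreamArray');

fs.createReadStream('huge-array.json')   // e.g. [{...}, {...}, ...], many GB
  .pipe(parser())                        // SAX-like token stream
  .pipe(streamArray())                   // reassembles one array element at a time
  .on('data', ({ key, value }) => {
    // `key` is the array index, `value` is the fully parsed element.
    if (key % 100000 === 0) console.log(key, value);
  })
  .on('end', () => console.log('done'));
```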
SQL is designed for managing or stream-processing data in an RDBMS.
Chain functions as transform streams.
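The idea can be sketched with nothing but Node's built-in stream module: wrap each plain function in an object-mode Transform and compose the wrappers with `pipeline()`. Names here are illustrative, not this package's API:

```js
const { Transform, Writable, Readable, pipeline } = require('node:stream');

// Turn a synchronous function into a pass-through Transform stage.
const asTransform = (fn) => new Transform({
  objectMode: true,
  transform(chunk, _enc, callback) { callback(null, fn(chunk)); },
});

pipeline(
  Readable.from(['  hello ', ' world  ']),
  asTransform((s) => s.trim()),
  asTransform((s) => s.toUpperCase()),
  new Writable({
    objectMode: true,
    write(chunk, _enc, cb) { console.log(chunk); cb(); }, // HELLO, WORLD
  }),
  (err) => { if (err) console.error(err); },
);
```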
An npm library for creating and managing Remote Procedure Call (RPC) services, handling various data transformations, stream processing, and context defaults. It provides a structured approach to implementing RPC services