# json-response-stream

Utility function for streaming JSON from LLM libraries such as Genkit.

A tiny, powerful utility for streaming JSON from API responses, LLMs (like Genkit), and anywhere else you're getting a stream of JSON objects.
- 🧩 Parse multiple JSON objects from a stream
- 🔄 Handle partial JSON data across multiple chunks
- 🔁 Skip duplicate objects automatically
- 🛡️ Safely parse without throwing errors
- 🧵 Works great with fetch, LLM APIs, or any ReadableStream
```bash
npm install json-response-stream
# or
yarn add json-response-stream
# or
pnpm add json-response-stream
```
```typescript
import { jsonParser } from 'json-response-stream'

// Let's fetch some streaming data!
const response = await fetch('https://api.example.com/stream')

// The magic happens here ✨
for await (const data of jsonParser(response.body)) {
  console.log('Got new data:', data)
  // Do something cool with each JSON object as it arrives
}
```
```typescript
import { jsonParser } from 'json-response-stream'

interface User {
  id: number
  name: string
  role: string
}

const response = await fetch('https://api.example.com/users/stream')

// Type safety! 🛡️
for await (const user of jsonParser<User>(response.body)) {
  console.log(`Welcome, ${user.name} the ${user.role}!`)
}
```
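And here's an end-to-end example with Genkit: serialize each chunk to JSON on the server, then parse the stream back on the client (`ai` is your configured Genkit instance; `sendStream` stands in for whatever your server framework uses to return a stream, e.g. h3's):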
```typescript
// Server side:
const { stream } = await ai.generateStream('Suggest a complete menu for a pirate themed restaurant.')

// Stringify each chunk so the client receives a stream of JSON strings
const transformedStream = ReadableStream.from(stream).pipeThrough(
  new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(JSON.stringify(chunk))
    }
  })
)

// Send the stream to the client
sendStream(event, transformedStream)

// Client side:
const response = await fetch('/api/genkit/stream')

// Process each chunk as it comes in!
for await (const chunk of jsonParser(response.body)) {
  console.log(chunk.text)
}
```
#### jsonParser
The star of the show! Creates an async iterator that processes a stream and yields JSON objects as they become available.
```typescript
// Advanced example with fetch and AbortController
const controller = new AbortController()
const response = await fetch('https://api.example.com/stream', {
  signal: controller.signal
})

setTimeout(() => controller.abort(), 5000) // Cancel after 5 seconds

try {
  for await (const data of jsonParser(response.body)) {
    console.log(data)
  }
} catch (error) {
  if (error.name === 'AbortError') {
    console.log('Stream reading was cancelled! 🛑')
  } else {
    console.error('Error processing stream:', error)
  }
}
```
#### jsonTransformStream
Creates a TransformStream for processing JSON chunks. Useful if you're working directly with the Streams API.
```typescript
import { jsonTransformStream } from 'json-response-stream'

const transformedStream = someReadableStream
  .pipeThrough(new TextDecoderStream())
  .pipeThrough(jsonTransformStream())
```
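If you'd rather pull from the result yourself instead of using the async iterator, here's a minimal sketch using the standard Streams API (assuming the transform emits already-parsed objects):

```typescript
// Read parsed objects off the transformed stream one at a time
const reader = transformedStream.getReader()

while (true) {
  const { done, value } = await reader.read()
  if (done) break
  console.log('Parsed object:', value)
}
```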
#### safeParse
A JSON.parse wrapper that never throws - returns null on invalid JSON.
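A quick sketch of the behavior (assuming safeParse is exported alongside jsonParser):

```typescript
import { safeParse } from 'json-response-stream'

// Well-formed JSON parses as usual
console.log(safeParse('{"status":"ok"}')) // { status: 'ok' }

// A truncated chunk would make JSON.parse throw; safeParse returns null
console.log(safeParse('{"status":"o')) // null
```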
#### hashString(str: string): string
A simple string hashing function used internally to detect duplicates.
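For illustration, a small sketch assuming hashString is exported (it's primarily an internal helper):

```typescript
import { hashString } from 'json-response-stream'

// Equal strings hash to equal values, which is how the stream
// parser recognizes that an incoming object is a duplicate.
const a = hashString('{"id":1}')
const b = hashString('{"id":1}')
console.log(a === b) // true
console.log(a === hashString('{"id":2}')) // false (barring collisions in a simple hash)
```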
Streaming APIs are awesome, but dealing with chunked JSON can be a pain. This library makes it seamless to:
- Process large datasets without waiting for the entire response
- Handle real-time updates from LLMs and other streaming APIs
- Avoid the headaches of parsing partial JSON chunks
Contributions welcome! Open an issue or PR on GitHub.
MIT - do awesome things with it!
---
1. Update the version in `package.json`
2. Run `npm run build && npm publish`
Made with ❤️ by Jamie Curnow