# stream-chopper

Chop a single stream of data into a series of readable streams.
Stream Chopper is useful in situations where you have a stream of data
you want to chop up into smaller pieces, either based on time or size.
Each piece will be emitted as a readable stream (called output streams).
Possible use-cases include log rotation, splitting up large data sets,
or chopping up a live stream of data into finite chunks that can then be
stored.
Sometimes it's important to ensure that a chunk written to the input
stream isn't split up and divided over two output streams. Stream
Chopper allows you to specify the chopping algorithm used (via the
`type` option) when a chunk is too large to fit into the current output
stream.
By default a chunk too large to fit in the current output stream is
split between it and the next. Alternatively you can decide to either
allow the chunk to "overflow" the size limit, in which case it will be
written to the current output stream, or to "underflow" the size limit,
in which case the current output stream will be ended and the chunk
written to the next output stream.
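The three strategies described above can be illustrated with a small
self-contained sketch (illustrative only, not the library's
implementation), which decides how many bytes of a chunk go to the
current output stream and how many to the next one:

```js
// Illustrative sketch of the three chopping algorithms. `remaining` is
// the number of bytes still allowed in the current output stream.
function plan (type, chunkSize, remaining) {
  if (chunkSize <= remaining) return { current: chunkSize, next: 0 }
  switch (type) {
    case 'split': // write what fits now, remainder goes to the next stream
      return { current: remaining, next: chunkSize - remaining }
    case 'overflow': // write everything now, exceeding the size limit
      return { current: chunkSize, next: 0 }
    case 'underflow': // end the current stream, write everything to the next
      return { current: 0, next: chunkSize }
  }
}

console.log(plan('split', 40, 30))     // { current: 30, next: 10 }
console.log(plan('overflow', 40, 30))  // { current: 40, next: 0 }
console.log(plan('underflow', 40, 30)) // { current: 0, next: 40 }
```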
## Installation

```
npm install stream-chopper --save
```

## Usage

Example app:

```js
const StreamChopper = require('stream-chopper')

const chopper = new StreamChopper({
  size: 30,                    // chop stream when it reaches 30 bytes,
  time: 10000,                 // or when it's been open for 10s,
  type: StreamChopper.overflow // but allow stream to exceed size slightly
})

chopper.on('stream', function (stream, next) {
  console.log('>> Got a new stream! <<')
  stream.pipe(process.stdout)
  stream.on('end', next) // call next when you're ready to receive a new stream
})

chopper.write('This write contains more than 30 bytes\n')
chopper.write('This write contains less\n')
chopper.write('This is the last write\n')
```
Output:

```
>> Got a new stream! <<
This write contains more than 30 bytes
>> Got a new stream! <<
This write contains less
This is the last write
```
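The chop point in this output follows from the byte counts of the three
writes. A quick self-contained check using Node's `Buffer.byteLength`
shows that the first write alone already exceeds the 30 byte limit, so
with `StreamChopper.overflow` it fills (and ends) the first output
stream by itself, while the remaining writes go to the second stream:

```js
// Byte sizes of the three writes from the example above.
const writes = [
  'This write contains more than 30 bytes\n',
  'This write contains less\n',
  'This is the last write\n'
]

console.log(writes.map(w => Buffer.byteLength(w))) // → [ 39, 25, 23 ]
```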
## API

### `new StreamChopper([options])`

Instantiate a StreamChopper instance. StreamChopper is a [writable]
stream.

Takes an optional `options` object which, besides the normal options
accepted by the [`Writable`][writable] class, accepts the following
config options:
- `size` - The maximum number of bytes that can be written to the
  chopper stream before a new output stream is emitted (default:
  `Infinity`)
- `time` - The maximum number of milliseconds that an output stream can
  be in use before a new output stream is emitted (default: `-1`, which
  means no limit)
- `type` - Change the algorithm used to determine how a written chunk
  that cannot fit into the current output stream should be handled. The
  following values are possible:
  - `StreamChopper.split` - Fit as much data from the chunk as possible
    into the current stream and write the remainder to the next stream
    (default)
  - `StreamChopper.overflow` - Allow the entire chunk to be written to
    the current stream. After writing, the stream is ended
  - `StreamChopper.underflow` - End the current output stream and write
    the entire chunk to the next stream
- `transform` - An optional function that returns a transform stream
  used for transforming the data in some way (e.g. a zlib Gzip stream).
  If used, the `size` option will count towards the size of the output
  chunks. This config option cannot be used together with the
  `StreamChopper.split` type

If `type` is `StreamChopper.underflow` and the size of the chunk to be
written is larger than `size`, an error is emitted.
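The underflow rule above reduces to a simple size check, sketched here
for illustration (this is not the library's source):

```js
// Illustrative check: with StreamChopper.underflow a chunk is never
// split and never allowed to exceed `size`, so a chunk larger than
// `size` can never be written to any output stream.
function canEverFit (chunkLength, size) {
  return chunkLength <= size
}

canEverFit(25, 30) // true  - fits in a fresh output stream
canEverFit(40, 30) // false - the chopper emits an error instead
```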
### Event: `stream`

Emitted every time a new output stream is ready. You must listen for
this event.

The listener function is called with two arguments:

- `stream` - A [readable] output stream
- `next` - A function you must call when you're ready to receive a new
  output stream. If called with an error, the `chopper` stream is
  destroyed
### `chopper.size`

The maximum number of bytes that can be written to the chopper stream
before a new output stream is emitted.

Use this property to override it with a new value. The new value will
take effect immediately on the current stream.
### `chopper.time`

The maximum number of milliseconds that an output stream can be in use
before a new output stream is emitted.

Use this property to override it with a new value. The new value will
take effect when the next stream is initialized. To change the current
timer, see `chopper.resetTimer()`.

Set to `-1` for no time limit.
### `chopper.type`

The algorithm used to determine how a written chunk that cannot fit into
the current output stream should be handled. The following values are
possible:

- `StreamChopper.split` - Fit as much data from the chunk as possible
  into the current stream and write the remainder to the next stream
- `StreamChopper.overflow` - Allow the entire chunk to be written to
  the current stream. After writing, the stream is ended
- `StreamChopper.underflow` - End the current output stream and write
  the entire chunk to the next stream

Use this property to override it with a new value. The new value will
take effect immediately on the current stream.
### `chopper.chop([callback])`

Manually chop the stream. Forces the current output stream to end even
if its size limit or time timeout hasn't been reached yet.

Arguments:

- `callback` - An optional callback which will be called once the output
  stream has ended
### `chopper.resetTimer([time])`

Use this function to reset the current timer (configured via the `time`
config option). Calling this function will force the current timer to
start over.

If the optional `time` argument is provided, this value is used as the
new time. This is equivalent to calling:

```js
chopper.time = time
chopper.resetTimer()
```

If the function is called with `time` set to `-1`, the current timer is
cancelled and the time limit is disabled for all future streams.
[writable]: https://nodejs.org/api/stream.html#stream_class_stream_writable
[readable]: https://nodejs.org/api/stream.html#stream_class_stream_readable