# extended-buffer

Extended Buffer for Node.js.

ExtendedBuffer is a growable binary buffer built on top of the Node.js Buffer.
It keeps an internal read pointer (similar to a stream cursor) and supports appending data at the end or prepending data at the start.

---

## Installation

```bash
npm install extended-buffer
```

---
## Browser usage (bundlers)
ExtendedBuffer works in browsers as long as a Buffer polyfill is available. Most bundlers can use the `buffer`
package as a drop-in implementation. Install the polyfill:
```bash
npm install buffer
```

ExtendedBuffer imports Buffer from the `buffer` module, so a global Buffer is usually not required.
If your tooling (or other dependencies) expects a global Buffer, add a small shim in your app entry:

```ts
import { Buffer } from "buffer";
import { getGlobalContext } from "extended-buffer";

const globalScope = getGlobalContext();
if (globalScope && !(globalScope as any).Buffer) {
  (globalScope as any).Buffer = Buffer;
}
```

### Webpack

```js
// webpack.config.js
const webpack = require("webpack");

module.exports = {
  resolve: {
    fallback: {
      buffer: require.resolve("buffer/")
    }
  },
  plugins: [
    new webpack.ProvidePlugin({
      Buffer: ["buffer", "Buffer"]
    })
  ]
};
```

### Vite

```ts
// vite.config.ts
import { defineConfig } from "vite";

export default defineConfig({
  resolve: {
    alias: {
      buffer: "buffer/"
    }
  },
  optimizeDeps: {
    include: ["buffer"]
  }
});
```

### Rollup

```js
// rollup.config.js
import resolve from "@rollup/plugin-node-resolve";
import commonjs from "@rollup/plugin-commonjs";
import inject from "@rollup/plugin-inject";

export default {
  plugins: [
    resolve({ browser: true, preferBuiltins: false }),
    commonjs(),
    inject({
      Buffer: ["buffer", "Buffer"]
    })
  ]
};
```

Notes:
- BigInt read/write methods require a polyfill that supports Node’s BigInt Buffer APIs.
- If your tooling already provides a Buffer global, you can skip the shim/inject steps.
---
## Quick start

```ts
import { ExtendedBuffer } from 'extended-buffer';

const b = new ExtendedBuffer();
b.writeString("OK");            // append
b.writeUInt16BE(1337);          // append
console.log(b.readString(2));   // "OK"
console.log(b.readUInt16BE());  // 1337
```

---
## Core concepts

### Length and read pointer
The buffer stores a contiguous region of bytes. A separate read pointer tracks how many bytes were already consumed.
- `length` — total stored bytes (including already-read bytes).
- `getReadableSize()` — unread bytes remaining.
- `pointer` / `getPointer()` — current read pointer (0…`length`).
- `nativePointer()` — absolute index inside the underlying Buffer for the next read.

### Views
- `nativeBufferView` — a Buffer view of all stored bytes (from the start of stored data to the end).
- `bufferView` — a new ExtendedBuffer instance that maps to the same bytes as `nativeBufferView` (zero-copy).
  The new instance starts with `pointer = 0`, so you can read/parse without consuming the parent buffer.

If you need only the unread bytes, you can derive them:
```ts
const unread = b.nativeBufferView.subarray(b.pointer);
```

Example: parse without touching the parent pointer:

```ts
import { ExtendedBuffer } from 'extended-buffer';

const b = new ExtendedBuffer();
b.writeString("OK");
b.writeUInt16BE(1337);
const v = b.bufferView;
console.log(v.readString(2)); // "OK"
console.log(v.readUInt16BE()); // 1337
console.log(b.pointer); // 0 (parent is untouched)
```

Notes:
- `bufferView` shares memory with the parent. In-place mutations (e.g. `v.nativeBufferView[0] = 0xff`) will be visible in both.
- The view usually has no spare head/tail capacity, so calling `v.write*()` will likely trigger a reallocation (copy), after which the two instances diverge.
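This zero-copy sharing mirrors Node's own `Buffer.subarray`. A minimal sketch with plain Buffers (no ExtendedBuffer involved) shows the difference between a shared view and an independent copy:

```typescript
import { Buffer } from "node:buffer";

// A subarray is a view over the same memory, not a copy.
const parent = Buffer.from([0x4f, 0x4b, 0x05, 0x39]);
const view = parent.subarray(2); // bytes [0x05, 0x39]

// Mutating the view is visible through the parent (shared memory).
view[0] = 0xff;
const sharedByte = parent[2]; // 0xff

// Buffer.from(parent), by contrast, makes an independent copy.
const copy = Buffer.from(parent);
copy[0] = 0x00;
const parentFirst = parent[0]; // still 0x4f
```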
---
## Construction and options

```ts
type ExtendedBufferOptions = {
  capacity?: number;           // initial native buffer size (bytes)
  capacityStep?: number;       // how much to grow when resizing
  nativeAllocSlow?: boolean;   // use Buffer.allocUnsafeSlow() for the initial allocation
  nativeReallocSlow?: boolean; // use Buffer.allocUnsafeSlow() for subsequent reallocations
  initNativeBuffer?: Buffer;   // use an existing Buffer as the initial native buffer (no copy)
  unsafeMode?: boolean;        // skip most runtime asserts inside ExtendedBuffer (for performance)
};
```

Default values:

- `capacity`: 16 * 1024 bytes (16 KiB)
- `capacityStep`: 16 * 1024 bytes (same as the default capacity; if you override `capacity` and want the same step, set `capacityStep` explicitly)
- `nativeAllocSlow`: false
- `nativeReallocSlow`: false

Example:

```ts
const b = new ExtendedBuffer({
  capacity: 1024 * 1024,
  capacityStep: 1024 * 1024,
  nativeAllocSlow: true,
  nativeReallocSlow: true,
  unsafeMode: false
});
```
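Why `capacityStep` matters: growing in fixed steps keeps reallocations rare and predictable. A rough sketch of step-based rounding (illustrative only, not necessarily the library's exact growth policy):

```typescript
// Round the required size up to the next multiple of capacityStep,
// so repeated small appends don't each trigger a reallocation.
function nextCapacity(needed: number, capacityStep: number): number {
  return Math.ceil(needed / capacityStep) * capacityStep;
}

const step = 16 * 1024;
const small = nextCapacity(100, step);           // 16384
const large = nextCapacity(16 * 1024 + 1, step); // 32768
```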
### unsafeMode

unsafeMode disables internal `assert*` validations inside ExtendedBuffer (and skips `assertInstanceState()`).
It is useful in hot paths when all parameters are controlled by your code, but it can make misuse fail later (or in less obvious ways).

Notes:
- Range checks like `SIZE_OUT_OF_RANGE` / `POINTER_OUT_OF_RANGE` are still enforced.
- unsafeMode is inherited by derived instances created via `bufferView` and `readBuffer(...)` (unless overridden via options).

### Zero-copy parsing with initNativeBuffer
If you already have a Node.js `Buffer` (from a socket, file, etc.) and want to parse it with ExtendedBuffer without copying, pass it as `initNativeBuffer`.

- The buffer is not copied — it becomes the internal `_nativeBuffer`.
- The instance starts with `pointer = 0` and `length = initNativeBuffer.length`.

```ts
import { ExtendedBuffer } from 'extended-buffer';

const packet = Buffer.from([0x00, 0x02, 0x4f, 0x4b]); // 2, "OK"
const b = new ExtendedBuffer({ initNativeBuffer: packet });
const len = b.readUInt16BE();
console.log(b.readString(len)); // "OK"
```

You can also reuse an instance:

```ts
b.initExtendedBuffer(packet);
```

Note: when you construct from `initNativeBuffer`, the buffer is treated as already filled.
If you later call `write*()`, it will typically require a reallocation (copy) to make room.

---
## Writing data

Most write methods accept an optional `unshift?: boolean`:

- `unshift = false` (default): append to the end
- `unshift = true`: prepend to the start

### Buffers and strings
```ts
b.writeNativeBuffer(Buffer.from([1, 2, 3]));
b.writeBuffer(Buffer.from([4, 5, 6])); // alias that also accepts ExtendedBuffer
b.writeString("hello", "utf8");
```

Prepend example:

```ts
b.writeString("payload");
b.writeUInt16BE(7, true); // prepend length/header
```
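The prepend-a-header pattern can be sketched with plain Buffers. `Buffer.concat` copies both pieces, whereas ExtendedBuffer's `unshift` writes into spare head capacity when it is available:

```typescript
import { Buffer } from "node:buffer";

// Build a length-prefixed frame: write the payload first,
// then put a 2-byte big-endian length in front of it.
const payload = Buffer.from("payload", "utf8");
const header = Buffer.alloc(2);
header.writeUInt16BE(payload.length, 0);
const frame = Buffer.concat([header, payload]);

// Decoding reverses the steps.
const decodedLen = frame.readUInt16BE(0);               // 7
const decodedBody = frame.subarray(2).toString("utf8"); // "payload"
```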
### Integers

Variable-width (size must be 1…6 bytes):

```ts
b.writeIntBE(-10, 3);
b.writeUIntLE(5000, 4);
```
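Node's own `Buffer` exposes the same variable-width forms (1…6 bytes), which is what these methods build on; the difference is that plain Buffers take explicit offsets:

```typescript
import { Buffer } from "node:buffer";

const buf = Buffer.alloc(7);
buf.writeIntBE(-10, 0, 3);   // signed, 3 bytes, big-endian, at offset 0
buf.writeUIntLE(5000, 3, 4); // unsigned, 4 bytes, little-endian, at offset 3

const signedBack = buf.readIntBE(0, 3);    // -10
const unsignedBack = buf.readUIntLE(3, 4); // 5000
```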
Fixed-width helpers:

- `writeInt8`, `writeUInt8`
- `writeInt16BE`, `writeInt16LE`, `writeUInt16BE`, `writeUInt16LE`
- `writeInt32BE`, `writeInt32LE`, `writeUInt32BE`, `writeUInt32LE`
### 64-bit integers (BigInt)

If your runtime supports BigInt and Node's `Buffer.readBig*` / `Buffer.writeBig*` APIs, you can read/write 64-bit integers as bigint values (always 8 bytes):

- `writeBigInt64BE`, `writeBigInt64LE` — signed 64-bit
- `writeBigUInt64BE`, `writeBigUInt64LE` — unsigned 64-bit

```ts
import { ExtendedBuffer } from 'extended-buffer';

if (typeof BigInt === 'function') {
  const b = new ExtendedBuffer();
  const twoPow63 = BigInt('9223372036854775808'); // 2^63
  b.writeBigUInt64BE(twoPow63);
  b.writeBigInt64LE(BigInt(-42));
  b.setPointer(0);
  console.log(String(b.readBigUInt64BE())); // "9223372036854775808"
  console.log(String(b.readBigInt64LE()));  // "-42"
}
```

If BigInt is not supported, these methods throw `ExtendedBufferUnsupportedError('EXECUTION_ENVIRONMENT_NOT_SUPPORT_BIG_INT')`.
### Floating point

- `writeFloatBE`, `writeFloatLE` (4 bytes)
- `writeDoubleBE`, `writeDoubleLE` (8 bytes)

---
## Reading data

All `read*` methods advance the internal read pointer (consume bytes).
If there aren’t enough readable bytes, they throw `ExtendedBufferRangeError('SIZE_OUT_OF_RANGE')`.

### Checking readability

```ts
if (b.isReadable(4)) {
  const x = b.readUInt32BE();
}
```
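The guard can be sketched with a plain Buffer and a manual cursor; `isReadable(size)` plays the same role against ExtendedBuffer's internal pointer:

```typescript
import { Buffer } from "node:buffer";

const buf = Buffer.from([0x00, 0x00, 0x05, 0x39]);
let cursor = 0;

// True when at least `size` unread bytes remain.
function readable(size: number): boolean {
  return buf.length - cursor >= size;
}

let value: number | null = null;
if (readable(4)) {
  value = buf.readUInt32BE(cursor); // 1337
  cursor += 4;
}
const moreLeft = readable(1); // false, everything consumed
```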
### Buffers

```ts
// Copy out as a native Buffer
const chunk: Buffer = b.readBuffer(10, true);

// Copy out as a new ExtendedBuffer (same capacity/capacityStep/nativeAllocSlow/nativeReallocSlow/unsafeMode by default)
const eb: ExtendedBuffer = b.readBuffer(10);
```

### Strings

```ts
const s = b.readString(5, "utf8");
```

### Integers

Variable-width (size 1…6 bytes):

```ts
const a = b.readIntBE(3);
const u = b.readUIntLE(4);
```

Fixed-width helpers:

- `readInt8`, `readUInt8`
- `readInt16BE`, `readInt16LE`, `readUInt16BE`, `readUInt16LE`
- `readInt32BE`, `readInt32LE`, `readUInt32BE`, `readUInt32LE`

### 64-bit integers (BigInt)
- `readBigInt64BE`, `readBigInt64LE` — signed 64-bit
- `readBigUInt64BE`, `readBigUInt64LE` — unsigned 64-bit

```ts
const b = new ExtendedBuffer();

if (typeof BigInt === 'function') {
  b.writeBigInt64BE(BigInt(-1));
  b.writeBigUInt64BE(BigInt('18446744073709551615')); // 2^64 - 1
  b.setPointer(0);
  console.log(String(b.readBigInt64BE()));  // "-1"
  console.log(String(b.readBigUInt64BE())); // "18446744073709551615"
}
```

Note: Node's `Buffer` will throw a native RangeError if the value doesn't fit into the signed/unsigned 64-bit range.

### Floating point
- `readFloatBE`, `readFloatLE`
- `readDoubleBE`, `readDoubleLE`

---
## Pointer control (peeking / rewinding)

### Save and restore the pointer

```ts
const p = b.pointer;
const header = b.readUInt16BE();

// decide what to do...
b.setPointer(p); // rewind back to before header
```

### Relative offset

```ts
b.offset(4); // skip 4 bytes
b.offset(-2); // go back 2 bytes (must stay within 0…length)
```

If you try to set the pointer outside `[0, length]`, it throws `ExtendedBufferRangeError('POINTER_OUT_OF_RANGE')`.

---
## Transactions (atomic changes)
Sometimes you want to perform a multi-step read/write and either:
- commit everything if it succeeds, or
- rollback the buffer to the exact previous state if something fails.
`ExtendedBuffer.transaction()` wraps your code in a transaction:

```ts
const result = b.transaction(() => {
  // any reads/writes/offsets/etc.
  return 123;
});
```

Rules:
- If the callback returns normally, changes are kept (committed).
- If the callback throws, the buffer is restored (rolled back) and the error is re-thrown.
- The callback must be synchronous (returned Promises are not awaited).
- Transactions are re-entrant: nested `transaction()` calls do not create extra snapshots.

What gets rolled back:
- stored payload bytes
- `pointer` (read pointer)
- internal start/end offsets and the original native Buffer (even if the buffer was reallocated during the callback)
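Conceptually, the rollback is snapshot-and-restore. A minimal sketch with a plain object (names here are illustrative, not the library's internals):

```typescript
// Hypothetical stand-in for the buffer's rollback-relevant state.
type State = { bytes: number[]; pointer: number };

function withTransaction<T>(state: State, fn: () => T): T {
  // Snapshot payload and read pointer before running the callback.
  const snapshot: State = { bytes: [...state.bytes], pointer: state.pointer };
  try {
    return fn(); // normal return: keep the changes (commit)
  } catch (err) {
    state.bytes = snapshot.bytes;     // roll back payload
    state.pointer = snapshot.pointer; // roll back read pointer
    throw err;                        // re-throw for the caller
  }
}

const s: State = { bytes: [1, 2, 3], pointer: 0 };
let rolledBack = false;
try {
  withTransaction(s, () => {
    s.pointer = 2;
    s.bytes.push(4);
    throw new Error("abort");
  });
} catch {
  rolledBack = true;
}
// s is restored: pointer 0, bytes [1, 2, 3]
```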
### Example: incomplete frames

This is useful for protocols where you might receive partial data and want to retry later.

```ts
import { ExtendedBuffer } from 'extended-buffer';

function tryReadFrame(b: ExtendedBuffer): Buffer | null {
  try {
    return b.transaction(() => {
      // (1) read header
      const len = b.readUInt16BE();
      // (2) not enough bytes yet -> rollback and let the caller wait for more data
      if (!b.isReadable(len)) {
        throw new Error('INCOMPLETE_FRAME');
      }
      // (3) success -> commit
      return b.readBuffer(len, true);
    });
  } catch {
    return null;
  }
}
```

### Example: validating a header

```ts
b.transaction(() => {
  const magic = b.readUInt32BE();
  if (magic !== 0xdeadbeef) {
    throw new Error('BAD_MAGIC');
  }

  const version = b.readUInt8();
  if (version !== 1) {
    throw new Error('UNSUPPORTED_VERSION');
  }
});
```

### Performance note

`transaction()` snapshots the current payload (it copies the stored bytes) before running the callback.
That makes rollbacks safe, but it can be expensive for very large buffers. Use it for small/medium payloads,
or when the safety and ergonomics are worth the extra copy.

---
## Memory management

### discardReadData()

If you continuously read from the buffer, you can drop the consumed prefix:
```ts
b.discardReadData();
```

This moves the internal start forward by the number of read bytes and resets `pointer` to 0.
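The effect can be sketched with a plain Buffer and cursor; ExtendedBuffer moves an internal start index instead, so no bytes are copied:

```typescript
import { Buffer } from "node:buffer";

let data = Buffer.from([1, 2, 3, 4, 5]);
let pointer = 3; // three bytes already consumed

data = data.subarray(pointer); // keep only the unread bytes
pointer = 0;                   // reset the read pointer

const remaining = Array.from(data); // the unread tail
```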
### gc()

```ts
b.gc();
```

`gc()` first discards read data, then may shrink the underlying native Buffer
when free space exceeds `capacityStep`.

### clean()
```ts
b.clean(); // alias for initExtendedBuffer()
```

---
## Errors
The library defines these error classes:

- `ExtendedBufferError`
- `ExtendedBufferTypeError`
- `ExtendedBufferRangeError`
- `ExtendedBufferUnsupportedError`

Common error codes you may see:
- `SIZE_OUT_OF_RANGE`: reading more bytes than available
- `POINTER_OUT_OF_RANGE`: setting the pointer outside 0…length
- `INVALID_INTEGER_SIZE_VALUE_TYPE`: size is not a safe integer
- `INVALID_INTEGER_SIZE_VALUE_RANGE`: integer size not in 1…6
- `INVALID_INSTANCE_STATE`: internal invariant check failed
- `INVALID_BUFFER_TYPE`: attempt to write an invalid buffer type
- `VALUE_MUST_BE_AN_INTEGER`: value is not a safe integer
- `VALUE_MUST_BE_AN_UNSIGNED_INTEGER`: value is not a safe integer, or is less than 0
- `VALUE_MUST_BE_AN_BIG_INTEGER`: value is not a bigint
- `VALUE_MUST_BE_AN_UNSIGNED_BIG_INTEGER`: value is not a bigint, or is less than 0
- `EXECUTION_ENVIRONMENT_NOT_SUPPORT_BIG_INT`: BigInt methods are not supported in the current runtime
- `EXCEEDING_MAXIMUM_BUFFER_SIZE`: allocation exceeds Node’s maximum buffer size (`kMaxLength`, or `os.totalmem()` when available)

---
## Caveats

### Prepending after reads

`unshift = true` prepends bytes by moving the internal start pointer, but the read pointer is not adjusted automatically.
If you prepend after consuming bytes, you may get surprising results (e.g., some previously read bytes can become readable again, or newly prepended bytes may be skipped).

A safe pattern is:
```ts
b.discardReadData();
b.writeUInt16BE(123, true);
```

### nodeGc()

`nodeGc()` calls `gc()` on the detected global object if it exists. In Node.js it requires starting the process with `--expose-gc`.
In browsers/non-Node runtimes it simply no-ops.

---
## Reference: full public API (names)

Properties:

- `length`, `capacity`, `pointer`, `nativeBufferView`, `bufferView`

Core:

- `initExtendedBuffer(initNativeBuffer?)`, `assertInstanceState()`, `clean()`
- `setUnsafeMode(unsafeMode)`
- `nativePointer()`, `getWritableSizeStart()`, `getWritableSizeEnd()`, `getWritableSize()`, `getReadableSize()`
- `transaction(callback)`
- `allocStart(size)`, `allocEnd(size)`
- `writeNativeBuffer(buf, unshift?)`, `writeBuffer(bufOrEB, unshift?)`, `writeString(str, enc?, unshift?)`
- Pointer: `setPointer(p)`, `getPointer()`, `offset(n)`, `isReadable(size)`
- Maintenance: `discardReadData()`, `gc()`, `nodeGc()`

Numbers:
- Write:
  `writeIntBE/LE`, `writeUIntBE/LE`, `writeInt8`, `writeUInt8`,
  `writeInt16BE/LE`, `writeUInt16BE/LE`, `writeInt32BE/LE`, `writeUInt32BE/LE`,
  `writeBigInt64BE/LE`, `writeBigUInt64BE/LE`,
  `writeFloatBE/LE`, `writeDoubleBE/LE`
- Read: `readBuffer`, `readString`,
  `readIntBE/LE`, `readUIntBE/LE`, `readInt8`, `readUInt8`,
  `readInt16BE/LE`, `readUInt16BE/LE`, `readInt32BE/LE`, `readUInt32BE/LE`,
  `readBigInt64BE/LE`, `readBigUInt64BE/LE`,
  `readFloatBE/LE`, `readDoubleBE/LE`

---

## License

MIT