# torch-typescript

PyTorch/libtorch compiled to WebAssembly for TypeScript.
This package provides TypeScript bindings for PyTorch tensor operations compiled to WebAssembly using Emscripten. It works in both Node.js and browser environments.
## Features

- Tensor creation (`ones`, `zeros`, from data)
- Basic tensor operations (`add`, `mul`, `matmul`)
- TorchScript model loading and inference
- Works in Node.js and browsers via WebAssembly
## Installation

```bash
npm install torch-typescript
```

or

```bash
pnpm add torch-typescript
```
## Usage

```typescript
import { loadModule, ones, zeros, tensor, DType } from 'torch-typescript';

// Initialize the WASM module
await loadModule();

// Create tensors
const a = await ones([2, 3]);
const b = await zeros([2, 3]);
const c = await tensor([1, 2, 3, 4, 5, 6], [2, 3]);

// Tensor operations
const sum = a.add(b);
const product = a.mul(c);

// Matrix multiplication
const x = await ones([2, 3]);
const y = await ones([3, 4]);
const result = x.matmul(y);

// Get tensor data
console.log(result.shape); // [2, 4]
console.log(result.toArray());

// Free tensors when done
a.free();
b.free();
c.free();
sum.free();
product.free();
x.free();
y.free();
result.free();
```
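Because tensors are backed by WASM linear memory rather than the JavaScript heap, a forgotten `free()` leaks memory for the lifetime of the module. A small helper can guarantee cleanup even when an operation throws; this is an illustrative sketch, not part of the package API (`withFreed` and `Freeable` are hypothetical names, only `free()` comes from the package):

```typescript
// Anything with a free() method, e.g. this package's Tensor and Module
interface Freeable {
  free(): void;
}

// Run a computation, then free every listed resource — even on error.
// Hypothetical helper, not part of torch-typescript.
async function withFreed<R>(
  resources: Freeable[],
  fn: () => R | Promise<R>
): Promise<R> {
  try {
    return await fn();
  } finally {
    // Free in reverse order of creation
    for (let i = resources.length - 1; i >= 0; i--) {
      resources[i].free();
    }
  }
}
```

With such a helper, the long run of `free()` calls above could collapse to something like `await withFreed([a, b, sum], () => sum.toArray())`, with intermediate tensors added to the list as they are created.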
### Model inference

```typescript
import { loadModule, load, tensor } from 'torch-typescript';

await loadModule();

// Load a TorchScript model
const model = await load('model.pt');

// Create input tensor
const input = await tensor([/* your input data */], [1, 3, 224, 224]);

// Run inference
const output = model.forward(input);
console.log(output.shape);
console.log(output.toArray());

// Clean up
input.free();
output.free();
model.free();
```
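The `[1, 3, 224, 224]` input shape above is the planar CHW layout typical of image models. As an illustrative sketch of how to fill it, here is one way to pack interleaved RGBA pixels (e.g. from canvas `ImageData`) into that layout; `rgbaToCHW` is a hypothetical helper, not part of the package, and the ImageNet mean/std constants are an assumption about your model's preprocessing:

```typescript
// Convert interleaved RGBA pixel data into a planar, normalized CHW
// Float32Array suitable for a [1, 3, height, width] input tensor.
// Mean/std are the common ImageNet values — adjust for your model.
function rgbaToCHW(
  pixels: Uint8ClampedArray,
  width: number,
  height: number
): Float32Array {
  const mean = [0.485, 0.456, 0.406];
  const std = [0.229, 0.224, 0.225];
  const plane = width * height;
  const out = new Float32Array(3 * plane);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const px = (y * width + x) * 4; // RGBA stride is 4 bytes per pixel
      for (let c = 0; c < 3; c++) {
        // Scale to [0, 1], then normalize per channel
        out[c * plane + y * width + x] = (pixels[px + c] / 255 - mean[c]) / std[c];
      }
    }
  }
  return out;
}
```

The resulting `Float32Array` can be passed directly to `tensor(data, [1, 3, height, width])`.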
## Building from Source

Prerequisites:

- Docker
- Node.js 18+
- pnpm
1. Build the Docker image with Emscripten and PyTorch:

   ```bash
   pnpm wasm:docker-build
   ```

2. Build the WASM module:

   ```bash
   pnpm wasm:build
   ```

3. Build the TypeScript wrapper:

   ```bash
   pnpm build:js
   ```

Or run all steps at once:

```bash
pnpm build
```

Note: The initial Docker build is slow, as it compiles PyTorch from source.
## API

- `loadModule(): Promise<void>` - Initialize the WASM module (must be called before any other operation)
- `version(): Promise<string>` - Get the PyTorch version
- `ones(shape: number[], dtype?: DType): Promise<Tensor>` - Create a tensor filled with ones
- `zeros(shape: number[], dtype?: DType): Promise<Tensor>` - Create a tensor filled with zeros
- `tensor(data: number[] | Float32Array, shape: number[]): Promise<Tensor>` - Create a tensor from data
- `load(filename: string): Promise<Module>` - Load a TorchScript model
### Tensor

- `shape: number[]` - Tensor dimensions
- `numel: number` - Number of elements
- `toArray(): Float32Array` - Get tensor data
- `add(other: Tensor): Tensor` - Element-wise addition
- `mul(other: Tensor): Tensor` - Element-wise multiplication
- `matmul(other: Tensor): Tensor` - Matrix multiplication
- `free(): void` - Free tensor memory
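`matmul` follows standard matrix-multiplication shape rules: the inner dimensions must agree, and the result takes the outer dimensions. A small illustrative helper (hypothetical, not part of the package) makes the rule explicit for 2-D tensors:

```typescript
// Given two 2-D shapes, validate matmul compatibility and return the
// result shape. Illustrative only — not part of torch-typescript.
function matmulShape(a: number[], b: number[]): number[] {
  if (a.length !== 2 || b.length !== 2) {
    throw new Error("expected 2-D shapes");
  }
  if (a[1] !== b[0]) {
    throw new Error(`inner dimensions differ: ${a[1]} vs ${b[0]}`);
  }
  return [a[0], b[1]]; // e.g. [2, 3] x [3, 4] -> [2, 4]
}
```

This mirrors the usage example above, where a `[2, 3]` tensor multiplied by a `[3, 4]` tensor yields shape `[2, 4]`.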
### Module

- `forward(input: Tensor): Tensor` - Run model inference
- `free(): void` - Free module memory
### DType

- `DType.Float32` - 32-bit floating point
- `DType.Float64` - 64-bit floating point
- `DType.Int32` - 32-bit integer
- `DType.Int64` - 64-bit integer
## License

MIT
## Credits

Based on:
- libtorch-wasm - PyTorch WASM compilation
- libpg-query-node - Package structure patterns