# TensorFlow Decision Forests support for TensorFlow.js
```sh
npm install @tensorflow/tfjs-tfdf
```

This package enables users to run arbitrary TensorFlow Decision Forests models
on the web that are converted using tfjs-converter.
Users can load a TFDF model from a URL, use TFJS tensors to set
the model's input data, run inference, and get the output back in TFJS tensors.
Under the hood, the TFDF C++ runtime is packaged in a set of WASM modules.
To use this package, you will need a TFJS backend installed.
You will also need to import @tensorflow/tfjs-core for
manipulating tensors, and @tensorflow/tfjs-converter for loading models.
```js
// Import @tensorflow/tfjs-core
import * as tf from '@tensorflow/tfjs-core';
// Adds the CPU backend.
import '@tensorflow/tfjs-backend-cpu';
// Import @tensorflow/tfjs-converter
import '@tensorflow/tfjs-converter';
// Import @tensorflow/tfjs-tfdf.
import * as tfdf from '@tensorflow/tfjs-tfdf';
```
By default, it will try to load the WASM modules from the same location where
the package or your own script is served. Use `setLocateFile` to set your own
location. See `src/tfdf_web_api_client.d.ts` for more details.
```js
// `base` is the URL to the main javascript file's directory.
// To return the default URL of the file use `${base}${path}`.
tfdf.setLocateFile((path, base) => {
  return `https://your-server/.../${path}`;
});
```
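To illustrate the resolver contract, here is a minimal standalone sketch of a `(path, base)` callback. The CDN URL and the `.wasm`-only filtering are assumptions for the example, not part of the package:

```javascript
// Hypothetical resolver: serve only the WASM assets from a CDN directory
// and leave every other requested file at its default location.
const cdnBase = 'https://cdn.example.com/tfdf-wasm/';

function locateFile(path, base) {
  // `path` is the file the runtime asks for; `base` is the default directory.
  return path.endsWith('.wasm') ? cdnBase + path : base + path;
}

console.log(locateFile('tfdf_inference.wasm', '/static/js/'));
// → "https://cdn.example.com/tfdf-wasm/tfdf_inference.wasm"
```

The same function could then be passed to `tfdf.setLocateFile(locateFile)`.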
### Load model

```js
const tfdfModel = await tfdf.loadTFDFModel('url/to/your/model.json');
```

### Run inference
```js
// Prepare input tensors.
const input = tf.tensor1d(['test', 'strings']);
// Run inference and get output tensors.
const outputTensor = await tfdfModel.executeAsync(input);
console.log(outputTensor.dataSync());
```

## Development

### Building
```sh
$ yarn
$ yarn build
```

### Testing
```sh
$ yarn test
```

### Deployment
```sh
$ yarn build-npm
```