# Pretrained MobileNet in TensorFlow.js
MobileNets are small, low-latency, low-power models parameterized to meet the resource constraints of a variety of use cases. They can be built upon for classification, detection, embeddings, and segmentation, similar to how other popular large-scale models, such as Inception, are used. MobileNets trade off between latency, size, and accuracy while comparing favorably with popular models from the literature.

To install via NPM:

```sh
npm install @tensorflow-models/mobilenet
```
This TensorFlow.js model does not require you to know about machine learning.
It can take as input any browser-based image element (`<img>`, `<video>`, or
`<canvas>` elements, for example) and returns an array of the most likely
predictions and their confidences.
For more information about MobileNet, check out the MobileNet readme in the
[tensorflow/models](https://github.com/tensorflow/models) repository.
There are two main ways to get this model in your JavaScript project: via script tags or by installing it from NPM and using a build tool like Parcel, WebPack, or Rollup.
Via script tags:

```html
<!-- Load TensorFlow.js. This is required to use MobileNet. -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<!-- Load the MobileNet model. -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/mobilenet"></script>

<!-- Replace this with your image. Make sure CORS settings allow reading it! -->
<img id="img" src="cat.jpg"/>

<script>
  // 'mobilenet' is available as a global because of the script tags above.
  const img = document.getElementById('img');

  // Load the model and classify the image.
  mobilenet.load()
    .then(model => model.classify(img))
    .then(predictions => {
      console.log('Predictions: ');
      console.log(predictions);
    });
</script>
```
Via NPM:

```js
// Note: you do not need to import @tensorflow/tfjs here.
const mobilenet = require('@tensorflow-models/mobilenet');

const img = document.getElementById('img');

// Wrap in an async function: top-level await is not available
// in a CommonJS script.
async function run() {
  // Load the model.
  const model = await mobilenet.load();

  // Classify the image.
  const predictions = await model.classify(img);

  console.log('Predictions: ');
  console.log(predictions);
}

run();
```
#### Loading the model
`mobilenet` is the module name, which is automatically included as a global when you use the `<script src>` method. When using a bundler, it is the module you import from `@tensorflow-models/mobilenet`.
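As a sketch of the import-based path (assuming the 2.x package API, in which `load` accepts an optional config object), `version` and `alpha` select which pretrained MobileNet checkpoint is fetched; smaller `alpha` values give smaller, faster, less accurate models:

```js
import * as mobilenet from '@tensorflow-models/mobilenet';

// Load a specific variant. The config-object form shown here is the
// 2.x API; check the package README for the options your version supports.
const model = await mobilenet.load({ version: 2, alpha: 0.5 });
```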