Source: sg.channelasia.tech
Google has announced TensorFlow Lite Model Maker, a tool for converting an existing TensorFlow model to the TensorFlow Lite format used to serve predictions on lightweight hardware such as mobile devices.
TensorFlow models can be quite large, and serving predictions remotely from beefy hardware capable of handling them isn’t always possible.
Google created the TensorFlow Lite model format to make it more efficient to serve predictions locally, but creating a TensorFlow Lite version of a model previously required some work.
In a blog post, Google described how TensorFlow Lite Model Maker adapts existing TensorFlow models to the Lite format with only a few lines of code.
The adaptation process uses one of a small number of task types to evaluate the model and generate a Lite version. The downside is that only two task types, image classification and text classification, are available right now, so models for other kinds of tasks (e.g., object detection) aren't yet supported.
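As a rough sketch of what "a few lines of code" looks like, the Model Maker image-classification quickstart follows the pattern below; exact import paths vary by tflite-model-maker version, and "flower_photos/" is a placeholder for a folder of images whose subdirectories name the classes.

```python
from tflite_model_maker import image_classifier
from tflite_model_maker.image_classifier import DataLoader

# Load images from a folder whose subdirectories are the class labels.
data = DataLoader.from_folder("flower_photos/")
train_data, test_data = data.split(0.9)

# Retrain a default on-device model on the new data (transfer learning).
model = image_classifier.create(train_data)

# Evaluate, then export a TensorFlow Lite model with embedded metadata.
loss, accuracy = model.evaluate(test_data)
model.export(export_dir=".")
```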
Google announced other TensorFlow Lite tools in the same post, including one that automatically generates platform-specific wrapper code for working with a given model.
Because hand-coding model wrappers can be error-prone, the tool generates the wrapper from metadata that Model Maker automatically embeds in the model. The tool is currently a pre-release beta and supports only Android, with plans to eventually integrate it into Android Studio.
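The wrapper generator keys off that embedded metadata rather than the raw model graph. As an illustration (not from the announcement itself), the tflite-support Python package can dump the metadata a Model Maker export carries; "model.tflite" below is a placeholder path.

```python
from tflite_support import metadata

# Inspect the metadata embedded in an exported model, such as input shape,
# normalization parameters, and bundled label files.
displayer = metadata.MetadataDisplayer.with_model_file("model.tflite")
print(displayer.get_metadata_json())                # model metadata as JSON
print(displayer.get_packed_associated_file_list())  # e.g. attached label files
```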