
Serving a TFLite Model

Explore how to serve TFLite models for image classification. Learn to preprocess images, load models with TensorFlow Interpreter, run inference, and interpret output probabilities to get top predictions.

We can use the TFLite model for inference on our datasets.

The tensorflow package ships with TFLite support in its tf.lite module, including the Interpreter class for running inference.
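As a minimal sketch of the Interpreter workflow (using a throwaway Keras model converted in memory, since the lesson's actual .tflite file isn't shown here), loading a model and running inference looks like this:

```python
import numpy as np
import tensorflow as tf

# Build a tiny placeholder Keras model so the example is self-contained;
# in practice you would load your own converted .tflite file instead.
model = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(8,))])

# Convert the model to TFLite flatbuffer bytes in memory.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Load the model with the TFLite Interpreter and allocate its tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run inference on a random input of the expected shape and dtype.
x = np.random.rand(1, 8).astype(input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], x)
interpreter.invoke()
logits = interpreter.get_tensor(output_details[0]['index'])
print('Raw model output (logits):', logits)
```

To load a saved model file instead, pass `model_path='model.tflite'` to the Interpreter constructor rather than `model_content`.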

Utility functions

We’ll need a softmax utility function to convert the model’s raw output tensor (logits) into probabilities. Implement it by entering the following code:

Python
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'  # set before importing tensorflow to silence its info/warning logs

import numpy as np
import tensorflow as tf

def softmax(vec):
    # Exponentiate each logit, then normalize so the outputs sum to 1.
    exponential = np.exp(vec)
    probabilities = exponential / np.sum(exponential)
    return probabilities

# Test the softmax function on a dummy logit vector.
dummy_vec = np.array([1.5630065, -0.24305986, -0.08382231, -0.4424621])
print('The output probabilities after softmax are:', softmax(dummy_vec))
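One caveat worth knowing: np.exp can overflow for large logits. A common numerically stable variant (a refinement beyond the lesson code, not required for it) subtracts the maximum logit before exponentiating, which leaves the result unchanged:

```python
import numpy as np

def stable_softmax(vec):
    # Subtracting the max shifts the largest exponent to exp(0) = 1,
    # preventing overflow; the normalization cancels the shift exactly.
    shifted = vec - np.max(vec)
    exponential = np.exp(shifted)
    return exponential / np.sum(exponential)

# Works even where the naive version would overflow.
print(stable_softmax(np.array([1000.0, 999.0, 998.0])))
```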

Note: ...