Serving a TFLite Model
Learn to use a TFLite model for inference.
We can use the TFLite model for inference on our datasets.
The tensorflow package supports TFLite through the tf.lite.Interpreter class, which provides built-in functions for loading a converted model and running inference.
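For reference, here is a minimal sketch of what inference with tf.lite.Interpreter typically looks like. The file name model.tflite and the random dummy input are placeholders for illustration; in practice you would load your own converted model and feed preprocessed samples from your dataset.
Python
import numpy as np
import tensorflow as tf

# Load the TFLite model and allocate its input/output tensors.
# 'model.tflite' is a placeholder path for your converted model file.
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

# Inspect the expected input and output tensor metadata.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy input with the expected shape and dtype, then run inference.
input_shape = input_details[0]['shape']
input_data = np.random.random_sample(input_shape).astype(input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()

# Read the raw model output (logits for a classifier).
output_data = interpreter.get_tensor(output_details[0]['index'])
print('Raw model output:', output_data)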
Utility functions
We’ll need the softmax utility function to convert the model’s raw output tensor (logits) into probabilities. Implement it by entering the following code:
Python
import os

# Setting this before importing TensorFlow suppresses its C++ info/warning logs.
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

import numpy as np
import tensorflow as tf


def softmax(vec):
    # Exponentiate the logits, then normalize so the values sum to 1.
    exponential = np.exp(vec)
    probabilities = exponential / np.sum(exponential)
    return probabilities


# Test the softmax function on a dummy logits vector.
dummy_vec = np.array([1.5630065, -0.24305986, -0.08382231, -0.4424621])
print('The output probabilities after softmax are:', softmax(dummy_vec))
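As a side note, np.exp can overflow when logits are very large. A common, optional refinement (not part of the lesson's code) is to subtract the maximum logit before exponentiating, which leaves the resulting probabilities unchanged:
Python
import numpy as np


def stable_softmax(vec):
    # Shifting by the max keeps the exponents in a safe numerical range
    # without changing the normalized probabilities.
    shifted = vec - np.max(vec)
    exponential = np.exp(shifted)
    return exponential / np.sum(exponential)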
Note: ...