# Summary: Understanding TensorFlow 2

Review what we've learned in this chapter.

## TensorFlow architecture

In this chapter, we took our first steps toward solving NLP tasks by understanding the primary underlying platform (TensorFlow) on which we’ll be implementing our algorithms. First, we discussed the underlying details of the TensorFlow architecture. Next, we discussed the essential ingredients of a meaningful TensorFlow program. We got to know some new features in TensorFlow 2, such as the AutoGraph feature, in depth. We then discussed more exciting elements of TensorFlow, such as data pipelines and various TensorFlow operations.

Specifically, we discussed the TensorFlow architecture by lining up the explanation with an example TensorFlow program: the sigmoid example. In this TensorFlow program, we used the AutoGraph feature to generate a TensorFlow graph—that is, by applying the `tf.function` decorator to the function that performs the TensorFlow operations. Then, a `GraphDef` object was created to represent the graph and sent to the distributed master. The distributed master looked at the graph, decided which components to use for the relevant computation, and divided it into several subgraphs to make the computations faster. Finally, workers executed the subgraphs and returned the results.
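As a reminder of how this looks in code, here is a minimal sketch of the sigmoid example (the function name and toy values are illustrative, not the chapter's exact code). Decorating the function with `tf.function` tells AutoGraph to trace it and build a TensorFlow graph:

```python
import tensorflow as tf

# tf.function traces this Python function and compiles it into a
# TensorFlow graph via AutoGraph.
@tf.function
def sigmoid_layer(x, w, b):
    # Compute sigmoid(x @ w + b) as graph operations
    return tf.nn.sigmoid(tf.matmul(x, w) + b)

x = tf.constant([[0.5, 0.2]])    # 1x2 input
w = tf.zeros([2, 1])             # 2x1 weights (zeros for illustration)
b = tf.zeros([1])                # bias
y = sigmoid_layer(x, w, b)
print(y.numpy())                 # sigmoid(0) == 0.5
```

Calling `sigmoid_layer` the first time triggers the tracing step; subsequent calls with the same input signature reuse the compiled graph.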

### TensorFlow client

Next, we discussed the various elements that comprise a typical TensorFlow client: inputs, variables, outputs, and operations. Inputs are the data we feed to the algorithm for training and testing purposes. We discussed three different ways of feeding inputs: using NumPy arrays, preloading data as TensorFlow tensors, and using `tf.data` to define an input pipeline. Then we discussed TensorFlow variables, how they differ from other tensors, and how to create and initialize them. Following this, we discussed how variables can be used to create intermediate and terminal outputs.
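The pieces above can be sketched together in a few lines (the data and names here are illustrative): the three ways of feeding inputs, a variable as mutable state, and an intermediate output produced from them:

```python
import numpy as np
import tensorflow as tf

# 1. Input as a NumPy array
x_np = np.array([[1.0, 2.0], [3.0, 4.0]], dtype=np.float32)

# 2. Input preloaded as a TensorFlow tensor
x_tensor = tf.constant(x_np)

# 3. Input via a tf.data pipeline: slice rows, then batch them
dataset = tf.data.Dataset.from_tensor_slices(x_np).batch(2)

# A variable: mutable state that persists across calls,
# unlike constant tensors
w = tf.Variable(tf.ones([2, 1]))

for batch in dataset:
    out = tf.matmul(batch, w)   # an intermediate output
print(out.numpy())              # [[3.], [7.]]
```

Unlike `tf.constant`, a `tf.Variable` can be updated in place (e.g., with `w.assign(...)`), which is what makes it suitable for holding trainable model parameters.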

### TensorFlow operations

Finally, we discussed several available TensorFlow operations, including mathematical operations, matrix operations, and neural network-related operations that we’ll use later.
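A brief illustration of one operation from each of those three families (the values are arbitrary examples, not the chapter's code):

```python
import tensorflow as tf

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
eye = tf.constant([[1.0, 0.0], [0.0, 1.0]])  # 2x2 identity

# Mathematical operation: element-wise addition
added = a + eye

# Matrix operation: matrix multiplication
# (multiplying by the identity returns a unchanged)
product = tf.matmul(a, eye)

# Neural-network-related operation: ReLU activation,
# which zeroes out negative values
activated = tf.nn.relu(tf.constant([-1.0, 2.0]))
print(activated.numpy())  # [0. 2.]
```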
