Summary: Recurrent Neural Networks
Review what we've learned in this chapter.
In this chapter, we looked at RNNs, which differ from conventional feed-forward neural networks and are more powerful at solving temporal tasks. Specifically, we discussed how to arrive at an RNN from a feed-forward-style structure: assuming a sequence of inputs and outputs, we designed a computational graph that represents that sequence.
This computational graph resulted in a series of copies of a function, one applied to each input-output tuple in the sequence. Then, by generalizing this model to any given single time step, we arrived at a recurrent formulation in which the same function and parameters are shared across all time steps.
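The idea of unrolling one shared step function over a sequence can be sketched as follows. This is a minimal NumPy illustration, not the chapter's implementation; the layer sizes, weight initialization, and the Elman-style `rnn_step` function are assumptions chosen for clarity.

```python
import numpy as np

# Hypothetical dimensions and randomly initialized weights (illustrative only).
rng = np.random.default_rng(0)
input_size, hidden_size, output_size = 3, 4, 2

W_xh = rng.normal(size=(hidden_size, input_size)) * 0.1   # input  -> hidden
W_hh = rng.normal(size=(hidden_size, hidden_size)) * 0.1  # hidden -> hidden (the recurrence)
W_hy = rng.normal(size=(output_size, hidden_size)) * 0.1  # hidden -> output

def rnn_step(x_t, h_prev):
    """One time step. The SAME weights are reused at every step,
    which is what 'generalizing to any single time step' buys us."""
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev)
    y_t = W_hy @ h_t
    return h_t, y_t

# Unrolling the recurrence over a sequence is exactly the computational
# graph of copies: one application of rnn_step per input-output tuple.
xs = [rng.normal(size=input_size) for _ in range(5)]
h = np.zeros(hidden_size)  # initial state
outputs = []
for x_t in xs:
    h, y_t = rnn_step(x_t, h)
    outputs.append(y_t)

print(len(outputs), outputs[0].shape)
```

Because the weights are shared, the same three matrices handle a sequence of any length; only the loop gets longer.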