
Neural Network Building Blocks

Explore the fundamental building blocks of neural networks, including linear layers and common activation functions such as sigmoid, ReLU, and leaky ReLU. Understand how dropout helps reduce overfitting and how layers process data batches using PyTorch. This lesson builds foundational knowledge for constructing and training neural networks effectively.

Neural networks form a class of machine learning models that implement a parameterized composition of functions. Tensors flow through a neural network via successive transformations called layers.

A neural network is a composition of functions
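
To make the composition concrete, here is a minimal PyTorch sketch; the layer sizes and the two-layer architecture are arbitrary choices for illustration:

```python
import torch
import torch.nn as nn

# A small network as a composition of functions:
# f(x) = linear2(relu(linear1(x))). Sizes are arbitrary.
model = nn.Sequential(
    nn.Linear(4, 8),  # first transformation: R^4 -> R^8
    nn.ReLU(),        # elementwise nonlinearity
    nn.Linear(8, 2),  # second transformation: R^8 -> R^2
)

x = torch.randn(4)    # an input tensor
y = model(x)          # the tensor flows through each layer in turn
print(y.shape)        # torch.Size([2])
```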

In this lesson, we’ll get familiar with some of the most common types of layers:

  • The linear layer

  • The logistic function, a.k.a. the sigmoid

  • The ReLU function

  • The leaky ReLU function

  • The dropout layer

The linear layer

The linear layer is the star of neural networks. Almost every time we want to map a one-dimensional tensor, a.k.a. a vector, to another one-dimensional tensor, a linear layer is involved.

A linear layer performs an affine transformation on a vector:

$$y = Wx + b$$

Here, $y$ is the output vector, $W$ is the weight matrix, $b$ is the bias vector, and $x$ is the input vector.
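
As a minimal sketch in PyTorch, assuming arbitrary dimensions (3 inputs, 2 outputs) for illustration, `nn.Linear` implements exactly this affine transformation:

```python
import torch
import torch.nn as nn

# nn.Linear stores a weight matrix W of shape (out_features, in_features)
# and a bias vector b of shape (out_features,).
linear = nn.Linear(in_features=3, out_features=2)

x = torch.randn(3)  # input vector in R^3
y = linear(x)       # computes y = Wx + b, a vector in R^2

# The same affine transformation, written out explicitly:
y_manual = linear.weight @ x + linear.bias
print(torch.allclose(y, y_manual))  # True
```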