Keras dense layer
Keras stands out as a well-known high-level deep-learning library, offering a user-friendly interface to construct and train neural networks effectively. One of the most commonly used layers in Keras is the Dense layer, which implements a fully connected layer of neurons.
This Answer explores Dense layers, their syntax and parameters, and provides code examples.
What are dense layers?
Dense layers are fundamental building blocks in neural networks. They consist of a set of neurons, each connecting to every neuron in the previous layer. The term "dense" refers to how each neuron is densely connected to all neurons in the previous layer.
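Conceptually, a dense layer computes output = activation(dot(input, kernel) + bias), where kernel is the layer's weight matrix and bias is its bias vector. The short NumPy sketch below (with arbitrarily chosen shapes and tanh as a stand-in activation) illustrates this computation:
import numpy as np

# Minimal sketch of the dense-layer computation: activation(input @ kernel + bias)
rng = np.random.default_rng(0)

x = rng.normal(size=(1, 5))        # one sample with 5 input features
kernel = rng.normal(size=(5, 3))   # weight matrix: 5 inputs -> 3 neurons
bias = np.zeros(3)                 # one bias value per neuron

output = np.tanh(x @ kernel + bias)  # tanh chosen here purely as an example activation
print(output.shape)                  # (1, 3): one output value per neuron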
Note: To learn more about Keras input layers, refer to this answer.
Syntax
Keras provides a simple way to create dense layers using the Dense class. Let's examine the syntax required to define a dense layer in Keras:
from tensorflow import keras

dense_layer = keras.layers.Dense(units, activation=None, use_bias=True, ...)
Parameters
The following are the most commonly used parameters in the Dense layer.
units: This parameter specifies the number of neurons in the layer. It is a required parameter and must be a positive integer.
activation: This parameter specifies the activation function to be applied to the layer's output. If None is specified, no activation is applied.
use_bias: This parameter specifies whether to include a bias vector in the layer (in a neural network, the bias vector is an additional parameter added to each layer that allows the network to learn an intercept or offset value). The default value is True.
kernel_initializer: This parameter specifies the initialization method for the weight matrix (the set of learnable parameters that determine the strength and importance of the connections between neurons in a given layer). The default is glorot_uniform.
bias_initializer: This parameter specifies the initialization method for the bias vector. The default is 'zeros'.
kernel_regularizer: This parameter specifies the regularization method for the weight matrix (regularization is a technique used to prevent overfitting by adding a penalty term to the loss function).
bias_regularizer: This parameter specifies the regularization method for the bias vector.
activity_regularizer: This parameter specifies the regularization method for the output of the layer.
kernel_constraint: This parameter specifies the constraint on the weight matrix.
bias_constraint: This parameter specifies the constraint on the bias vector.
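As an illustration, the following sketch configures a dense layer with several of these parameters; the specific values (16 units, an L2 factor of 0.01, a max-norm of 3.0) are arbitrary examples rather than recommendations:
from tensorflow.keras import layers, regularizers, initializers, constraints

# A dense layer configured with several of the parameters listed above
# (the specific values here are arbitrary examples)
dense_layer = layers.Dense(
    units=16,                                         # 16 neurons in this layer
    activation='relu',                                # ReLU applied to the layer's output
    use_bias=True,                                    # include a bias vector (the default)
    kernel_initializer=initializers.GlorotUniform(),  # default weight initialization, written explicitly
    bias_initializer='zeros',                         # default bias initialization
    kernel_regularizer=regularizers.l2(0.01),         # L2 penalty on the weight matrix
    kernel_constraint=constraints.MaxNorm(3.0),       # cap the norm of each weight vector
)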
Example 1: Single dense layer
Here's a simple code example to build a single dense layer:
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Create a single dense layer with 32 units and sigmoid activation
input_dimension = 5
dense_layer = layers.Dense(32, activation='sigmoid', input_shape=(input_dimension,))

# Test with random input data
input_data = tf.random.normal((1, input_dimension))
output = dense_layer(input_data)
print(output)
Code explanation
Line 1: Import the TensorFlow library as tf.
Line 2: Import the keras module from TensorFlow.
Line 3: Import the layers module from TensorFlow’s Keras API.
Line 6: Define the dimensionality of the input data as input_dimension = 5.
Line 7: Create a single dense layer with 32 units and the sigmoid activation using the Dense class from the layers module. The input_shape parameter specifies the shape of the input data.
Line 10: Generate random input data using TensorFlow’s random.normal function. The shape of the input data is (1, input_dimension).
Line 11: Pass the input data through the dense layer by calling the dense_layer object as a function. This applies the layer’s transformation to the input data.
Line 12: Print the dense layer output.
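Because the layer has 32 units, the printed tensor has shape (1, 32): one output value per neuron for the single input sample. The exact values vary from run to run, since both the input and the layer's weights are randomly initialized.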
Example 2: Multi-layer neural network
Here's a simple code example to build a multi-layer neural network:
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

input_dimension = 5
model = keras.models.Sequential([
    layers.Dense(64, activation='relu', input_shape=(input_dimension,)),
    layers.Dense(10, activation='sigmoid'),
    layers.Dense(128, activation='softmax', use_bias=False)
])

# Test with random input data
input_data = tf.random.normal((1, input_dimension))
output = model(input_data)
print(output)
Code explanation
Line 1: Import the TensorFlow library as tf.
Line 2: Import the keras module from TensorFlow.
Line 3: Import the layers module from TensorFlow’s Keras API.
Line 5: Define the dimensionality of the input data as input_dimension = 5.
Line 6: The Sequential class from keras.models is used to create the sequential model.
Lines 7–10: Add three dense layers to the model:
The first dense layer has 64 units and the relu activation function.
The second dense layer has 10 units and the sigmoid activation function.
The third dense layer has 128 units and the softmax activation function. Additionally, it does not use a bias term.
Line 13: Generate random input data using TensorFlow’s random.normal function. The shape of the input data is (1, input_dimension).
Line 14: Pass the input data through the model by calling the model object as a function. This performs the forward pass of the model, generating the output predictions.
Line 15: Print the output of the model.
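As a quick check, you can call model.summary() (continuing from the Example 2 code, so model is already defined) to confirm the layer output shapes and trainable parameter counts:
# Inspect the architecture; expected trainable parameter counts:
#   Dense(64):  5*64 + 64  = 384
#   Dense(10):  64*10 + 10 = 650
#   Dense(128): 10*128     = 1280   (no bias term because use_bias=False)
model.summary()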
Conclusion
In this Answer, we explored the concept of dense layers in Keras, which play a crucial role in neural networks by capturing complex patterns and relationships in data. With the ability to configure the number of units, activation functions, and other parameters, dense layers provide flexibility and power in building deep learning models for various tasks.
Quick Quiz!
Which parameter in the Keras Dense layer defines the number of neurons in that layer?
activation
input_shape
units
use_bias