Evolving Neural Networks

Learn how genetic algorithms are being used in designing neural networks.

Genetic algorithms and neural networks

As you learned in the previous lesson, advances in deep learning have fueled unprecedented progress in AI. You’ve seen instances throughout this chapter where genetic algorithms are a viable alternative to neural networks; however, they are also a viable tool for designing neural networks.

Recall from the lesson Understanding Hyperparameters that hyperparameters are the settings you choose, such as the selection rate, crossover rate, and mutation rate, rather than the parameters your algorithm learns.

When designing neural networks, you can choose from a number of hyperparameters, such as the number of neurons in each layer or the learning rate of the optimizer.

Hyperparameter optimization

Genetic algorithms are a great choice for hyperparameter optimization. In the context of neural networks, hyperparameter optimization is the practice of maximizing a network’s performance by tweaking its combination of hyperparameters. As you’ve seen in this course, genetic algorithms work well on optimization tasks, so you can use them to intelligently search a space of hyperparameters for the best combination. Performing hyperparameter optimization by hand is long and tedious; genetic algorithms are a smart choice because they automate the process and are proven effective on optimization problems.
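The idea can be sketched with a toy genetic algorithm in Python. Everything here is illustrative: the search space, function names, and especially the fitness function are hypothetical stand-ins; in a real setting, the fitness of an individual would come from training the network with those hyperparameters and measuring validation performance.

```python
import random

random.seed(0)

# Hypothetical search space: each individual is a dict of hyperparameters.
SEARCH_SPACE = {
    "neurons": [16, 32, 64, 128],
    "learning_rate": [0.1, 0.01, 0.001],
}

def random_individual():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def fitness(ind):
    # Stand-in for "train the network and return validation accuracy".
    # This toy score simply peaks at 64 neurons and a learning rate of 0.01.
    return -abs(ind["neurons"] - 64) / 64 - abs(ind["learning_rate"] - 0.01)

def crossover(a, b):
    # Uniform crossover: each hyperparameter comes from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, rate=0.1):
    # Occasionally resample a hyperparameter from the search space.
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def evolve(generations=30, pop_size=20):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection keeps the top half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the top half of each population survives unchanged, a good combination is never lost once found; the crossover and mutation operators then recombine and perturb candidates exactly as they would for any other chromosome in this course.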

Neuroevolution

Another application of genetic algorithms to neural networks is the field of neuroevolution. Neuroevolution differs from hyperparameter optimization in that it evolves not only the network’s hyperparameters but also its weights and structure. The NEAT algorithm, which uses a genetic algorithm to evolve both the weights and the topology of a neural network, is an example of this approach.

Note: “NEAT” stands for NeuroEvolution of Augmenting Topologies.
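To make the weight-evolution idea concrete, here is a minimal sketch in Python of one of the simplest neuroevolution-style loops, a (1+1) evolution strategy. It evolves the weights of a single neuron to fit the OR truth table with no gradients or backpropagation involved. The task and all names are illustrative assumptions, and this is far simpler than NEAT, which also evolves topology.

```python
import random

random.seed(1)

# Truth table for OR -- a linearly separable toy task.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

def predict(weights, x):
    # Single neuron: step activation over a weighted sum plus a bias term.
    s = weights[0] * x[0] + weights[1] * x[1] + weights[2]
    return 1 if s > 0 else 0

def fitness(weights):
    # Number of truth-table rows the neuron classifies correctly (0 to 4).
    return sum(predict(weights, x) == y for x, y in DATA)

# (1+1) evolution strategy: mutate the weights with Gaussian noise and
# keep the child only if it performs at least as well as the parent.
best = [random.gauss(0, 1) for _ in range(3)]
for _ in range(200):
    child = [w + random.gauss(0, 0.5) for w in best]
    if fitness(child) >= fitness(best):
        best = child
```

The same loop scales conceptually to real networks: the genotype becomes the full weight vector (and, in NEAT, a description of the topology as well), while the fitness function becomes the network’s performance on the task being solved.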

Compared to more traditional deep learning approaches, neuroevolution is relatively understudied. In 2018, researchers at Uber demonstrated that genetic algorithms could significantly reduce the training time of neural networks.

One of the most significant works on genetic algorithms on the BEAM is Gene Sher’s Handbook of Neuroevolution Through Erlang. Sher firmly believes the BEAM is the best platform for developing neuroevolutionary algorithms because the interaction of its processes so closely mirrors the interaction of neurons in the brain.
