Understanding Hyperparameters
Get to know what hyperparameters are in Elixir and the role they play in the genetic algorithm framework.
Overview of hyperparameters
In machine learning, hyperparameters are the settings we choose before the algorithm starts training. Internally, the algorithm learns parameters that help it perform a task. Externally, the programmer controls parameters that dictate how the algorithm trains.
In the context of genetic algorithms, hyperparameters refer to things we choose before running the algorithm, like population size, mutation rate, and so on.
Because hyperparameters can have a huge impact on the outcome of our algorithms, it’s important to be able to change them rapidly. To make that possible without too much of a headache, we need to build a simple configuration mechanism into the framework that separates the hyperparameters from the overall structure of the algorithm.
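One common way to achieve this separation in Elixir is a keyword list of options with sensible defaults. The following is a minimal, illustrative sketch (the option names here are assumptions, not the framework's final API) showing how user-supplied values can override defaults:

```elixir
# Illustrative only: hyperparameters expressed as a keyword list that is
# passed into the framework, keeping them out of the algorithm itself.
default_opts = [population_size: 100, mutation_rate: 0.05, selection_rate: 0.8]
user_opts = [mutation_rate: 0.1]

# Keyword.merge/2 lets user-supplied values override the defaults.
opts = Keyword.merge(default_opts, user_opts)
Keyword.get(opts, :mutation_rate)
# => 0.1
```

Because the hyperparameters live in a plain keyword list, changing them means editing a single call site rather than touching the algorithm's internals.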
Adding hyperparameters to our framework
To start, change the signature of both run/3 and evolve/4 to accept an additional parameter:
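A sketch of what those signatures might look like, assuming the additional parameter is a keyword list of options with an empty default; the surrounding parameter names and helper are assumptions based on a typical layout, not the framework's exact implementation:

```elixir
defmodule Genetic do
  # Sketch only: `opts \\ []` is the additional argument that carries the
  # hyperparameters. The other parameter names are assumptions.
  def run(genotype, fitness_function, max_fitness, opts \\ []) do
    population = initialize(genotype, opts)
    evolve(population, fitness_function, genotype, max_fitness, opts)
  end

  def evolve(population, fitness_function, genotype, max_fitness, opts \\ []) do
    # Selection, crossover, and mutation would read their settings from
    # `opts` here; the loop body is elided in this sketch.
    {population, fitness_function, genotype, max_fitness, opts}
  end

  defp initialize(genotype, opts) do
    # Hypothetical helper: builds the initial population using a
    # hyperparameter pulled from `opts`, falling back to a default.
    population_size = Keyword.get(opts, :population_size, 100)
    for _ <- 1..population_size, do: genotype.()
  end
end
```

Threading `opts` through every stage of the algorithm is what lets us tune population size, mutation rate, and similar values from the caller without editing the framework itself.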