What Is This Course About?
Get a brief overview and the prerequisites for this course.
In applied machine learning, the first step—given a dataset—is to construct an objective function that measures the error between the model's predictions and the ground-truth values. The second step is to use an optimization algorithm to find a good set of parameters that minimizes this objective function.
Whenever a model trains on a GPU, an optimization algorithm is at work behind the scenes. Understanding optimization is therefore essential to using machine learning to its full potential. Gradient descent is one of the most widely used algorithms for optimizing machine learning models.
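To make the two steps above concrete, here is a minimal sketch in NumPy: we define a mean-squared-error objective for a toy linear model and minimize it with gradient descent. The data, learning rate, and iteration count are illustrative choices, not values from the course.

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little noise (illustrative values only).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(50)

# Step 1: the objective function — mean squared error between
# predictions (w*x + b) and the ground-truth values y.
def mse(w, b):
    return np.mean((w * x + b - y) ** 2)

# Step 2: gradient descent — repeatedly step opposite the gradient.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(500):
    err = w * x + b - y
    grad_w = 2.0 * np.mean(err * x)  # d(mse)/dw
    grad_b = 2.0 * np.mean(err)      # d(mse)/db
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # close to the true slope 2 and intercept 1
```

The loop implements exactly the recipe described above: the objective quantifies prediction error, and the optimizer drives it down by following the negative gradient.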
In this course, we will cover optimization for machine learning. We will start with the fundamentals of optimization, such as gradients, Hessians, maxima and minima, and convexity, and then dive into widely used algorithms and topics, such as gradient descent, duality, and second-order methods. We will also cover several applications of optimization, such as regression, regularized regression, and maximum likelihood estimation (MLE). The hands-on project at the end of the course is based on the NumPy and SciPy libraries.
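As a taste of the SciPy side of the toolkit, the sketch below minimizes the classic Rosenbrock test function with `scipy.optimize.minimize` using BFGS, a quasi-Newton (second-order) method. The function and starting point are standard textbook choices, not material from the course project.

```python
import numpy as np
from scipy.optimize import minimize

# The Rosenbrock function: a classic test objective whose
# global minimum is at (1, 1), where the value is 0.
def rosenbrock(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

# BFGS is a quasi-Newton method: it builds an approximation of the
# Hessian from successive gradients instead of computing it directly.
result = minimize(rosenbrock, x0=np.array([-1.0, 2.0]), method="BFGS")
print(result.x)  # approximately [1. 1.]
```

Libraries like SciPy let you swap methods (`"Nelder-Mead"`, `"CG"`, `"BFGS"`, and others) with a single argument, which is useful for comparing the algorithm families covered in this course.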
Who is this course for?
This course is for you if you are in industry or academia and want to deepen your knowledge of mathematics and understand the vital role optimization plays in machine learning.
Prerequisites
This course requires learners to have a basic understanding of the following topics:

Familiarity with the Python language.

Familiarity with the NumPy, SciPy, and Matplotlib libraries.

Basic knowledge of functions, machine learning algorithms, matrices, vectors, linear algebra, and probability.
Learning outcomes
After taking this course, you will have learned the following concepts:
The importance of optimization algorithms in machine learning and why they are needed
The various types of optimization problems that exist and the algorithms used to solve them
The fundamentals of optimization, such as convexity, gradients, and Hessians
The implementation of popular optimization algorithms and concepts, such as gradient descent, maxima and minima, the primal-dual theorem, Newton's method, and genetic algorithms
Optimization in popular machine learning algorithms, such as linear regression, regularized regression, and MLE