Exercises
Test what we’ve learned about optimization algorithms, SciPy, vectors, and matrices.
Exercise 1: Extending binary search
We learned how to generalize gradient descent and Newton’s method to handle more variables, but what about binary search? Let’s find out for ourselves. Adapt the binary search method from the first lesson of this section to solve a problem with two variables. Then, solve the following problem:
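Since the exercise’s exact problem isn’t reproduced here, here is one possible adaptation as a minimal sketch: nest one binary search inside another. The functions `f1` and `f2` below are placeholders, and the approach assumes each is increasing in the variable being searched over its interval.

```python
def bisect(f, lo, hi, tol=1e-10):
    # standard one-variable binary search for a root of an increasing function
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def bisect2d(f1, f2, x_lo, x_hi, y_lo, y_hi, tol=1e-10):
    # inner search: for a fixed x, find the y that solves f2(x, y) = 0
    def y_of(x):
        return bisect(lambda y: f2(x, y), y_lo, y_hi, tol)
    # outer search: find the x that solves f1(x, y(x)) = 0
    x = bisect(lambda x: f1(x, y_of(x)), x_lo, x_hi, tol)
    return x, y_of(x)

# placeholder system: x + y = 3 and y = x, whose solution is (1.5, 1.5)
x, y = bisect2d(lambda x, y: x + y - 3,
                lambda x, y: y - x, 0, 5, 0, 5)
print(round(x, 6), round(y, 6))  # 1.5 1.5
```

Each outer iteration runs a full inner search, so the cost is the product of the two searches; this is the simplest generalization, not necessarily the most efficient one.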
Exercise 2: Maximizing
We stated that gradient descent and Newton’s method can easily be adapted to solve maximization problems. But what restrictions must the problem satisfy so that these algorithms can maximize it? Modify the methods and solve the following problem with both of them:
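One way to see the adaptation: maximizing f is the same as minimizing -f, so flipping the sign of the gradient step turns gradient descent into gradient ascent. The objective below is a placeholder, not the exercise’s problem; it is chosen to be concave so that a local maximum is the global one.

```python
import numpy as np

def gradient_ascent(grad, x0, lr=0.1, steps=1000):
    # identical to gradient descent except we step *along* the gradient
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + lr * grad(x)
    return x

# placeholder concave objective: f(x, y) = -(x - 1)^2 - (y + 2)^2,
# maximized at (1, -2)
grad = lambda v: np.array([-2 * (v[0] - 1), -2 * (v[1] + 2)])
opt = gradient_ascent(grad, [0.0, 0.0])
print(opt.round(4))
```

For Newton’s method the update formula is unchanged; what matters is that the Hessian is negative definite near the solution, the mirror image of the convexity condition used for minimization.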
Exercise 3: Implementing linear regression
Linear regression is one of the most popular machine learning algorithms. Given a set of inputs and outputs of a function, we want to approximate that function with a linear function as accurately as possible.
We call the original function f and the approximation we’re looking for f̂. The input is a vector ...
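A common way to fit the linear approximation is ordinary least squares: choose the weights that minimize the squared error between the model’s outputs and the observed outputs. The sketch below uses NumPy’s `lstsq`; the target function, the 2-D input, and the sample size are all placeholder choices, not part of the exercise statement.

```python
import numpy as np

rng = np.random.default_rng(0)

# placeholder "original" function of a 2-D input vector
f = lambda X: 3.0 * X[:, 0] - 2.0 * X[:, 1] + 5.0

X = rng.uniform(-1, 1, size=(100, 2))  # 100 sampled input vectors
y = f(X)                               # their observed outputs

# append a column of ones so the model can learn the intercept
A = np.hstack([X, np.ones((len(X), 1))])

# least squares: find w minimizing ||A w - y||^2
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w.round(4))  # should recover the coefficients [3, -2, 5]
```

Because the sample outputs here are exactly linear in the inputs, the fitted weights recover the true coefficients; with noisy data they would only approximate them.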